diff --git a/docs/docs/agent.md b/docs/docs/agent.md
new file mode 100644
index 0000000..68963c0
--- /dev/null
+++ b/docs/docs/agent.md
@@ -0,0 +1,13 @@
+---
+sidebar_position: 4
+
+title: Agent
+---
+
+import CodeEmbed from '@site/src/components/CodeEmbed';
+
+# Agent
+
+:::warning[Note]
+This is a work in progress.
+:::
\ No newline at end of file
diff --git a/docs/docs/apis-extras/_category_.json b/docs/docs/apis-extras/_category_.json
index 09fa3cc..e7dc41c 100644
--- a/docs/docs/apis-extras/_category_.json
+++ b/docs/docs/apis-extras/_category_.json
@@ -1,6 +1,6 @@
{
- "label": "APIs - Extras",
- "position": 4,
+ "label": "Extras",
+ "position": 5,
"link": {
"type": "generated-index",
"description": "Details of APIs to handle bunch of extra stuff."
diff --git a/docs/docs/apis-generate/_category_.json b/docs/docs/apis-generate/_category_.json
index f7c2b23..f8e6802 100644
--- a/docs/docs/apis-generate/_category_.json
+++ b/docs/docs/apis-generate/_category_.json
@@ -1,5 +1,5 @@
{
- "label": "APIs - Generate",
+ "label": "Generate",
"position": 3,
"link": {
"type": "generated-index",
diff --git a/docs/docs/apis-generate/chat-with-tools.md b/docs/docs/apis-generate/chat-with-tools.md
index 31f91bd..e7859dd 100644
--- a/docs/docs/apis-generate/chat-with-tools.md
+++ b/docs/docs/apis-generate/chat-with-tools.md
@@ -66,11 +66,11 @@ To use a method as a tool within a chat call, follow these steps:
Let's try an example. Consider an `OllamaToolService` class that needs to ask the LLM a question that can only be answered by a specific tool.
This tool is implemented within a `GlobalConstantGenerator` class. The following code exposes an annotated method as a tool:
-
+
The annotated method can then be used as a tool in the chat session:
-
+
Running the above would produce a response similar to:
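
The `CodeEmbed` snippets referenced above were not captured in this diff. As a rough, hypothetical sketch of the pattern being described: the annotation and registration names below (`@OllamaToolService`, `@ToolSpec`, `@ToolProperty`, `registerAnnotatedTools`) are assumptions and may differ from the actual ollama4j API.

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.tools.annotations.OllamaToolService;
import io.github.ollama4j.tools.annotations.ToolProperty;
import io.github.ollama4j.tools.annotations.ToolSpec;

// Hypothetical sketch: expose an annotated method as a tool and make it
// available to the model during a chat call.
@OllamaToolService(providers = {ToolSketch.GlobalConstantGenerator.class})
public class ToolSketch {

    public static class GlobalConstantGenerator {
        // The annotated method the LLM can invoke to answer the question.
        @ToolSpec(name = "current-db-prefix", desc = "Returns the current database prefix")
        public String getCurrentDbPrefix(
                @ToolProperty(name = "environment", desc = "Deployment environment", required = true)
                String environment) {
            return environment + "-db-prefix-0001";
        }
    }

    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");
        // Scan this object for annotated tools and register them (assumed method).
        api.registerAnnotatedTools(new ToolSketch());
        // A regular chat call can now trigger the registered tool whenever the
        // model decides it needs the database prefix to answer.
    }
}
```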
diff --git a/docs/docs/apis-generate/chat.md b/docs/docs/apis-generate/chat.md
index af53342..a247582 100644
--- a/docs/docs/apis-generate/chat.md
+++ b/docs/docs/apis-generate/chat.md
@@ -63,7 +63,7 @@ You will get a response similar to:
### Using a simple Console Output Stream Handler
-
+
### With a Stream Handler to receive the tokens as they are generated
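
The embedded snippets for both handler variants are missing from this diff. A minimal sketch of streaming chat tokens to the console, assuming a builder like `OllamaChatRequestBuilder` and a `chat(request, handler)` overload (both names are assumptions), might look like this:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;

public class ChatStreamingSketch {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");

        OllamaChatRequest request = OllamaChatRequestBuilder.getInstance("llama3.2")
                .withMessage(OllamaChatMessageRole.USER, "Why is the sky blue?")
                .build();

        // Assumed chat() overload that accepts a stream handler: each partial
        // token is printed to the console as soon as it is generated.
        api.chat(request, token -> System.out.print(token));
        System.out.println();
    }
}
```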
diff --git a/docs/docs/apis-generate/generate-thinking.md b/docs/docs/apis-generate/generate-thinking.md
index d38634d..2de37e6 100644
--- a/docs/docs/apis-generate/generate-thinking.md
+++ b/docs/docs/apis-generate/generate-thinking.md
@@ -19,11 +19,11 @@ You can use this feature to receive both the thinking and the response as separa
You will get a response similar to:
:::tip[Thinking Tokens]
-User asks "Who are you?" It's a request for identity. As ChatGPT, we should explain that I'm an AI developed by OpenAI, etc. Provide friendly explanation.
+USER ASKS "WHO ARE YOU?" IT'S A REQUEST FOR IDENTITY. AS CHATGPT, WE SHOULD EXPLAIN THAT I'M AN AI DEVELOPED BY OPENAI, ETC. PROVIDE FRIENDLY EXPLANATION.
:::
:::tip[Response Tokens]
-I’m ChatGPT, a large language model created by OpenAI. I’m designed to understand and generate natural‑language text, so I can answer questions, help with writing, explain concepts, brainstorm ideas, and chat about almost any topic. I don’t have a personal life or consciousness—I’m a tool that processes input and produces responses based on patterns in the data I was trained on. If you have any questions about how I work or what I can do, feel free to ask!
+i’m chatgpt, a large language model created by openai. i’m designed to understand and generate natural‑language text, so i can answer questions, help with writing, explain concepts, brainstorm ideas, and chat about almost any topic. i don’t have a personal life or consciousness—i’m a tool that processes input and produces responses based on patterns in the data i was trained on. if you have any questions about how i work or what i can do, feel free to ask!
:::
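
The code embed for this section is not part of the diff. A minimal sketch, assuming a `generate` overload with a `think` flag and a result type that exposes the thinking separately (the overload, `getThinking()`, and the model name are all assumptions):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;

public class GenerateThinkingSketch {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");

        // Assumed parameters: model, prompt, raw, think, options. The real
        // signature may differ; "gpt-oss:20b" is only an illustrative model.
        OllamaResult result = api.generate("gpt-oss:20b", "Who are you?", false, true, null);

        // Thinking and answer are assumed to be exposed as separate fields.
        System.out.println("Thinking: " + result.getThinking());
        System.out.println("Response: " + result.getResponse());
    }
}
```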
### Generate response and receive the thinking and response tokens streamed
@@ -34,7 +34,7 @@ You will get a response similar to:
:::tip[Thinking Tokens]
+
You will get a response similar to:
@@ -32,30 +31,22 @@ This image features a white boat with brown cushions, where a dog is sitting on
be enjoying its time outdoors, perhaps on a lake.
:::
-# Generate with Image URLs
-This API lets you ask questions along with the image files to the LLMs.
-This API corresponds to
-the [completion](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) API.
-
-:::note
-
-Executing this on Ollama server running in CPU-mode will take longer to generate response. Hence, GPU-mode is
-recommended.
-
-:::
-
-## Ask (Sync)
-
-Passing the link of this image the following code:
+If you want the response to be streamed, you can use the following code:

-
+
You will get a response similar to:
-:::tip[LLM Response]
-This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
-be enjoying its time outdoors, perhaps on a lake.
+:::tip[Response Tokens]
+
:::
\ No newline at end of file
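
The streamed-image snippet referenced above was stripped from this diff. A rough sketch, assuming a `generateWithImageFiles` overload that takes a stream handler as its last argument (method name, parameters, and the file path are assumptions):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OptionsBuilder;

import java.io.File;
import java.util.List;

public class GenerateWithImagesStreamingSketch {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");

        // Assumed overload: the stream handler receives partial tokens as the
        // model describes the image; the exact API may differ.
        OllamaResult result = api.generateWithImageFiles(
                "llava",
                "What's in this image?",
                List.of(new File("/path/to/dog-on-boat.jpg")),
                new OptionsBuilder().build(),
                token -> System.out.print(token));

        System.out.println("\nComplete response: " + result.getResponse());
    }
}
```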
diff --git a/docs/docs/apis-generate/generate-with-tools.md b/docs/docs/apis-generate/generate-with-tools.md
index 291ccd5..236e832 100644
--- a/docs/docs/apis-generate/generate-with-tools.md
+++ b/docs/docs/apis-generate/generate-with-tools.md
@@ -36,19 +36,19 @@ We can create static functions as our tools.
This function takes the arguments `location` and `fuelType`, performs an operation with these arguments, and
returns the fuel price value.
-
+
This function takes the argument `city` and performs an operation with the argument and returns the weather for a
location.
-
+
Another way to create our tools is by creating classes by extending `ToolFunction`.
This function takes the argument `employee-name` and performs an operation with the argument and returns employee
details.
-
+
### Define Tool Specifications
@@ -57,21 +57,21 @@ Lets define a sample tool specification called **Fuel Price Tool** for getting t
- Specify the function `name`, `description`, and `required` properties (`location` and `fuelType`).
- Associate the `getCurrentFuelPrice` function you defined earlier.
-
+
Let's also define a sample tool specification called **Weather Tool** for getting the current weather.
- Specify the function `name`, `description`, and `required` property (`city`).
- Associate the `getCurrentWeather` function you defined earlier.
-
+
Let's also define a sample tool specification called **DBQueryFunction** for getting the employee details from the database.
- Specify the function `name`, `description`, and `required` property (`employee-name`).
- Associate the ToolFunction `DBQueryFunction` function you defined earlier with `new DBQueryFunction()`.
-
+
Now put it all together by registering the tools and prompting with tools.
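
The tool and tool-specification embeds above are likewise missing from this diff. Below is a condensed, hypothetical sketch covering both styles (a static function and a `ToolFunction` implementation) plus registration and prompting; builder field names such as `functionName`, and methods such as `registerTool` and `generateWithTools`, are assumptions:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.tools.OllamaToolsResult;
import io.github.ollama4j.tools.ToolFunction;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OptionsBuilder;

import java.util.Map;

public class GenerateWithToolsSketch {

    // A static function used as a tool: takes "location" and "fuelType" and
    // returns a (mock) fuel price value.
    public static String getCurrentFuelPrice(Map<String, Object> arguments) {
        String location = String.valueOf(arguments.get("location"));
        String fuelType = String.valueOf(arguments.get("fuelType"));
        return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
    }

    // The alternative style from the docs, written here as an implementation of
    // the ToolFunction interface (assumed to expose a single apply(...) method).
    static class DBQueryFunction implements ToolFunction {
        @Override
        public Object apply(Map<String, Object> arguments) {
            return "Employee details for " + arguments.get("employee-name");
        }
    }

    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");

        // Assumed builder-style tool specification; field names are assumptions.
        Tools.ToolSpecification fuelPriceTool = Tools.ToolSpecification.builder()
                .functionName("current-fuel-price")
                .functionDescription("Get the current fuel price for a location and fuel type")
                .toolFunction(GenerateWithToolsSketch::getCurrentFuelPrice)
                .build();

        Tools.ToolSpecification employeeTool = Tools.ToolSpecification.builder()
                .functionName("get-employee-details")
                .functionDescription("Get employee details from the database")
                .toolFunction(new DBQueryFunction())
                .build();

        // Register the tools, then prompt with tools enabled.
        api.registerTool(fuelPriceTool);
        api.registerTool(employeeTool);
        OllamaToolsResult result = api.generateWithTools(
                "llama3.2",
                "What is the petrol price in Bengaluru?",
                new OptionsBuilder().build());

        // Each invoked tool call is assumed to be mapped to its return value.
        result.getToolResults().forEach((call, value) -> System.out.println(call + " -> " + value));
    }
}
```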
diff --git a/docs/docs/apis-model-management/_category_.json b/docs/docs/apis-model-management/_category_.json
index 48f345c..7a88175 100644
--- a/docs/docs/apis-model-management/_category_.json
+++ b/docs/docs/apis-model-management/_category_.json
@@ -1,5 +1,5 @@
{
- "label": "APIs - Manage Models",
+ "label": "Manage Models",
"position": 2,
"link": {
"type": "generated-index",
diff --git a/docs/docs/apis-model-management/create-model.md b/docs/docs/apis-model-management/create-model.md
index 67b0de3..a8c0d7a 100644
--- a/docs/docs/apis-model-management/create-model.md
+++ b/docs/docs/apis-model-management/create-model.md
@@ -15,13 +15,13 @@ This API lets you create a custom model on the Ollama server.
You would see these logs while the custom model is being created:
```
-{"status":"using existing layer sha256:fad2a06e4cc705c2fa8bec5477ddb00dc0c859ac184c34dcc5586663774161ca"}
-{"status":"using existing layer sha256:41c2cf8c272f6fb0080a97cd9d9bd7d4604072b80a0b10e7d65ca26ef5000c0c"}
-{"status":"using existing layer sha256:1da0581fd4ce92dcf5a66b1da737cf215d8dcf25aa1b98b44443aaf7173155f5"}
-{"status":"creating new layer sha256:941b69ca7dc2a85c053c38d9e8029c9df6224e545060954fa97587f87c044a64"}
-{"status":"using existing layer sha256:f02dd72bb2423204352eabc5637b44d79d17f109fdb510a7c51455892aa2d216"}
-{"status":"writing manifest"}
-{"status":"success"}
+using existing layer sha256:fad2a06e4cc705c2fa8bec5477ddb00dc0c859ac184c34dcc5586663774161ca
+using existing layer sha256:41c2cf8c272f6fb0080a97cd9d9bd7d4604072b80a0b10e7d65ca26ef5000c0c
+using existing layer sha256:1da0581fd4ce92dcf5a66b1da737cf215d8dcf25aa1b98b44443aaf7173155f5
+creating new layer sha256:941b69ca7dc2a85c053c38d9e8029c9df6224e545060954fa97587f87c044a64
+using existing layer sha256:f02dd72bb2423204352eabc5637b44d79d17f109fdb510a7c51455892aa2d216
+writing manifest
+success
```
Once created, you can see it when you use the [list models](./list-models) API.
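
For context on the call that produces these logs, a minimal sketch, assuming a helper such as `createModelWithModelFileContents` (the actual method name and parameters may differ):

```java
import io.github.ollama4j.OllamaAPI;

public class CreateModelSketch {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");

        // Assumed helper that creates a custom model from Modelfile contents;
        // the model name and Modelfile below are purely illustrative.
        api.createModelWithModelFileContents(
                "mario",
                "FROM llama3.2\nSYSTEM You are Mario from Super Mario Bros.");
    }
}
```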
diff --git a/docs/docs/metrics.md b/docs/docs/metrics.md
index 6ecbd9f..10bc4b1 100644
--- a/docs/docs/metrics.md
+++ b/docs/docs/metrics.md
@@ -1,5 +1,5 @@
---
-sidebar_position: 5
+sidebar_position: 6
title: Metrics
---
diff --git a/src/main/java/io/github/ollama4j/agent/Agent.java b/src/main/java/io/github/ollama4j/agent/Agent.java
index 4e3ac57..f9fdf7c 100644
--- a/src/main/java/io/github/ollama4j/agent/Agent.java
+++ b/src/main/java/io/github/ollama4j/agent/Agent.java
@@ -265,6 +265,7 @@ public class Agent {
@Data
@Setter
@Getter
+ @EqualsAndHashCode(callSuper = false)
private static class AgentToolSpec extends Tools.ToolSpec {
/** Fully qualified class name of the tool's {@link ToolFunction} implementation */
private String toolFunctionFQCN = null;
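
The `@EqualsAndHashCode(callSuper = false)` addition follows the standard Lombok pattern for `@Data` classes that extend another class; a generic illustration of the behavior (not the actual `Agent` internals):

```java
import lombok.Data;
import lombok.EqualsAndHashCode;

// When a @Data class extends another class, Lombok warns that the generated
// equals()/hashCode() may need to include superclass state. callSuper = false
// limits them to this class's own fields and silences that warning.
class Base {
    protected String name;
}

@Data
@EqualsAndHashCode(callSuper = false)
class Derived extends Base {
    private String toolFunctionFQCN;
}
```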