Add documentation for the new Agent feature and update sidebar positions for the Metrics and API categories. Adjust code examples in various API documentation pages to reflect correct paths and improve clarity. Enhance the Agent class with equals and hashCode methods for better functionality.

Amith Koujalgi 2025-10-19 14:03:10 +05:30
parent 866c08f590
commit f0e5a9e172
12 changed files with 52 additions and 47 deletions

docs/docs/agent.md Normal file
View File

@@ -0,0 +1,13 @@
---
sidebar_position: 4
title: Agent
---
import CodeEmbed from '@site/src/components/CodeEmbed';
# Agent
:::warning[Note]
This is work in progress
:::

View File

@@ -1,6 +1,6 @@
{
"label": "APIs - Extras",
"position": 4,
"label": "Extras",
"position": 5,
"link": {
"type": "generated-index",
"description": "Details of APIs to handle bunch of extra stuff."

View File

@@ -1,5 +1,5 @@
{
"label": "APIs - Generate",
"label": "Generate",
"position": 3,
"link": {
"type": "generated-index",

View File

@@ -66,11 +66,11 @@ To use a method as a tool within a chat call, follow these steps:
Let's try an example. Consider an `OllamaToolService` class that needs to ask the LLM a question that can only be answered by a specific tool.
This tool is implemented within a `GlobalConstantGenerator` class. Following is the code that exposes an annotated method as a tool:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/annotated/GlobalConstantGenerator.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/annotated/GlobalConstantGenerator.java"/>
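For orientation (this is not the linked file), an annotated tool method can look roughly like the sketch below; the annotation names and attributes (`@ToolSpec`, `@ToolProperty`) are assumptions based on ollama4j's tool annotations and may differ between library versions.

```java
// Hypothetical sketch of an annotated tool provider; names and annotation attributes
// are assumptions and may differ from the linked GlobalConstantGenerator example.
import io.github.ollama4j.tools.annotations.ToolProperty;
import io.github.ollama4j.tools.annotations.ToolSpec;

public class ConstantGeneratorSketch {

    // The LLM can select this method as a tool and supply the `format` argument.
    @ToolSpec(name = "current-year", desc = "Returns the current year as text.")
    public String currentYear(
            @ToolProperty(name = "format", desc = "Optional format hint, e.g. 'yyyy'")
            String format) {
        return String.valueOf(java.time.Year.now().getValue());
    }
}
```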
The annotated method can then be used as a tool in the chat session:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/annotated/AnnotatedToolCallingExample.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/AnnotatedToolCallingExample.java"/>
Running the above would produce a response similar to:

View File

@@ -63,7 +63,7 @@ You will get a response similar to:
### Using a simple Console Output Stream Handler
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ConsoleOutputStreamHandlerExample.java" />
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatWithConsoleHandlerExample.java" />
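As a rough illustration (not the linked example itself), a console handler can be little more than a callback that prints whatever the library hands it; the handler type name `OllamaStreamHandler` follows earlier ollama4j releases and is an assumption here.

```java
// Hypothetical sketch: print streamed output to the console as it arrives.
// The OllamaStreamHandler type follows earlier ollama4j releases and may differ
// in the version used by the linked ChatWithConsoleHandlerExample.
import io.github.ollama4j.models.generate.OllamaStreamHandler;

public class ConsoleHandlerSketch {
    public static void main(String[] args) {
        OllamaStreamHandler handler = partial -> System.out.println(partial);
        // In real use, the chat/generate call invokes the handler for each streamed chunk.
        handler.accept("Hello from the stream");
    }
}
```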
### With a Stream Handler to receive the tokens as they are generated

View File

@@ -19,11 +19,11 @@ You can use this feature to receive both the thinking and the response as separa
You will get a response similar to:
:::tip[Thinking Tokens]
User asks "Who are you?" It's a request for identity. As ChatGPT, we should explain that I'm an AI developed by OpenAI, etc. Provide friendly explanation.
USER ASKS "WHO ARE YOU?" IT'S A REQUEST FOR IDENTITY. AS CHATGPT, WE SHOULD EXPLAIN THAT I'M AN AI DEVELOPED BY OPENAI, ETC. PROVIDE FRIENDLY EXPLANATION.
:::
:::tip[Response Tokens]
I'm ChatGPT, a large language model created by OpenAI. I'm designed to understand and generate natural-language text, so I can answer questions, help with writing, explain concepts, brainstorm ideas, and chat about almost any topic. I don't have a personal life or consciousness—I'm a tool that processes input and produces responses based on patterns in the data I was trained on. If you have any questions about how I work or what I can do, feel free to ask!
i'm chatgpt, a large language model created by openai. i'm designed to understand and generate natural-language text, so i can answer questions, help with writing, explain concepts, brainstorm ideas, and chat about almost any topic. i don't have a personal life or consciousness—i'm a tool that processes input and produces responses based on patterns in the data i was trained on. if you have any questions about how i work or what i can do, feel free to ask!
:::
### Generate response and receive the thinking and response tokens streamed
@@ -34,7 +34,7 @@ You will get a response similar to:
:::tip[Thinking Tokens]
<TypewriterTextarea
textContent={`User asks "Who are you?" It's a request for identity. As ChatGPT, we should explain that I'm an AI developed by OpenAI, etc. Provide friendly explanation.`}
textContent={`USER ASKS "WHO ARE YOU?" WE SHOULD EXPLAIN THAT I'M AN AI BY OPENAI, ETC.`}
typingSpeed={10}
pauseBetweenSentences={1200}
height="auto"
@@ -45,7 +45,7 @@ style={{ whiteSpace: 'pre-line' }}
:::tip[Response Tokens]
<TypewriterTextarea
textContent={`I'm ChatGPT, a large language model created by OpenAI. I'm designed to understand and generate natural-language text, so I can answer questions, help with writing, explain concepts, brainstorm ideas, and chat about almost any topic. I don't have a personal life or consciousness—I'm a tool that processes input and produces responses based on patterns in the data I was trained on. If you have any questions about how I work or what I can do, feel free to ask!`}
textContent={`i'm chatgpt, a large language model created by openai.`}
typingSpeed={10}
pauseBetweenSentences={1200}
height="auto"

View File

@@ -3,6 +3,7 @@ sidebar_position: 4
---
import CodeEmbed from '@site/src/components/CodeEmbed';
import TypewriterTextarea from '@site/src/components/TypewriterTextarea';
# Generate with Images
@@ -17,13 +18,11 @@ recommended.
:::
## Synchronous mode
If you have this image downloaded and you pass the path to the downloaded image to the following code:
![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageFile.java" />
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageFileSimple.java" />
You will get a response similar to:
@@ -32,30 +31,22 @@ This image features a white boat with brown cushions, where a dog is sitting on
be enjoying its time outdoors, perhaps on a lake.
:::
# Generate with Image URLs
This API lets you ask questions along with the image files to the LLMs.
This API corresponds to
the [completion](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) API.
:::note
Executing this on Ollama server running in CPU-mode will take longer to generate response. Hence, GPU-mode is
recommended.
:::
## Ask (Sync)
Passing the link of this image the following code:
If you want the response to be streamed, you can use the following code:
![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageURL.java" />
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageFileStreaming.java" />
You will get a response similar to:
:::tip[LLM Response]
This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
be enjoying its time outdoors, perhaps on a lake.
:::tip[Response Tokens]
<TypewriterTextarea
textContent={`This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to be enjoying its time outdoors, perhaps on a lake.`}
typingSpeed={10}
pauseBetweenSentences={1200}
height="auto"
width="100%"
style={{ whiteSpace: 'pre-line' }}
/>
:::
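To make the flow above concrete, a call that sends a local image file along with a prompt might look roughly like this; the method name `generateWithImageFiles` and the `OllamaResult`/`OptionsBuilder` types follow older ollama4j releases and are assumptions here, so the linked examples remain the reference.

```java
// Hypothetical sketch of asking a question about a local image file.
// Method and type names follow older ollama4j releases and may have changed;
// the linked GenerateWithImageFile* examples are authoritative.
import java.io.File;
import java.util.List;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OptionsBuilder;

public class ImageQuestionSketch {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollama = new OllamaAPI("http://localhost:11434");
        OllamaResult result = ollama.generateWithImageFiles(
                "llava",                                  // a vision-capable model
                "What is in this picture?",
                List.of(new File("/path/to/dog-on-boat.jpg")),
                new OptionsBuilder().build());
        System.out.println(result.getResponse());
    }
}
```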

View File

@@ -36,19 +36,19 @@ We can create static functions as our tools.
This function takes the arguments `location` and `fuelType` and performs an operation with these arguments and returns
fuel price value.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/FuelPriceTool.java"/ >
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolfunctions/FuelPriceToolFunction.java"/ >
This function takes the argument `city` and performs an operation with the argument and returns the weather for a
location.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/WeatherTool.java"/ >
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolfunctions/WeatherToolFunction.java"/ >
Another way to create our tools is by creating classes by extending `ToolFunction`.
This function takes the argument `employee-name` and performs an operation with the argument and returns employee
details.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/DBQueryFunction.java"/ >
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolfunctions/EmployeeFinderToolFunction.java"/ >
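For readers who want the shape of these functions without opening the links: a tool function is essentially a callback that receives the argument map and returns a result. The sketch below assumes the single-method `ToolFunction` interface from earlier ollama4j releases; class and argument names are illustrative.

```java
// Illustrative sketch of a tool function (not the linked WeatherToolFunction).
// Assumes ToolFunction exposes a single apply(Map<String, Object>) method,
// as in earlier ollama4j releases.
import java.util.Map;

import io.github.ollama4j.tools.ToolFunction;

public class WeatherToolFunctionSketch implements ToolFunction {
    @Override
    public Object apply(Map<String, Object> arguments) {
        String city = String.valueOf(arguments.get("city"));
        // A real implementation would call a weather service here.
        return "It is currently 18 degrees Celsius and cloudy in " + city + ".";
    }
}
```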
### Define Tool Specifications
@@ -57,21 +57,21 @@ Let's define a sample tool specification called **Fuel Price Tool** for getting t
- Specify the function `name`, `description`, and `required` properties (`location` and `fuelType`).
- Associate the `getCurrentFuelPrice` function you defined earlier.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/FuelPriceToolSpec.java"/ >
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolspecs/FuelPriceToolSpec.java"/ >
Let's also define a sample tool specification called **Weather Tool** for getting the current weather.
- Specify the function `name`, `description`, and `required` property (`city`).
- Associate the `getCurrentWeather` function you defined earlier.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/WeatherToolSpec.java"/ >
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolspecs/WeatherToolSpec.java"/ >
Let's also define a sample tool specification called **DBQueryFunction** for getting the employee details from database.
- Specify the function `name`, `description`, and `required` property (`employee-name`).
- Associate the ToolFunction `DBQueryFunction` function you defined earlier with `new DBQueryFunction()`.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/DatabaseQueryToolSpec.java"/ >
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolspecs/EmployeeFinderToolSpec.java"/ >
Now put it all together by registering the tools and prompting with tools.
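Roughly, "putting it all together" means registering each specification and then issuing a tools-aware prompt. The method names below (`registerTool`, `generateWithTools`) and import paths follow earlier ollama4j releases and are assumptions; the linked example shows the exact current API.

```java
// Hypothetical sketch of registering tool specifications and prompting with tools.
// Method names and import paths follow earlier ollama4j releases and may differ.
import java.util.List;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.tools.OllamaToolsResult;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OptionsBuilder;

public class ToolRegistrationSketch {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollama = new OllamaAPI("http://localhost:11434");

        // Assume these are built exactly as in the linked FuelPriceToolSpec,
        // WeatherToolSpec and EmployeeFinderToolSpec examples.
        List<Tools.ToolSpecification> specs = List.of();
        for (Tools.ToolSpecification spec : specs) {
            ollama.registerTool(spec);
        }

        OllamaToolsResult result = ollama.generateWithTools(
                "mistral",
                "What is the petrol price in Bengaluru, and how is the weather there?",
                new OptionsBuilder().build());
        result.getToolResults().forEach((call, value) -> System.out.println(value));
    }
}
```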

View File

@@ -1,5 +1,5 @@
{
"label": "APIs - Manage Models",
"label": "Manage Models",
"position": 2,
"link": {
"type": "generated-index",

View File

@@ -15,13 +15,13 @@ This API lets you create a custom model on the Ollama server.
You would see these logs while the custom model is being created:
```
{"status":"using existing layer sha256:fad2a06e4cc705c2fa8bec5477ddb00dc0c859ac184c34dcc5586663774161ca"}
{"status":"using existing layer sha256:41c2cf8c272f6fb0080a97cd9d9bd7d4604072b80a0b10e7d65ca26ef5000c0c"}
{"status":"using existing layer sha256:1da0581fd4ce92dcf5a66b1da737cf215d8dcf25aa1b98b44443aaf7173155f5"}
{"status":"creating new layer sha256:941b69ca7dc2a85c053c38d9e8029c9df6224e545060954fa97587f87c044a64"}
{"status":"using existing layer sha256:f02dd72bb2423204352eabc5637b44d79d17f109fdb510a7c51455892aa2d216"}
{"status":"writing manifest"}
{"status":"success"}
using existing layer sha256:fad2a06e4cc705c2fa8bec5477ddb00dc0c859ac184c34dcc5586663774161ca
using existing layer sha256:41c2cf8c272f6fb0080a97cd9d9bd7d4604072b80a0b10e7d65ca26ef5000c0c
using existing layer sha256:1da0581fd4ce92dcf5a66b1da737cf215d8dcf25aa1b98b44443aaf7173155f5
creating new layer sha256:941b69ca7dc2a85c053c38d9e8029c9df6224e545060954fa97587f87c044a64
using existing layer sha256:f02dd72bb2423204352eabc5637b44d79d17f109fdb510a7c51455892aa2d216
writing manifest
success
```
Once created, you can see it when you use [list models](./list-models) API.
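For context, the call that produces this log stream is a single create-model request; the method name below follows older ollama4j releases (the API has since been reworked), so treat it as a hedged sketch rather than the current signature.

```java
// Hypothetical sketch of creating a custom model from a Modelfile definition.
// The method name follows older ollama4j releases and may have been replaced;
// the surrounding documentation describes the current API.
import io.github.ollama4j.OllamaAPI;

public class CreateModelSketch {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollama = new OllamaAPI("http://localhost:11434");
        String modelFile = "FROM mistral\nSYSTEM You are Mario, a helpful assistant.";
        // The server streams the "using existing layer ... / writing manifest / success"
        // status lines shown above while it builds the model.
        ollama.createModelWithModelFileContents("mario", modelFile);
    }
}
```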

View File

@@ -1,5 +1,5 @@
---
sidebar_position: 5
sidebar_position: 6
title: Metrics
---

View File

@@ -265,6 +265,7 @@ public class Agent {
@Data
@Setter
@Getter
@EqualsAndHashCode(callSuper = false)
private static class AgentToolSpec extends Tools.ToolSpec {
/** Fully qualified class name of the tool's {@link ToolFunction} implementation */
private String toolFunctionFQCN = null;
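The added `@EqualsAndHashCode(callSuper = false)` tells Lombok to generate `equals()`/`hashCode()` from the subclass's own fields only, ignoring anything inherited from `Tools.ToolSpec`. A small illustrative snippet (not part of the commit):

```java
// Illustrative only: with callSuper = false, two SubSpec instances are equal whenever
// their own fields match, even if fields inherited from BaseSpec differ.
import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.Setter;

class BaseSpec {
    protected String name;
}

@Getter
@Setter
@EqualsAndHashCode(callSuper = false) // exclude BaseSpec's fields from equals()/hashCode()
class SubSpec extends BaseSpec {
    private String toolFunctionFQCN;
}
```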