Updated docs

This commit is contained in:
amithkoujalgi 2025-04-18 22:45:30 +05:30
parent 9c181486a5
commit 8fd203931d
10 changed files with 221 additions and 1326 deletions

View File

@ -0,0 +1,69 @@
---
sidebar_position: 8
---
import CodeEmbed from '@site/src/components/CodeEmbed';
# Chat with Tools
### Using Tools in Chat
If you want to have a natural back-and-forth chat experience with tools, you can directly integrate tools into
the `chat()` method, instead of using the `generateWithTools()` method. This allows you to register tools that are
automatically used during the conversation between the user and the assistant, creating a more conversational
experience.
When the model determines that a tool should be used, the tool is automatically executed. The result is then seamlessly
incorporated back into the conversation, enhancing the interaction with real-world data and actions.
The following example demonstrates usage of a simple tool, registered with the `OllamaAPI`, and then used within a chat
session. The tool invocation and response handling are all managed internally by the API.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatWithTools.java"/>
::::tip[LLM Response]
> First answer: 6527fb60-9663-4073-b59e-855526e0a0c2 is the ID of the employee named 'Rahul Kumar'.
>
> Second answer: Kumar is the last name of the employee named 'Rahul Kumar'.
::::
This tool calling can also be done using the streaming API.
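To illustrate both variants, here is a minimal sketch (the tool specification is assumed to have been built and registered as described in the Generate with Tools section, and `llama3.2:1b` is an arbitrary model choice):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.models.generate.OllamaStreamHandler;

public class ChatWithToolsSketch {

    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

        // assumes a Tools.ToolSpecification built as shown in the Generate with Tools section
        // ollamaAPI.registerTool(databaseQueryToolSpecification);

        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("llama3.2:1b");
        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER, "Give me the ID of the employee named 'Rahul Kumar'?")
                .build();

        // blocking call: any tool calls are resolved internally before this returns
        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
        System.out.println(chatResult.getResponseModel().getMessage().getContent());

        // streaming variant: tokens of the final answer arrive through the handler
        OllamaStreamHandler streamHandler = (s) -> System.out.println(s);
        ollamaAPI.chat(requestModel, streamHandler);
    }
}
```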
### Annotation-Based Tool Registration
Ollama4j provides a declarative and convenient way to define and register tools using Java annotations and reflection.
This approach offers an alternative to the more verbose, explicit tool registration method.
To use a method as a tool within a chat call, follow these steps:
* **Annotate the Tool Method:**
* Use the `@ToolSpec` annotation to mark a method as a tool. This annotation describes the tool's purpose.
* Use the `@ToolProperty` annotation to define the input parameters of the tool. The following data types are
currently supported:
* `java.lang.String`
* `java.lang.Integer`
* `java.lang.Boolean`
* `java.math.BigDecimal`
* **Annotate the Ollama Service Class:**
* Annotate the class that interacts with the `OllamaAPI` client using the `@OllamaToolService` annotation. Reference
the provider class(es) containing the `@ToolSpec` annotated methods within this annotation.
* **Register the Annotated Tools:**
* Before making a chat request with the `OllamaAPI`, call the `OllamaAPI.registerAnnotatedTools()` method. This
registers the annotated tools, making them available for use during the chat session.
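Putting these steps together, a compact sketch might look like this (the `PriceProvider` class and its `getPrice` tool are hypothetical stand-ins, and the annotation imports are assumed to live under `io.github.ollama4j.tools.annotations`):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.tools.annotations.OllamaToolService;
import io.github.ollama4j.tools.annotations.ToolProperty;
import io.github.ollama4j.tools.annotations.ToolSpec;

// hypothetical provider class exposing a single annotated tool
class PriceProvider {

    @ToolSpec(desc = "Returns the current price of a product")
    public String getPrice(@ToolProperty(name = "productName", desc = "Name of the product") String productName) {
        return "Price of " + productName + " is Rs.100";
    }
}

@OllamaToolService(providers = PriceProvider.class)
public class AnnotatedToolSketch {

    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        // discovers the @ToolSpec methods on the referenced providers and registers them
        ollamaAPI.registerAnnotatedTools();
        // ... issue chat requests as usual; the registered tools are now available ...
    }
}
```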
Let's try an example. Consider an `OllamaToolService` class that needs to ask the LLM a question that can only be answered by a specific tool.
This tool is implemented within a `GlobalConstantGenerator` class. Following is the code that exposes an annotated method as a tool:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/annotated/GlobalConstantGenerator.java"/>
The annotated method can then be used as a tool in the chat session:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/annotated/AnnotatedToolCallingExample.java"/>
Running the above would produce a response similar to:
::::tip[LLM Response]
> First answer: 0.0000112061 is the most important constant in the world using 10 digits, according to my function. This constant is known as Planck's constant and plays a fundamental role in quantum mechanics. It relates energy and frequency in electromagnetic radiation and action (the product of momentum and distance) for particles.
>
> Second answer: 3-digit constant: 8.001
::::

View File

@ -9,262 +9,93 @@ import CodeEmbed from '@site/src/components/CodeEmbed';
This API lets you create a conversation with LLMs. It enables you to ask follow-up questions that build on the history
of previously asked questions and their answers.
### Create a new conversation and use chat history to augment follow up questions
## Create a new conversation and use chat history to augment follow up questions
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);

        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAMA2);

        // create first user question
        OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER, "What is the capital of France?")
                .build();

        // start conversation with model
        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
        System.out.println("First answer: " + chatResult.getResponseModel().getMessage().getContent());

        // create next userQuestion
        requestModel = builder.withMessages(chatResult.getChatHistory()).withMessage(OllamaChatMessageRole.USER, "And what is the second largest city?").build();

        // "continue" conversation with model
        chatResult = ollamaAPI.chat(requestModel);
        System.out.println("Second answer: " + chatResult.getResponseModel().getMessage().getContent());

        System.out.println("Chat History: " + chatResult.getChatHistory());
    }
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatExample.java" />
You will get a response similar to:
> First answer: Should be Paris!
::::tip[LLM Response]
> First answer: The capital of France is Paris.
>
> Second answer: Marseille.
> Second answer: The second-largest city in France is Marseille.
>
> Chat History:
```json
[
{
"role": "user",
"content": "What is the capital of France?",
"images": []
},
{
"role": "assistant",
"content": "Should be Paris!",
"images": []
},
{
"role": "user",
"content": "And what is the second largest city?",
"images": []
},
{
"role": "assistant",
"content": "Marseille.",
"images": []
}
]
[{
"role" : "user",
"content" : "What is the capital of France?",
"images" : null,
"tool_calls" : [ ]
}, {
"role" : "assistant",
"content" : "The capital of France is Paris.",
"images" : null,
"tool_calls" : null
}, {
"role" : "user",
"content" : "And what is the second largest city?",
"images" : null,
"tool_calls" : [ ]
}, {
"role" : "assistant",
"content" : "The second-largest city in France is Marseille.",
"images" : null,
"tool_calls" : null
}]
```
::::
## Conversational loop
```java
import java.util.List;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessage;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;

public class Main {

    public static void main(String[] args) {
        OllamaAPI ollamaAPI = new OllamaAPI();
        ollamaAPI.setRequestTimeoutSeconds(60);

        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("<your-model>");
        OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER, "<your-first-message>").build();
        OllamaChatResult initialChatResult = ollamaAPI.chat(requestModel);
        System.out.println(initialChatResult.getResponse());

        List<OllamaChatMessage> history = initialChatResult.getChatHistory();

        while (true) {
            OllamaChatResult chatResult = ollamaAPI.chat(builder.withMessages(history).withMessage(OllamaChatMessageRole.USER, "<your-new-message>").build());
            System.out.println(chatResult.getResponse());
            history = chatResult.getChatHistory();
        }
    }
}
```
## Create a conversation where the answer is streamed
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.models.generate.OllamaStreamHandler;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);

        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("<your-model>");
        OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER,
                        "What is the capital of France? And what's France's connection with Mona Lisa?")
                .build();

        // define a handler (Consumer<String>)
        OllamaStreamHandler streamHandler = (s) -> {
            System.out.println(s);
        };

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel, streamHandler);
    }
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatStreamingWithTokenConcatenationExample.java" />
You will get a response similar to:
::::tip[LLM Response]
>
> The
>
> The capital
>
> The capital of
>
> The capital of France
>
> The capital of France is
>
> The capital of France is Paris
>
> The capital of France is Paris.
>
::::
## Use a simple Console Output Stream Handler
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.impl.ConsoleOutputStreamHandler;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.generate.OllamaStreamHandler;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);

        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAMA2);
        OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER, "List all cricket world cup teams of 2019. Name the teams!")
                .build();
        OllamaStreamHandler streamHandler = new ConsoleOutputStreamHandler();
        ollamaAPI.chat(requestModel, streamHandler);
    }
}
```
### Using a simple Console Output Stream Handler
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ConsoleOutputStreamHandlerExample.java" />
### With a Stream Handler to receive the tokens as they are generated
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatStreamingExample.java" />
## Create a new conversation with individual system prompt
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAMA2);

        // create request with system prompt (overriding the model defaults) and user question
        OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.SYSTEM, "You are a silent bot that only says 'NI'. Do not say anything else under any circumstances!")
                .withMessage(OllamaChatMessageRole.USER, "What is the capital of France? And what's France's connection with Mona Lisa?")
                .build();

        // start conversation with model
        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
        System.out.println(chatResult.getResponseModel());
    }
}
```
You will get a response similar to:
> NI.
### Create a new conversation with custom system prompt
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatWithCustomSystemPrompt.java"/>
You will get a response as:
::::tip[LLM Response]
> Shhh!
::::
## Create a conversation about an image (requires model with image recognition skills)
## Create a conversation about an image (requires a vision model)
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.types.OllamaModelType;

import java.io.File;
import java.util.List;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAVA);

        // Load image from file and attach it to the user message (alternatively, images can also be added via URL)
        OllamaChatRequest requestModel =
                builder.withMessage(OllamaChatMessageRole.USER, "What's in the picture?",
                        List.of(
                                new File("/path/to/image"))).build();

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
        System.out.println("First answer: " + chatResult.getResponseModel());

        builder.reset();

        // Use history to ask further questions about the image or assistant answer
        requestModel =
                builder.withMessages(chatResult.getChatHistory())
                        .withMessage(OllamaChatMessageRole.USER, "What's the dog's breed?").build();

        chatResult = ollamaAPI.chat(requestModel);
        System.out.println("Second answer: " + chatResult.getResponseModel());
    }
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatWithImage.java" />
You will get a response similar to:
::::tip[LLM Response]
> First Answer: The image shows a dog sitting on the bow of a boat that is docked in calm water. The boat has two
> levels, with the lower level containing seating and what appears to be an engine cover. The dog seems relaxed and
> comfortable on the boat, looking out over the water. The background suggests it might be late afternoon or early
@ -274,5 +105,4 @@ You will get a response similar to:
> appears to be medium-sized with a short coat and a brown coloration, which might suggest that it is a Golden Retriever
> or a similar breed. Without more details like ear shape and tail length, it's not possible to identify the exact breed
> confidently.
::::
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatExample.java"/>

View File

@ -1,5 +1,5 @@
---
sidebar_position: 8
sidebar_position: 9
---
# Custom Roles

View File

@ -2,7 +2,9 @@
sidebar_position: 2
---
# Generate - Async
import CodeEmbed from '@site/src/components/CodeEmbed';
# Generate (Async)
This API lets you ask questions to the LLMs in an asynchronous way.
This is particularly helpful when you want to issue a generate request to the LLM and collect the response in the
@ -11,38 +13,18 @@ background (such as threads) without blocking your code until the response arriv
This API corresponds to
the [completion](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) API.
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaAsyncResultStreamer;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        ollamaAPI.setRequestTimeoutSeconds(60);
        String prompt = "List all cricket world cup teams of 2019.";
        OllamaAsyncResultStreamer streamer = ollamaAPI.generateAsync(OllamaModelType.LLAMA3, prompt, false);

        // Set the poll interval according to your needs.
        // The smaller the poll interval, the more frequently you receive the tokens.
        int pollIntervalMilliseconds = 1000;

        while (true) {
            String tokens = streamer.getStream().poll();
            System.out.print(tokens);
            if (!streamer.isAlive()) {
                break;
            }
            Thread.sleep(pollIntervalMilliseconds);
        }

        System.out.println("\n------------------------");
        System.out.println("Complete Response:");
        System.out.println("------------------------");
        System.out.println(streamer.getCompleteResponse());
    }
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateAsync.java" />
::::tip[LLM Response]
Here are the participating teams in the 2019 ICC Cricket World Cup:
1. Australia
2. Bangladesh
3. India
4. New Zealand
5. Pakistan
6. England
7. South Africa
8. West Indies (as a team)
9. Afghanistan
::::
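Because the streamer runs the request in the background, the polling loop can also live on a separate thread while the main thread does other work. A minimal sketch (the model name and poll interval are arbitrary choices):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaAsyncResultStreamer;

public class GenerateAsyncInBackground {

    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        OllamaAsyncResultStreamer streamer = ollamaAPI.generateAsync("llama3", "List all cricket world cup teams of 2019.", false);

        // consume tokens on a separate thread so the main thread stays free
        Thread consumer = new Thread(() -> {
            try {
                while (streamer.isAlive()) {
                    String tokens = streamer.getStream().poll();
                    if (tokens != null) {
                        System.out.print(tokens);
                    }
                    Thread.sleep(250);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        // ... do other work here while the response is being generated ...

        consumer.join();
        System.out.println("\nComplete response: " + streamer.getCompleteResponse());
    }
}
```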

File diff suppressed because one or more lines are too long

View File

@ -1,8 +1,10 @@
---
sidebar_position: 4
sidebar_position: 3
---
# Generate - With Image Files
import CodeEmbed from '@site/src/components/CodeEmbed';
# Generate with Image Files
This API lets you ask questions to the LLMs along with image files.
This API corresponds to
@ -21,34 +23,11 @@ If you have this image downloaded and you pass the path to the downloaded image
![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.types.OllamaModelType;
import io.github.ollama4j.utils.OptionsBuilder;

import java.io.File;
import java.util.List;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        ollamaAPI.setRequestTimeoutSeconds(10);

        OllamaResult result = ollamaAPI.generateWithImageFiles(OllamaModelType.LLAVA,
                "What's in this image?",
                List.of(
                        new File("/path/to/image")),
                new OptionsBuilder().build()
        );
        System.out.println(result.getResponse());
    }
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageFile.java" />
You will get a response similar to:
::::tip[LLM Response]
> This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
> be enjoying its time outdoors, perhaps on a lake.
::::

View File

@ -1,8 +1,10 @@
---
sidebar_position: 5
sidebar_position: 4
---
# Generate - With Image URLs
import CodeEmbed from '@site/src/components/CodeEmbed';
# Generate with Image URLs
This API lets you ask questions to the LLMs along with images passed as URLs.
This API corresponds to
@ -21,33 +23,11 @@ Passing the link of this image the following code:
![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.types.OllamaModelType;
import io.github.ollama4j.utils.OptionsBuilder;

import java.util.List;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        ollamaAPI.setRequestTimeoutSeconds(10);

        OllamaResult result = ollamaAPI.generateWithImageURLs(OllamaModelType.LLAVA,
                "What's in this image?",
                List.of(
                        "https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg"),
                new OptionsBuilder().build()
        );
        System.out.println(result.getResponse());
    }
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageURL.java" />
You will get a response similar to:
::::tip[LLM Response]
> This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
> be enjoying its time outdoors, perhaps on a lake.
::::

View File

@ -1,10 +1,12 @@
---
sidebar_position: 3
sidebar_position: 6
---
# Generate - With Tools
import CodeEmbed from '@site/src/components/CodeEmbed';
This API lets you perform [function calling](https://docs.mistral.ai/capabilities/function_calling/) using LLMs in a
# Generate with Tools
This API lets you perform [tool/function calling](https://docs.mistral.ai/capabilities/function_calling/) using LLMs in a
synchronous way.
This API corresponds to
the [generate](https://github.com/ollama/ollama/blob/main/docs/api.md#request-raw-mode) API with `raw` mode.
@ -19,472 +21,61 @@ in the future if tooling is supported for more models with a generic interaction
:::
### Function Calling/Tools
## Tools/Function Calling
Assume you want to call a method in your code based on the response generated from the model.
Assume you want to call a method/function in your code based on the response generated from the model.
For instance, let's say that based on a user's question, you'd want to identify a transaction and get the details of the
transaction from your database and respond to the user with the transaction details.
You can do that with ease using the `function calling` capabilities of the models by registering your `tools`.
### Create Functions
### Create Tools/Functions
We can create static functions as our tools.
This function takes the arguments `location` and `fuelType`, performs an operation with them, and returns the fuel
price.
```java
public static String getCurrentFuelPrice(Map<String, Object> arguments) {
    String location = arguments.get("location").toString();
    String fuelType = arguments.get("fuelType").toString();
    return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/FuelPriceTool.java"/>
This function takes the argument `city`, performs an operation with it, and returns the weather for the location.
```java
public static String getCurrentWeather(Map<String, Object> arguments) {
    String location = arguments.get("city").toString();
    return "Currently " + location + "'s weather is nice.";
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/WeatherTool.java"/>
Another way to create tools is by writing classes that implement `ToolFunction`.
This function takes the argument `employee-name`, performs an operation with it, and returns the employee details.
```java
class DBQueryFunction implements ToolFunction {
    @Override
    public Object apply(Map<String, Object> arguments) {
        if (arguments == null || arguments.isEmpty() || arguments.get("employee-name") == null || arguments.get("employee-address") == null || arguments.get("employee-phone") == null) {
            throw new RuntimeException("Tool was called but the model failed to provide all the required arguments.");
        }
        // perform DB operations here
        return String.format("Employee Details {ID: %s, Name: %s, Address: %s, Phone: %s}", UUID.randomUUID(), arguments.get("employee-name").toString(), arguments.get("employee-address").toString(), arguments.get("employee-phone").toString());
    }
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/DBQueryFunction.java"/>
### Define Tool Specifications
Let's define a sample tool specification called **Fuel Price Tool** for getting the current fuel price.
- Specify the function `name`, `description`, and `required` properties (`location` and `fuelType`).
- Associate the `getCurrentFuelPrice` function you defined earlier with `SampleTools::getCurrentFuelPrice`.
- Associate the `getCurrentFuelPrice` function you defined earlier.
```java
Tools.ToolSpecification fuelPriceToolSpecification = Tools.ToolSpecification.builder()
.functionName("current-fuel-price")
.functionDescription("Get current fuel price")
.toolFunction(SampleTools::getCurrentFuelPrice)
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-location-fuel-info")
.description("Get location and fuel type details")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"location", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The city, e.g. New Delhi, India")
.required(true)
.build(),
"fuelType", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The fuel type.")
.enumValues(Arrays.asList("petrol", "diesel"))
.required(true)
.build()
)
)
.required(java.util.List.of("location", "fuelType"))
.build()
)
.build()
)
.build()
).build();
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/FuelPriceToolSpec.java"/>
Let's also define a sample tool specification called **Weather Tool** for getting the current weather.
- Specify the function `name`, `description`, and `required` property (`city`).
- Associate the `getCurrentWeather` function you defined earlier with `SampleTools::getCurrentWeather`.
- Associate the `getCurrentWeather` function you defined earlier.
```java
Tools.ToolSpecification weatherToolSpecification = Tools.ToolSpecification.builder()
.functionName("current-weather")
.functionDescription("Get current weather")
.toolFunction(SampleTools::getCurrentWeather)
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-location-weather-info")
.description("Get location details")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"city", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The city, e.g. New Delhi, India")
.required(true)
.build()
)
)
.required(java.util.List.of("city"))
.build()
)
.build()
)
.build()
).build();
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/WeatherToolSpec.java"/>
Let's also define a sample tool specification called **DBQueryFunction** for getting employee details from the database.
- Specify the function `name`, `description`, and `required` property (`employee-name`).
- Associate the `DBQueryFunction` ToolFunction you defined earlier by instantiating it with `new DBQueryFunction()`.
```java
Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
.functionName("get-employee-details")
.functionDescription("Get employee details from the database")
.toolFunction(new DBQueryFunction())
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-employee-details")
.description("Get employee details from the database")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"employee-name", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The name of the employee, e.g. John Doe")
.required(true)
.build(),
"employee-address", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India")
.required(true)
.build(),
"employee-phone", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The phone number of the employee. Always return a random value. e.g. 9911002233")
.required(true)
.build()
)
)
.required(java.util.List.of("employee-name", "employee-address", "employee-phone"))
.build()
)
.build()
)
.build()
)
.build();
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/DatabaseQueryToolSpec.java"/>
### Register the Tools
Now put it all together by registering the tools and prompting with tools.
Register the defined tools (`fuel price`, `weather`, and `employee details`) with the OllamaAPI.
```java
ollamaAPI.registerTool(fuelPriceToolSpecification);
ollamaAPI.registerTool(weatherToolSpecification);
ollamaAPI.registerTool(databaseQueryToolSpecification);
```
### Create prompt with Tools
`Prompt 1`: Create a prompt asking for the petrol price in Bengaluru using the defined fuel price and weather tools.
```java
String prompt1 = new Tools.PromptBuilder()
        .withToolSpecification(fuelPriceToolSpecification)
        .withToolSpecification(weatherToolSpecification)
        .withPrompt("What is the petrol price in Bengaluru?")
        .build();

OllamaToolsResult toolsResult = ollamaAPI.generateWithTools(model, prompt1, new OptionsBuilder().build());

for (OllamaToolsResult.ToolResult r : toolsResult.getToolResults()) {
    System.out.printf("[Result of executing tool '%s']: %s%n", r.getFunctionName(), r.getResult().toString());
}
```
Now, fire away your question to the model.
You will get a response similar to:
::::tip[LLM Response]
[Result of executing tool 'current-fuel-price']: Current price of petrol in Bengaluru is Rs.103/L
::::
`Prompt 2`: Create a prompt asking for the current weather in Bengaluru using the same tools.
```java
String prompt2 = new Tools.PromptBuilder()
        .withToolSpecification(fuelPriceToolSpecification)
        .withToolSpecification(weatherToolSpecification)
        .withPrompt("What is the current weather in Bengaluru?")
        .build();

OllamaToolsResult toolsResult = ollamaAPI.generateWithTools(model, prompt2, new OptionsBuilder().build());

for (OllamaToolsResult.ToolResult r : toolsResult.getToolResults()) {
    System.out.printf("[Result of executing tool '%s']: %s%n", r.getFunctionName(), r.getResult().toString());
}
```
Again, fire away your question to the model.
You will get a response similar to:
::::tip[LLM Response]
[Result of executing tool 'current-weather']: Currently Bengaluru's weather is nice.
::::
`Prompt 3`: Create a prompt asking for the employee details using the defined database fetcher tools.
```java
String prompt3 = new Tools.PromptBuilder()
        .withToolSpecification(fuelPriceToolSpecification)
        .withToolSpecification(weatherToolSpecification)
        .withToolSpecification(databaseQueryToolSpecification)
        .withPrompt("Give me the details of the employee named 'Rahul Kumar'?")
        .build();

OllamaToolsResult toolsResult = ollamaAPI.generateWithTools(model, prompt3, new OptionsBuilder().build());

for (OllamaToolsResult.ToolResult r : toolsResult.getToolResults()) {
    System.out.printf("[Result of executing tool '%s']: %s%n", r.getFunctionName(), r.getResult().toString());
}
```
Again, fire away your question to the model.
You will get a response similar to:
::::tip[LLM Response]
[Result of executing tool 'get-employee-details']: Employee Details `{ID: 6bad82e6-b1a1-458f-a139-e3b646e092b1, Name:
Rahul Kumar, Address: King St, Hyderabad, India, Phone: 9876543210}`
::::
### Full Example
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.exceptions.OllamaBaseException;
import io.github.ollama4j.exceptions.ToolInvocationException;
import io.github.ollama4j.tools.OllamaToolsResult;
import io.github.ollama4j.tools.ToolFunction;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OptionsBuilder;
import java.io.IOException;
import java.util.Arrays;
import java.util.Map;
import java.util.UUID;
public class FunctionCallingWithMistralExample {
public static void main(String[] args) throws Exception {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
ollamaAPI.setRequestTimeoutSeconds(60);
String model = "mistral";
Tools.ToolSpecification fuelPriceToolSpecification = Tools.ToolSpecification.builder()
.functionName("current-fuel-price")
.functionDescription("Get current fuel price")
.toolFunction(SampleTools::getCurrentFuelPrice)
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-location-fuel-info")
.description("Get location and fuel type details")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"location", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The city, e.g. New Delhi, India")
.required(true)
.build(),
"fuelType", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The fuel type.")
.enumValues(Arrays.asList("petrol", "diesel"))
.required(true)
.build()
)
)
.required(java.util.List.of("location", "fuelType"))
.build()
)
.build()
)
.build()
).build();
Tools.ToolSpecification weatherToolSpecification = Tools.ToolSpecification.builder()
.functionName("current-weather")
.functionDescription("Get current weather")
.toolFunction(SampleTools::getCurrentWeather)
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-location-weather-info")
.description("Get location details")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"city", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The city, e.g. New Delhi, India")
.required(true)
.build()
)
)
.required(java.util.List.of("city"))
.build()
)
.build()
)
.build()
).build();
Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
.functionName("get-employee-details")
.functionDescription("Get employee details from the database")
.toolFunction(new DBQueryFunction())
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-employee-details")
.description("Get employee details from the database")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"employee-name", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The name of the employee, e.g. John Doe")
.required(true)
.build(),
"employee-address", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India")
.required(true)
.build(),
"employee-phone", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The phone number of the employee. Always return a random value. e.g. 9911002233")
.required(true)
.build()
)
)
.required(java.util.List.of("employee-name", "employee-address", "employee-phone"))
.build()
)
.build()
)
.build()
)
.build();
ollamaAPI.registerTool(fuelPriceToolSpecification);
ollamaAPI.registerTool(weatherToolSpecification);
ollamaAPI.registerTool(databaseQueryToolSpecification);
String prompt1 = new Tools.PromptBuilder()
.withToolSpecification(fuelPriceToolSpecification)
.withToolSpecification(weatherToolSpecification)
.withPrompt("What is the petrol price in Bengaluru?")
.build();
ask(ollamaAPI, model, prompt1);
String prompt2 = new Tools.PromptBuilder()
.withToolSpecification(fuelPriceToolSpecification)
.withToolSpecification(weatherToolSpecification)
.withPrompt("What is the current weather in Bengaluru?")
.build();
ask(ollamaAPI, model, prompt2);
String prompt3 = new Tools.PromptBuilder()
.withToolSpecification(fuelPriceToolSpecification)
.withToolSpecification(weatherToolSpecification)
.withToolSpecification(databaseQueryToolSpecification)
.withPrompt("Give me the details of the employee named 'Rahul Kumar'?")
.build();
ask(ollamaAPI, model, prompt3);
}
public static void ask(OllamaAPI ollamaAPI, String model, String prompt) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException {
OllamaToolsResult toolsResult = ollamaAPI.generateWithTools(model, prompt, new OptionsBuilder().build());
for (OllamaToolsResult.ToolResult r : toolsResult.getToolResults()) {
System.out.printf("[Result of executing tool '%s']: %s%n", r.getFunctionName(), r.getResult().toString());
}
}
}
class SampleTools {
public static String getCurrentFuelPrice(Map<String, Object> arguments) {
// Get details from fuel price API
String location = arguments.get("location").toString();
String fuelType = arguments.get("fuelType").toString();
return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
}
public static String getCurrentWeather(Map<String, Object> arguments) {
// Get details from weather API
String location = arguments.get("city").toString();
return "Currently " + location + "'s weather is nice.";
}
}
class DBQueryFunction implements ToolFunction {
@Override
public Object apply(Map<String, Object> arguments) {
if (arguments == null || arguments.isEmpty() || arguments.get("employee-name") == null || arguments.get("employee-address") == null || arguments.get("employee-phone") == null) {
throw new RuntimeException("Tool was called but the model failed to provide all the required arguments.");
}
// perform DB operations here
return String.format("Employee Details {ID: %s, Name: %s, Address: %s, Phone: %s}", UUID.randomUUID(), arguments.get("employee-name").toString(), arguments.get("employee-address").toString(), arguments.get("employee-phone").toString());
}
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/MultiToolRegistryExample.java"/>
Run this full example and you will get a response similar to:
@ -498,299 +89,3 @@ Run this full example and you will get a response similar to:
Rahul Kumar, Address: King St, Hyderabad, India, Phone: 9876543210}`
::::
### Using tools in the Chat API
Instead of using the specific `ollamaAPI.generateWithTools` method to call Ollama's generate API with tools, it is
also possible to register tools for the `ollamaAPI.chat` methods. In this case, tool calling and the callback are
handled implicitly during the USER -> ASSISTANT exchange.
When the assistant wants to call a given tool, the tool is executed and the conversation, augmented with the tool call
result, is sent back to the endpoint once again.
#### Sample:
The following is a sample of an integration test that defines a method specified like the tool-specs above, registers
the tool with the `OllamaAPI`, and then simply calls the chat API. All intermediate tool calling is handled inside the
API call.
```java
import java.util.List;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.tools.Tools;

public class Main {

public static void main(String[] args) throws Exception {
OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
ollamaAPI.setVerbose(true);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("llama3.2:1b");
final Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
.functionName("get-employee-details")
.functionDescription("Get employee details from the database")
.toolPrompt(
Tools.PromptFuncDefinition.builder().type("function").function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-employee-details")
.description("Get employee details from the database")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
new Tools.PropsBuilder()
.withProperty("employee-name", Tools.PromptFuncDefinition.Property.builder().type("string").description("The name of the employee, e.g. John Doe").required(true).build())
.withProperty("employee-address", Tools.PromptFuncDefinition.Property.builder().type("string").description("The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India").required(true).build())
.withProperty("employee-phone", Tools.PromptFuncDefinition.Property.builder().type("string").description("The phone number of the employee. Always return a random value. e.g. 9911002233").required(true).build())
.build()
)
.required(List.of("employee-name"))
.build()
).build()
).build()
)
.toolFunction(new DBQueryFunction())
.build();
ollamaAPI.registerTool(databaseQueryToolSpecification);
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER,
"Give me the ID of the employee named 'Rahul Kumar'?")
.build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
}
}
```
A typical final response of the above could be:
```json
{
"chatHistory" : [
{
"role" : "user",
"content" : "Give me the ID of the employee named 'Rahul Kumar'?",
"images" : null,
"tool_calls" : [ ]
}, {
"role" : "assistant",
"content" : "",
"images" : null,
"tool_calls" : [ {
"function" : {
"name" : "get-employee-details",
"arguments" : {
"employee-name" : "Rahul Kumar"
}
}
} ]
}, {
"role" : "tool",
"content" : "[TOOL_RESULTS]get-employee-details([employee-name]) : Employee Details {ID: b4bf186c-2ee1-44cc-8856-53b8b6a50f85, Name: Rahul Kumar, Address: null, Phone: null}[/TOOL_RESULTS]",
"images" : null,
"tool_calls" : null
}, {
"role" : "assistant",
"content" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
"images" : null,
"tool_calls" : null
} ],
"responseModel" : {
"model" : "llama3.2:1b",
"message" : {
"role" : "assistant",
"content" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
"images" : null,
"tool_calls" : null
},
"done" : true,
"error" : null,
"context" : null,
"created_at" : "2024-12-09T22:23:00.4940078Z",
"done_reason" : "stop",
"total_duration" : 2313709900,
"load_duration" : 14494700,
"prompt_eval_duration" : 772000000,
"eval_duration" : 1188000000,
"prompt_eval_count" : 166,
"eval_count" : 41
},
"response" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
"httpStatusCode" : 200,
"responseTime" : 2313709900
}
```
This tool calling can also be done using the streaming API.
### Using Annotation-based Tool Registration
Instead of explicitly registering each tool, ollama4j supports declarative tool specification and registration via Java
annotations and reflection.
To declare a method to be used as a tool for a chat call, the following steps have to be taken:
* Annotate a method and its parameters to be used as a tool
* Annotate a method with the `ToolSpec` annotation
* Annotate the method's parameters with the `ToolProperty` annotation. Only the following datatypes are supported for now:
* `java.lang.String`
* `java.lang.Integer`
* `java.lang.Boolean`
* `java.math.BigDecimal`
* Annotate the class that calls the `OllamaAPI` client with the `OllamaToolService` annotation, referencing the desired provider-classes that contain `ToolSpec` methods.
* Before issuing the `OllamaAPI` chat request, call the `OllamaAPI.registerAnnotatedTools()` method to add the tools to the chat.
#### Example
Let's say we have an ollama4j service class that should ask an LLM a question that can only be answered by a specific tool.
The answer can only be provided by a method that is part of the `BackendService` class. To expose this method as a tool for the LLM, the following annotations can be used:
```java
import java.math.BigDecimal;

import io.github.ollama4j.tools.annotations.ToolProperty;
import io.github.ollama4j.tools.annotations.ToolSpec;

public class BackendService {

    public BackendService() {}

    @ToolSpec(desc = "Computes the most important constant all around the globe!")
    public String computeMkeConstant(@ToolProperty(name = "noOfDigits", desc = "Number of digits that shall be returned") Integer noOfDigits) {
        return BigDecimal.valueOf((long) (Math.random() * 1000000L), noOfDigits).toString();
    }
}
```
The caller API can then be written as:
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.tools.annotations.OllamaToolService;

@OllamaToolService(providers = BackendService.class)
public class MyOllamaService {

    public void chatWithAnnotatedTool() throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("llama3.2:1b");

        // inject the annotated method into the ollama tools registry
        ollamaAPI.registerAnnotatedTools();

        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER,
                        "Compute the most important constant in the world using 5 digits")
                .build();

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
    }
}
```
Or, if one needs to provide an object instance directly:
```java
public class MyOllamaService {

    public void chatWithAnnotatedTool() throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("llama3.2:1b");

        ollamaAPI.registerAnnotatedTools(new BackendService());

        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER,
                        "Compute the most important constant in the world using 5 digits")
                .build();

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
    }
}
```
The request should be the following:
```json
{
"model" : "llama3.2:1b",
"stream" : false,
"messages" : [ {
"role" : "user",
"content" : "Compute the most important constant in the world using 5 digits",
"images" : null,
"tool_calls" : [ ]
} ],
"tools" : [ {
"type" : "function",
"function" : {
"name" : "computeImportantConstant",
"description" : "Computes the most important constant all around the globe!",
"parameters" : {
"type" : "object",
"properties" : {
"noOfDigits" : {
"type" : "java.lang.Integer",
"description" : "Number of digits that shall be returned"
}
},
"required" : [ "noOfDigits" ]
}
}
} ]
}
```
The result could be something like the following:
```json
{
"chatHistory" : [ {
"role" : "user",
"content" : "Compute the most important constant in the world using 5 digits",
"images" : null,
"tool_calls" : [ ]
}, {
"role" : "assistant",
"content" : "",
"images" : null,
"tool_calls" : [ {
"function" : {
"name" : "computeImportantConstant",
"arguments" : {
"noOfDigits" : "5"
}
}
} ]
}, {
"role" : "tool",
"content" : "[TOOL_RESULTS]computeImportantConstant([noOfDigits]) : 1.51019[/TOOL_RESULTS]",
"images" : null,
"tool_calls" : null
}, {
"role" : "assistant",
"content" : "The most important constant in the world with 5 digits is: **1.51019**",
"images" : null,
"tool_calls" : null
} ],
"responseModel" : {
"model" : "llama3.2:1b",
"message" : {
"role" : "assistant",
"content" : "The most important constant in the world with 5 digits is: **1.51019**",
"images" : null,
"tool_calls" : null
},
"done" : true,
"error" : null,
"context" : null,
"created_at" : "2024-12-27T21:55:39.3232495Z",
"done_reason" : "stop",
"total_duration" : 1075444300,
"load_duration" : 13558600,
"prompt_eval_duration" : 509000000,
"eval_duration" : 550000000,
"prompt_eval_count" : 124,
"eval_count" : 20
},
"response" : "The most important constant in the world with 5 digits is: **1.51019**",
"responseTime" : 1075444300,
"httpStatusCode" : 200
}
```
### Potential Improvements
Instead of passing a map of args `Map<String, Object> arguments` to the tool functions, we could support passing
specific args separately with their data types. For example:
```java
public String getCurrentFuelPrice(String location, String fuelType) {
    return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
}
```
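Until such typed signatures are supported, a thin adapter can offer the same ergonomics on top of the existing `Map`-based contract (a sketch, using only the contract shown above; the class name is hypothetical):

```java
import java.util.Map;

class TypedFuelPriceTool {

    // entry point matching the current Map-based tool contract
    public static String getCurrentFuelPrice(Map<String, Object> arguments) {
        return getCurrentFuelPrice(
                arguments.get("location").toString(),
                arguments.get("fuelType").toString());
    }

    // typed variant that the adapter above delegates to
    public static String getCurrentFuelPrice(String location, String fuelType) {
        return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
    }
}
```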
Another potential improvement is updating the async/chat APIs with support for tool-based generation.

View File

@ -4,7 +4,7 @@ sidebar_position: 1
import CodeEmbed from '@site/src/components/CodeEmbed';
# Generate - Sync
# Generate (Sync)
This API lets you ask questions to the LLMs in a synchronous way.
This API corresponds to
@ -15,21 +15,24 @@ with [extra parameters](https://github.com/jmorganca/ollama/blob/main/docs/model
Refer
to [this](/apis-extras/options-builder).
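For example, extra parameters can be passed to `generate` via an `Options` object built with the `OptionsBuilder`. A minimal sketch (the `Options` type and the `setTemperature` setter are assumed from the options-builder API; the model name is an arbitrary choice):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.Options;
import io.github.ollama4j.utils.OptionsBuilder;

public class GenerateWithOptions {

    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");

        // build extra model parameters; setTemperature is assumed from the OptionsBuilder API
        Options options = new OptionsBuilder().setTemperature(0.8f).build();

        OllamaResult result = ollamaAPI.generate("llama3.2", "Why is the sky blue?", options);
        System.out.println(result.getResponse());
    }
}
```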
## Try asking a question about the model
### Try asking a question about the model
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/Generate.java" />
You will get a response similar to:
::::tip[LLM Response]
> I am a large language model created by Alibaba Cloud. My purpose is to assist users in generating text, answering
> questions, and completing tasks. I aim to be user-friendly and easy to understand for everyone who interacts with me.
::::
## Try asking a question, receiving the answer streamed
### Try asking a question, receiving the answer streamed
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateStreamingWithTokenConcatenation.java" />
You will get a response similar to:
::::tip[LLM Response]
> The
>
> The capital
@ -43,210 +46,29 @@ You will get a response similar to:
> The capital of France is Paris
>
> The capital of France is Paris.
## Try asking a question about general topics
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.types.OllamaModelType;
import io.github.ollama4j.utils.OptionsBuilder;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);

        String prompt = "List all cricket world cup teams of 2019.";
        OllamaResult result =
                ollamaAPI.generate(OllamaModelType.LLAMA2, prompt, new OptionsBuilder().build());
        System.out.println(result.getResponse());
    }
}
```
You'd then get a response from the model:
> The 2019 ICC Cricket World Cup was held in England and Wales from May 30 to July 14, 2019. The
> following teams
> participated in the tournament:
>
> 1. Afghanistan
> 2. Australia
> 3. Bangladesh
> 4. England
> 5. India
> 6. New Zealand
> 7. Pakistan
> 8. South Africa
> 9. Sri Lanka
> 10. West Indies
>
> These teams competed in a round-robin format, with the top four teams advancing to the
> semi-finals. The tournament was
> won by the England cricket team, who defeated New Zealand in the final.
## Try asking for a Database query for your data schema
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.types.OllamaModelType;
import io.github.ollama4j.utils.OptionsBuilder;
import io.github.ollama4j.utils.SamplePrompts;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);

        String prompt =
                SamplePrompts.getSampleDatabasePromptWithQuestion(
                        "List all customer names who have bought one or more products");
        OllamaResult result =
                ollamaAPI.generate(OllamaModelType.SQLCODER, prompt, new OptionsBuilder().build());
        System.out.println(result.getResponse());
    }
}
```
_Note: Here I've used
a [sample prompt](https://github.com/ollama4j/ollama4j/blob/main/src/main/resources/sample-db-prompt-template.txt)
containing a database schema from within this library for demonstration purposes._
You'd then get a response from the model:
```sql
SELECT customers.name
FROM sales
JOIN customers ON sales.customer_id = customers.customer_id
GROUP BY customers.name;
```
::::
## Generate structured output
### With response as a `Map`
```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;

public class StructuredOutput {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI api = new OllamaAPI(host);

        String chatModel = "qwen2.5:0.5b";
        api.pullModel(chatModel);

        String prompt = "Ollama is 22 years old and is busy saving the world. Respond using JSON";

        Map<String, Object> format = new HashMap<>();
        format.put("type", "object");
        format.put("properties", new HashMap<String, Object>() {
            {
                put("age", new HashMap<String, Object>() {
                    {
                        put("type", "integer");
                    }
                });
                put("available", new HashMap<String, Object>() {
                    {
                        put("type", "boolean");
                    }
                });
            }
        });
        format.put("required", Arrays.asList("age", "available"));

        OllamaResult result = api.generate(chatModel, prompt, format);
        System.out.println(result);
    }
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/StructuredOutput.java" />
You will get a response similar to:
::::tip[LLM Response]
```json
{
    "available": true,
    "age": 22
}
```
::::
### With response mapped to specified class type
```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.Utilities;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

public class StructuredOutput {

    public static void main(String[] args) throws Exception {
        String host = Utilities.getFromConfig("host");
        OllamaAPI api = new OllamaAPI(host);

        String chatModel = "qwen2.5:0.5b";

        int age = 28;
        boolean available = false;

        String prompt = "Batman is " + age + " years old and is " + (available ? "available" : "not available")
                + " because he is busy saving Gotham City. Respond using JSON";

        Map<String, Object> format = new HashMap<>();
        format.put("type", "object");
        format.put("properties", new HashMap<String, Object>() {
            {
                put("age", new HashMap<String, Object>() {
                    {
                        put("type", "integer");
                    }
                });
                put("available", new HashMap<String, Object>() {
                    {
                        put("type", "boolean");
                    }
                });
            }
        });
        format.put("required", Arrays.asList("age", "available"));

        OllamaResult result = api.generate(chatModel, prompt, format);

        Person person = result.as(Person.class);
        System.out.println(person);
    }
}

@Data
@AllArgsConstructor
@NoArgsConstructor
class Person {
    private int age;
    private boolean available;
}
```
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/StructuredOutputMappedToObject.java" />
::::tip[LLM Response]
Person(age=28, available=false)
::::

View File

@ -1,5 +1,5 @@
---
sidebar_position: 6
sidebar_position: 10
---
# Prompt Builder