mirror of
https://github.com/amithkoujalgi/ollama4j.git
synced 2025-05-15 11:57:12 +02:00
Merge pull request #125 from ollama4j/docs-updates
All checks were successful
Mark stale issues / stale (push) Successful in 57s
Docs updates
This commit is contained in:
commit 63e2fc2c49

Makefile (4 changes)
@@ -29,10 +29,10 @@ list-releases:
 		--compressed \
 		--silent | jq -r '.components[].version'
 
-docs:
+docs-build:
 	npm i --prefix docs && npm run build --prefix docs
 
-docs-dev:
+docs-serve:
 	npm i --prefix docs && npm run start --prefix docs
 
 start-cpu:
@@ -1,9 +0,0 @@
----
-slug: welcome
-title: Welcome
-authors: [ amith ]
-tags: [ Java, AI, LLM, GenAI, GenerativeAI, Ollama, Ollama4J, OpenSource, Developers
-]
----
-
-Welcome Java Developers!
@@ -1,6 +1,6 @@
 ---
 slug: release-post
-title: Release
+title: First Release 🚀
 authors: [ amith ]
 tags: [ Java, AI, LLM, GenAI, GenerativeAI, Ollama, Ollama4j, OpenSource, Developers
 ]
docs/docs/apis-generate/chat-with-tools.md (new file, 69 lines)
@@ -0,0 +1,69 @@
---
sidebar_position: 8
---

import CodeEmbed from '@site/src/components/CodeEmbed';

# Chat with Tools

### Using Tools in Chat

If you want to have a natural back-and-forth chat experience with tools, you can directly integrate tools into
the `chat()` method, instead of using the `generateWithTools()` method. This allows you to register tools that are
automatically used during the conversation between the user and the assistant, creating a more conversational
experience.

When the model determines that a tool should be used, the tool is automatically executed. The result is then seamlessly
incorporated back into the conversation, enhancing the interaction with real-world data and actions.

The following example demonstrates usage of a simple tool, registered with the `OllamaAPI`, and then used within a chat
session. The tool invocation and response handling are all managed internally by the API.

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatWithTools.java"/>

::::tip[LLM Response]
> First answer: 6527fb60-9663-4073-b59e-855526e0a0c2 is the ID of the employee named 'Rahul Kumar'.
>
> Second answer: Kumar is the last name of the employee named 'Rahul Kumar'.
::::

This tool calling can also be done using the streaming API.

### Annotation-Based Tool Registration

Ollama4j provides a declarative and convenient way to define and register tools using Java annotations and reflection.
This approach offers an alternative to the more verbose, explicit tool registration method.

To use a method as a tool within a chat call, follow these steps:

* **Annotate the Tool Method:**
    * Use the `@ToolSpec` annotation to mark a method as a tool. This annotation describes the tool's purpose.
    * Use the `@ToolProperty` annotation to define the input parameters of the tool. The following data types are
      currently supported:
        * `java.lang.String`
        * `java.lang.Integer`
        * `java.lang.Boolean`
        * `java.math.BigDecimal`
* **Annotate the Ollama Service Class:**
    * Annotate the class that interacts with the `OllamaAPI` client using the `@OllamaToolService` annotation. Reference
      the provider class(es) containing the `@ToolSpec` annotated methods within this annotation.
* **Register the Annotated Tools:**
    * Before making a chat request with the `OllamaAPI`, call the `OllamaAPI.registerAnnotatedTools()` method. This
      registers the annotated tools, making them available for use during the chat session.

Let's try an example. Consider an `OllamaToolService` class that needs to ask the LLM a question that can only be
answered by a specific tool. This tool is implemented within a `GlobalConstantGenerator` class. Following is the code
that exposes an annotated method as a tool:

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/annotated/GlobalConstantGenerator.java"/>

The annotated method can then be used as a tool in the chat session:

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/annotated/AnnotatedToolCallingExample.java"/>

Running the above would produce a response similar to:

::::tip[LLM Response]
> First answer: 0.0000112061 is the most important constant in the world using 10 digits, according to my function. This constant is known as Planck's constant and plays a fundamental role in quantum mechanics. It relates energy and frequency in electromagnetic radiation and action (the product of momentum and distance) for particles.
>
> Second answer: 3-digit constant: 8.001
::::
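The annotate-scan-register flow described above can be illustrated without the library. The sketch below is not ollama4j's implementation: `ToolSpec` here is a hand-rolled stand-in annotation and the registry is a plain map, but the mechanism (scan a provider class via reflection, register annotated methods, invoke one by name) mirrors what a `registerAnnotatedTools()` step is described as doing.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;

public class AnnotatedToolScan {

    // Hand-rolled stand-in for the library's @ToolSpec annotation (illustration only).
    @Retention(RetentionPolicy.RUNTIME)
    @interface ToolSpec {
        String desc();
    }

    // A provider class exposing one annotated tool method.
    static class GlobalConstants {
        @ToolSpec(desc = "returns a fixed example constant")
        public static String getConstant() {
            return "0.0000112061";
        }
    }

    // Scan a provider class and register every @ToolSpec method by name,
    // roughly what an annotation-based registration step does.
    static Map<String, Method> registerAnnotatedTools(Class<?> provider) {
        Map<String, Method> registry = new HashMap<>();
        for (Method m : provider.getDeclaredMethods()) {
            if (m.isAnnotationPresent(ToolSpec.class)) {
                registry.put(m.getName(), m);
            }
        }
        return registry;
    }

    public static void main(String[] args) throws Exception {
        Map<String, Method> tools = registerAnnotatedTools(GlobalConstants.class);
        // When the model requests a tool by name, look it up and invoke it.
        Object result = tools.get("getConstant").invoke(null);
        System.out.println(result);
    }
}
```

The real library additionally builds a parameter schema from `@ToolProperty`, which this sketch omits.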
@@ -2,267 +2,100 @@
sidebar_position: 7
---

import CodeEmbed from '@site/src/components/CodeEmbed';

# Chat

This API lets you create a conversation with LLMs. Using this API enables you to ask questions to the model including
information using the history of already asked questions and the respective answers.

### Create a new conversation and use chat history to augment follow up questions

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatExample.java" />
For reference, the inline version of this example:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAMA2);

        // create the first user question
        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER, "What is the capital of France?")
                .build();

        // start the conversation with the model
        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
        System.out.println("First answer: " + chatResult.getResponseModel().getMessage().getContent());

        // create the next user question, passing the accumulated chat history
        requestModel = builder
                .withMessages(chatResult.getChatHistory())
                .withMessage(OllamaChatMessageRole.USER, "And what is the second largest city?")
                .build();

        // "continue" the conversation with the model
        chatResult = ollamaAPI.chat(requestModel);
        System.out.println("Second answer: " + chatResult.getResponseModel().getMessage().getContent());

        System.out.println("Chat History: " + chatResult.getChatHistory());
    }
}
```
You will get a response similar to:

::::tip[LLM Response]
> First answer: The capital of France is Paris.
>
> Second answer: The second-largest city in France is Marseille.
>
> Chat History:

```json
[ {
  "role" : "user",
  "content" : "What is the capital of France?",
  "images" : null,
  "tool_calls" : [ ]
}, {
  "role" : "assistant",
  "content" : "The capital of France is Paris.",
  "images" : null,
  "tool_calls" : null
}, {
  "role" : "user",
  "content" : "And what is the second largest city?",
  "images" : null,
  "tool_calls" : [ ]
}, {
  "role" : "assistant",
  "content" : "The second-largest city in France is Marseille.",
  "images" : null,
  "tool_calls" : null
} ]
```

::::
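The chat history shown in the JSON above is just an ordered list of role-tagged messages, and a follow-up request carries all prior turns plus the new user message. A minimal stand-in (plain Java; `Message` is a hypothetical record, not ollama4j's actual message type) sketches that shape:

```java
import java.util.ArrayList;
import java.util.List;

public class ChatHistoryDemo {

    // Hypothetical message type: role ("user"/"assistant") plus content.
    record Message(String role, String content) {}

    // Build the next request's message list: the full history plus the new user turn.
    static List<Message> withFollowUp(List<Message> history, String question) {
        List<Message> messages = new ArrayList<>(history);
        messages.add(new Message("user", question));
        return messages;
    }

    public static void main(String[] args) {
        List<Message> history = new ArrayList<>();
        history.add(new Message("user", "What is the capital of France?"));
        history.add(new Message("assistant", "The capital of France is Paris."));

        List<Message> next = withFollowUp(history, "And what is the second largest city?");
        System.out.println(next.size()); // 2 history turns + 1 new user turn
    }
}
```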
### Create a conversation where the answer is streamed

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatStreamingWithTokenConcatenationExample.java" />

For reference, the inline version of this example (using a concrete model type in place of an undefined `config` reference):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.models.generate.OllamaStreamHandler;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAMA2);
        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER,
                        "What is the capital of France? And what's France's connection with Mona Lisa?")
                .build();

        // define a handler (Consumer<String>) that receives the partial response as it grows
        OllamaStreamHandler streamHandler = (s) -> {
            System.out.println(s);
        };

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel, streamHandler);
    }
}
```

You will get a response similar to:

::::tip[LLM Response]
> The
>
> The capital
>
> The capital of
>
> The capital of France
>
> The capital of France is
>
> The capital of France is Paris
>
> The capital of France is Paris.
::::
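The token-by-token output above comes from a stream handler that is invoked with the partial response each time new tokens arrive. The mechanics can be sketched without a running model; the `Consumer<String>` shape matches the handler used above, while the token list here is simulated:

```java
import java.util.List;
import java.util.function.Consumer;

public class StreamConcatDemo {

    // Feed tokens to a handler, passing the cumulative partial response each time,
    // and return the complete response at the end.
    static String streamTokens(List<String> tokens, Consumer<String> handler) {
        StringBuilder partial = new StringBuilder();
        for (String token : tokens) {
            partial.append(token);
            handler.accept(partial.toString()); // handler sees "The", "The capital", ...
        }
        return partial.toString();
    }

    public static void main(String[] args) {
        List<String> simulated = List.of("The", " capital", " of", " France", " is", " Paris.");
        String complete = streamTokens(simulated, System.out::println);
        System.out.println("Complete: " + complete);
    }
}
```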
### Using a simple Console Output Stream Handler

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ConsoleOutputStreamHandlerExample.java" />

For reference, the inline version of this example:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.impl.ConsoleOutputStreamHandler;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.generate.OllamaStreamHandler;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);

        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAMA2);
        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER, "List all cricket world cup teams of 2019. Name the teams!")
                .build();
        OllamaStreamHandler streamHandler = new ConsoleOutputStreamHandler();
        ollamaAPI.chat(requestModel, streamHandler);
    }
}
```

### With a Stream Handler to receive the tokens as they are generated

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatStreamingExample.java" />
### Create a new conversation with custom system prompt

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatWithCustomSystemPrompt.java" />

You will get a response as:

::::tip[LLM Response]
> Shhh!
::::

An equivalent inline example, whose system prompt instead makes the model reply with "NI.":

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAMA2);

        // create a request with a system prompt (overriding the model defaults) and a user question
        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.SYSTEM,
                        "You are a silent bot that only says 'NI'. Do not say anything else under any circumstances!")
                .withMessage(OllamaChatMessageRole.USER,
                        "What is the capital of France? And what's France's connection with Mona Lisa?")
                .build();

        // start the conversation with the model
        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
        System.out.println(chatResult.getResponseModel());
    }
}
```

## Create a conversation about an image (requires a vision model)

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatWithImage.java" />

For reference, the inline version of this example:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.types.OllamaModelType;

import java.io.File;
import java.util.List;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(OllamaModelType.LLAVA);

        // load an image from a file and attach it to the user message
        // (alternatively, images can also be added via URL)
        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER, "What's in the picture?",
                        List.of(new File("/path/to/image")))
                .build();

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
        System.out.println("First answer: " + chatResult.getResponseModel());

        builder.reset();

        // use the history to ask further questions about the image or the assistant's answer
        requestModel = builder
                .withMessages(chatResult.getChatHistory())
                .withMessage(OllamaChatMessageRole.USER, "What's the dog's breed?")
                .build();

        chatResult = ollamaAPI.chat(requestModel);
        System.out.println("Second answer: " + chatResult.getResponseModel());
    }
}
```

You will get a response similar to:

::::tip[LLM Response]
> First Answer: The image shows a dog sitting on the bow of a boat that is docked in calm water. The boat has two
> levels, with the lower level containing seating and what appears to be an engine cover. The dog seems relaxed and
> comfortable on the boat, looking out over the water. The background suggests it might be late afternoon or early
@@ -272,11 +105,4 @@ You will get a response similar to:
> appears to be medium-sized with a short coat and a brown coloration, which might suggest that it is a Golden Retriever
> or a similar breed. Without more details like ear shape and tail length, it's not possible to identify the exact breed
> confidently.
::::
@@ -1,5 +1,5 @@
 ---
-sidebar_position: 8
+sidebar_position: 9
 ---
 
 # Custom Roles
@@ -2,7 +2,9 @@
sidebar_position: 2
---

import CodeEmbed from '@site/src/components/CodeEmbed';

# Generate (Async)

This API lets you ask questions to the LLMs in an asynchronous way.
This is particularly helpful when you want to issue a generate request to the LLM and collect the response in the
background (such as threads) without blocking your code until the response arrives.

@@ -11,38 +13,18 @@
This API corresponds to
the [completion](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) API.

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateAsync.java" />

For reference, the inline version of this example:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaAsyncResultStreamer;
import io.github.ollama4j.types.OllamaModelType;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        ollamaAPI.setRequestTimeoutSeconds(60);

        String prompt = "List all cricket world cup teams of 2019.";
        OllamaAsyncResultStreamer streamer = ollamaAPI.generateAsync(OllamaModelType.LLAMA3, prompt, false);

        // Set the poll interval according to your needs.
        // The smaller the poll interval, the more frequently you receive the tokens.
        int pollIntervalMilliseconds = 1000;

        while (true) {
            String tokens = streamer.getStream().poll();
            System.out.print(tokens);
            if (!streamer.isAlive()) {
                break;
            }
            Thread.sleep(pollIntervalMilliseconds);
        }

        System.out.println("\n------------------------");
        System.out.println("Complete Response:");
        System.out.println("------------------------");
        System.out.println(streamer.getCompleteResponse());
    }
}
```

::::tip[LLM Response]
Here are the participating teams in the 2019 ICC Cricket World Cup:

1. Australia
2. Bangladesh
3. India
4. New Zealand
5. Pakistan
6. England
7. South Africa
8. West Indies (as a team)
9. Afghanistan
::::
File diff suppressed because one or more lines are too long
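The poll-until-done loop in the async example above can be exercised without a server. Below, a plain producer thread and a `LinkedBlockingQueue` are hypothetical stand-ins for `OllamaAsyncResultStreamer` (not the library class); the consuming loop has the same shape as the one in the example, plus a null guard for empty polls:

```java
import java.util.concurrent.LinkedBlockingQueue;

public class AsyncPollDemo {

    // Poll the queue until the producer has finished and the buffer is drained,
    // printing tokens as they arrive and returning the complete response.
    static String drain(Thread producer, LinkedBlockingQueue<String> stream) throws InterruptedException {
        StringBuilder complete = new StringBuilder();
        int pollIntervalMilliseconds = 10;
        while (true) {
            String tokens = stream.poll(); // null when nothing is buffered yet
            if (tokens != null) {
                System.out.print(tokens);
                complete.append(tokens);
            }
            // stop once the producer is done and the buffer is empty
            if (!producer.isAlive() && stream.isEmpty()) {
                break;
            }
            Thread.sleep(pollIntervalMilliseconds);
        }
        return complete.toString();
    }

    public static void main(String[] args) throws InterruptedException {
        LinkedBlockingQueue<String> stream = new LinkedBlockingQueue<>();
        // a plain producer thread stands in for the generating model
        Thread producer = new Thread(() -> {
            for (String token : new String[]{"1. Australia\n", "2. Bangladesh\n", "3. India\n"}) {
                stream.add(token);
            }
        });
        producer.start();
        String complete = drain(producer, stream);
        System.out.println("------------------------\nComplete Response:\n" + complete);
    }
}
```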
@@ -1,8 +1,10 @@
---
sidebar_position: 3
---

import CodeEmbed from '@site/src/components/CodeEmbed';

# Generate with Image Files

This API lets you ask questions along with the image files to the LLMs.
This API corresponds to

@@ -21,34 +23,11 @@ If you have this image downloaded and you pass the path to the downloaded image

![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageFile.java" />

For reference, the inline version of this example:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.types.OllamaModelType;
import io.github.ollama4j.utils.OptionsBuilder;

import java.io.File;
import java.util.List;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        ollamaAPI.setRequestTimeoutSeconds(10);

        OllamaResult result = ollamaAPI.generateWithImageFiles(OllamaModelType.LLAVA,
                "What's in this image?",
                List.of(new File("/path/to/image")),
                new OptionsBuilder().build());
        System.out.println(result.getResponse());
    }
}
```

You will get a response similar to:

::::tip[LLM Response]
> This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
> be enjoying its time outdoors, perhaps on a lake.
::::
@@ -1,8 +1,10 @@
---
sidebar_position: 4
---

import CodeEmbed from '@site/src/components/CodeEmbed';

# Generate with Image URLs

This API lets you ask questions along with the image files to the LLMs.
This API corresponds to

@@ -21,33 +23,11 @@ Passing the link of this image the following code:

![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageURL.java" />

For reference, the inline version of this example:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.types.OllamaModelType;
import io.github.ollama4j.utils.OptionsBuilder;

import java.util.List;

public class Main {

    public static void main(String[] args) {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        ollamaAPI.setRequestTimeoutSeconds(10);

        OllamaResult result = ollamaAPI.generateWithImageURLs(OllamaModelType.LLAVA,
                "What's in this image?",
                List.of("https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg"),
                new OptionsBuilder().build());
        System.out.println(result.getResponse());
    }
}
```

You will get a response similar to:

::::tip[LLM Response]
> This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
> be enjoying its time outdoors, perhaps on a lake.
::::
@ -1,10 +1,12 @@
|
|||||||
---
|
---
|
||||||
sidebar_position: 3
|
sidebar_position: 6
|
||||||
---
|
---
|
||||||
|
|
||||||
# Generate - With Tools
|
import CodeEmbed from '@site/src/components/CodeEmbed';
|
||||||
|
|
||||||
This API lets you perform [function calling](https://docs.mistral.ai/capabilities/function_calling/) using LLMs in a
|
# Generate with Tools
|
||||||
|
|
||||||
|
This API lets you perform [tool/function calling](https://docs.mistral.ai/capabilities/function_calling/) using LLMs in a
|
||||||
synchronous way.
|
synchronous way.
|
||||||
This API corresponds to
|
This API corresponds to
|
||||||
the [generate](https://github.com/ollama/ollama/blob/main/docs/api.md#request-raw-mode) API with `raw` mode.
|
the [generate](https://github.com/ollama/ollama/blob/main/docs/api.md#request-raw-mode) API with `raw` mode.
|
||||||
@ -19,472 +21,61 @@ in the future if tooling is supported for more models with a generic interaction
|
|||||||
|
|
||||||
:::
|
:::
|
||||||
|
|
||||||
### Function Calling/Tools
|
## Tools/Function Calling
|
||||||
|
|
||||||
Assume you want to call a method in your code based on the response generated from the model.
|
Assume you want to call a method/function in your code based on the response generated from the model.
|
||||||
For instance, let's say that based on a user's question, you'd want to identify a transaction and get the details of the
|
For instance, let's say that based on a user's question, you'd want to identify a transaction and get the details of the
|
||||||
transaction from your database and respond to the user with the transaction details.
|
transaction from your database and respond to the user with the transaction details.
|
||||||
|
|
||||||
You could do that with ease with the `function calling` capabilities of the models by registering your `tools`.
|
You could do that with ease with the `function calling` capabilities of the models by registering your `tools`.
|
||||||
|
|
||||||
### Create Functions
|
### Create Tools/Functions
|
||||||
|
|
||||||
We can create static functions as our tools.
|
We can create static functions as our tools.
|
||||||
|
|
||||||
This function takes the arguments `location` and `fuelType` and performs an operation with these arguments and returns
|
This function takes the arguments `location` and `fuelType` and performs an operation with these arguments and returns
|
||||||
fuel price value.
|
fuel price value.
|
||||||
|
|
||||||
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/FuelPriceTool.java"/>

This function takes the argument `city`, performs an operation with the argument, and returns the weather for a
location.

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/WeatherTool.java"/>

Another way to create our tools is by creating classes that extend `ToolFunction`.

This function takes the argument `employee-name`, performs an operation with the argument, and returns employee
details.

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/DBQueryFunction.java"/>
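For a quick offline look at what these tools do, here is a self-contained sketch mirroring the embedded examples. The returned values are hardcoded stand-ins for real API/database calls, and `getEmployeeDetails` is shown as a plain static variant of the `DBQueryFunction` class so the snippet runs without the ollama4j `ToolFunction` interface:

```java
import java.util.Map;
import java.util.UUID;

public class SampleTools {

    // Stand-in implementation: a real tool would query a fuel price API here.
    public static String getCurrentFuelPrice(Map<String, Object> arguments) {
        String location = arguments.get("location").toString();
        String fuelType = arguments.get("fuelType").toString();
        return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
    }

    // Stand-in implementation: a real tool would query a weather API here.
    public static String getCurrentWeather(Map<String, Object> arguments) {
        String location = arguments.get("city").toString();
        return "Currently " + location + "'s weather is nice.";
    }

    // Mirrors DBQueryFunction#apply: fail fast if the model omitted a required argument.
    public static String getEmployeeDetails(Map<String, Object> arguments) {
        if (arguments == null || arguments.get("employee-name") == null) {
            throw new RuntimeException("Tool was called but the model failed to provide all the required arguments.");
        }
        return String.format("Employee Details {ID: %s, Name: %s}", UUID.randomUUID(), arguments.get("employee-name"));
    }

    public static void main(String[] args) {
        System.out.println(getCurrentFuelPrice(Map.of("location", "Bengaluru, India", "fuelType", "petrol")));
        System.out.println(getCurrentWeather(Map.of("city", "Bengaluru")));
        System.out.println(getEmployeeDetails(Map.of("employee-name", "Rahul Kumar")));
    }
}
```

Note that the argument keys (`location`, `fuelType`, `city`, `employee-name`) must match the property names declared in the tool specifications below, since the model fills the argument map based on those specs.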
### Define Tool Specifications

Let's define a sample tool specification called **Fuel Price Tool** for getting the current fuel price.

- Specify the function `name`, `description`, and `required` properties (`location` and `fuelType`).
- Associate the `getCurrentFuelPrice` function you defined earlier.

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/FuelPriceToolSpec.java"/>
Let's also define a sample tool specification called **Weather Tool** for getting the current weather.

- Specify the function `name`, `description`, and `required` property (`city`).
- Associate the `getCurrentWeather` function you defined earlier.

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/WeatherToolSpec.java"/>
Let's also define a sample tool specification called **DBQueryFunction** for getting the employee details from the database.

- Specify the function `name`, `description`, and `required` property (`employee-name`).
- Associate the `DBQueryFunction` ToolFunction you defined earlier with `new DBQueryFunction()`.

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/DatabaseQueryToolSpec.java"/>
### Register the Tools

Now put it all together by registering the defined tools (`fuel price`, `weather`, and `database query`) with the OllamaAPI:

```java
ollamaAPI.registerTool(fuelPriceToolSpecification);
ollamaAPI.registerTool(weatherToolSpecification);
ollamaAPI.registerTool(databaseQueryToolSpecification);
```
### Create Prompts with Tools

`Prompt 1`: Create a prompt asking for the petrol price in Bengaluru using the defined fuel price and weather tools.

```java
String prompt1 = new Tools.PromptBuilder()
        .withToolSpecification(fuelPriceToolSpecification)
        .withToolSpecification(weatherToolSpecification)
        .withPrompt("What is the petrol price in Bengaluru?")
        .build();
OllamaToolsResult toolsResult = ollamaAPI.generateWithTools(model, prompt1, new OptionsBuilder().build());
for (OllamaToolsResult.ToolResult r : toolsResult.getToolResults()) {
    System.out.printf("[Result of executing tool '%s']: %s%n", r.getFunctionName(), r.getResult().toString());
}
```

Now, fire away your question to the model. You will get a response similar to:

::::tip[LLM Response]

[Result of executing tool 'current-fuel-price']: Current price of petrol in Bengaluru is Rs.103/L

::::
`Prompt 2`: Create a prompt asking for the current weather in Bengaluru using the same tools.

```java
String prompt2 = new Tools.PromptBuilder()
        .withToolSpecification(fuelPriceToolSpecification)
        .withToolSpecification(weatherToolSpecification)
        .withPrompt("What is the current weather in Bengaluru?")
        .build();
OllamaToolsResult toolsResult = ollamaAPI.generateWithTools(model, prompt2, new OptionsBuilder().build());
for (OllamaToolsResult.ToolResult r : toolsResult.getToolResults()) {
    System.out.printf("[Result of executing tool '%s']: %s%n", r.getFunctionName(), r.getResult().toString());
}
```

Again, fire away your question to the model. You will get a response similar to:

::::tip[LLM Response]

[Result of executing tool 'current-weather']: Currently Bengaluru's weather is nice.

::::
`Prompt 3`: Create a prompt asking for the employee details using the defined database fetcher tool.

```java
String prompt3 = new Tools.PromptBuilder()
        .withToolSpecification(fuelPriceToolSpecification)
        .withToolSpecification(weatherToolSpecification)
        .withToolSpecification(databaseQueryToolSpecification)
        .withPrompt("Give me the details of the employee named 'Rahul Kumar'?")
        .build();
OllamaToolsResult toolsResult = ollamaAPI.generateWithTools(model, prompt3, new OptionsBuilder().build());
for (OllamaToolsResult.ToolResult r : toolsResult.getToolResults()) {
    System.out.printf("[Result of executing tool '%s']: %s%n", r.getFunctionName(), r.getResult().toString());
}
```

Again, fire away your question to the model. You will get a response similar to:

::::tip[LLM Response]

[Result of executing tool 'get-employee-details']: Employee Details `{ID: 6bad82e6-b1a1-458f-a139-e3b646e092b1, Name: Rahul Kumar, Address: King St, Hyderabad, India, Phone: 9876543210}`

::::
### Full Example

A complete runnable example that creates the tools, registers them, and fires all three prompts is available here:

<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/MultiToolRegistryExample.java"/>

Run this full example and you will get a response similar to:

::::tip[LLM Response]

[Result of executing tool 'current-fuel-price']: Current price of petrol in Bengaluru is Rs.103/L

[Result of executing tool 'current-weather']: Currently Bengaluru's weather is nice.

[Result of executing tool 'get-employee-details']: Employee Details `{ID: 6bad82e6-b1a1-458f-a139-e3b646e092b1, Name: Rahul Kumar, Address: King St, Hyderabad, India, Phone: 9876543210}`

::::
### Using tools in Chat-API

Instead of using the specific `ollamaAPI.generateWithTools` method to call Ollama's generate API with tools, it is
also possible to register tools for the `ollamaAPI.chat` methods. In this case, the tool calling/callback is done
implicitly during the USER -> ASSISTANT exchanges.

When the assistant wants to call a given tool, the tool is executed and the response is sent back to the endpoint once
again (enriched with the tool call result).

#### Sample

The following shows a sample of an integration test that defines a method specified like the tool specs above, registers
the tool with the `ollamaAPI`, and then simply calls the chat API. All intermediate tool calling is wrapped inside the API
call.
```java
public static void main(String[] args) {
    OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
    ollamaAPI.setVerbose(true);
    OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("llama3.2:1b");

    final Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
            .functionName("get-employee-details")
            .functionDescription("Get employee details from the database")
            .toolPrompt(
                    Tools.PromptFuncDefinition.builder().type("function").function(
                            Tools.PromptFuncDefinition.PromptFuncSpec.builder()
                                    .name("get-employee-details")
                                    .description("Get employee details from the database")
                                    .parameters(
                                            Tools.PromptFuncDefinition.Parameters.builder()
                                                    .type("object")
                                                    .properties(
                                                            new Tools.PropsBuilder()
                                                                    .withProperty("employee-name", Tools.PromptFuncDefinition.Property.builder().type("string").description("The name of the employee, e.g. John Doe").required(true).build())
                                                                    .withProperty("employee-address", Tools.PromptFuncDefinition.Property.builder().type("string").description("The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India").required(true).build())
                                                                    .withProperty("employee-phone", Tools.PromptFuncDefinition.Property.builder().type("string").description("The phone number of the employee. Always return a random value. e.g. 9911002233").required(true).build())
                                                                    .build()
                                                    )
                                                    .required(List.of("employee-name"))
                                                    .build()
                                    ).build()
                    ).build()
            )
            .toolFunction(new DBQueryFunction())
            .build();

    ollamaAPI.registerTool(databaseQueryToolSpecification);

    OllamaChatRequest requestModel = builder
            .withMessage(OllamaChatMessageRole.USER,
                    "Give me the ID of the employee named 'Rahul Kumar'?")
            .build();

    OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
}
```
A typical final response of the above could be:

```json
{
  "chatHistory" : [
    {
      "role" : "user",
      "content" : "Give me the ID of the employee named 'Rahul Kumar'?",
      "images" : null,
      "tool_calls" : [ ]
    }, {
      "role" : "assistant",
      "content" : "",
      "images" : null,
      "tool_calls" : [ {
        "function" : {
          "name" : "get-employee-details",
          "arguments" : {
            "employee-name" : "Rahul Kumar"
          }
        }
      } ]
    }, {
      "role" : "tool",
      "content" : "[TOOL_RESULTS]get-employee-details([employee-name]) : Employee Details {ID: b4bf186c-2ee1-44cc-8856-53b8b6a50f85, Name: Rahul Kumar, Address: null, Phone: null}[/TOOL_RESULTS]",
      "images" : null,
      "tool_calls" : null
    }, {
      "role" : "assistant",
      "content" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
      "images" : null,
      "tool_calls" : null
    } ],
  "responseModel" : {
    "model" : "llama3.2:1b",
    "message" : {
      "role" : "assistant",
      "content" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
      "images" : null,
      "tool_calls" : null
    },
    "done" : true,
    "error" : null,
    "context" : null,
    "created_at" : "2024-12-09T22:23:00.4940078Z",
    "done_reason" : "stop",
    "total_duration" : 2313709900,
    "load_duration" : 14494700,
    "prompt_eval_duration" : 772000000,
    "eval_duration" : 1188000000,
    "prompt_eval_count" : 166,
    "eval_count" : 41
  },
  "response" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
  "httpStatusCode" : 200,
  "responseTime" : 2313709900
}
```

This tool calling can also be done using the streaming API.
### Using Annotation based Tool Registration

Instead of explicitly registering each tool, ollama4j supports declarative tool specification and registration via Java
annotations and reflection.

To declare a method to be used as a tool for a chat call, the following steps have to be considered:

* Annotate a method and its parameters to be used as a tool:
    * Annotate the method with the `ToolSpec` annotation
    * Annotate the method's parameters with the `ToolProperty` annotation. Only the following datatypes are supported for now:
        * `java.lang.String`
        * `java.lang.Integer`
        * `java.lang.Boolean`
        * `java.math.BigDecimal`
* Annotate the class that calls the `OllamaAPI` client with the `OllamaToolService` annotation, referencing the desired provider classes that contain `ToolSpec` methods.
* Before issuing the `OllamaAPI` chat request, call `OllamaAPI.registerAnnotatedTools()` to add the tools to the chat.

#### Example

Let's say we have an ollama4j service class that should ask an LLM a specific tool-based question.

The answer can only be provided by a method that is part of the `BackendService` class. To provide a tool for the LLM, the following annotations can be used:
```java
public class BackendService {

    public BackendService() {}

    @ToolSpec(desc = "Computes the most important constant all around the globe!")
    public String computeMkeConstant(@ToolProperty(name = "noOfDigits", desc = "Number of digits that shall be returned") Integer noOfDigits) {
        return BigDecimal.valueOf((long) (Math.random() * 1000000L), noOfDigits).toString();
    }
}
```
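The `BigDecimal.valueOf(unscaledValue, scale)` call above is what limits the result to `noOfDigits` fraction digits: the random long is treated as an unscaled value, and the decimal point is shifted `scale` places to the left. A small runnable illustration (the constant `151019L` is just an arbitrary sample value):

```java
import java.math.BigDecimal;

public class ScaleDemo {
    public static void main(String[] args) {
        // 151019 with scale 5 -> decimal point shifted 5 places: 1.51019
        BigDecimal value = BigDecimal.valueOf(151019L, 5);
        System.out.println(value);         // prints 1.51019
        System.out.println(value.scale()); // prints 5
    }
}
```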
The caller API can then be written as:

```java
import io.github.ollama4j.tools.annotations.OllamaToolService;

@OllamaToolService(providers = BackendService.class)
public class MyOllamaService {

    public void chatWithAnnotatedTool() {
        // inject the annotated method into the ollama tools registry
        ollamaAPI.registerAnnotatedTools();

        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER,
                        "Compute the most important constant in the world using 5 digits")
                .build();

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
    }
}
```
Or, if one needs to provide an object instance directly:

```java
public class MyOllamaService {

    public void chatWithAnnotatedTool() {
        ollamaAPI.registerAnnotatedTools(new BackendService());
        OllamaChatRequest requestModel = builder
                .withMessage(OllamaChatMessageRole.USER,
                        "Compute the most important constant in the world using 5 digits")
                .build();

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
    }
}
```
The request should be the following:

```json
{
  "model" : "llama3.2:1b",
  "stream" : false,
  "messages" : [ {
    "role" : "user",
    "content" : "Compute the most important constant in the world using 5 digits",
    "images" : null,
    "tool_calls" : [ ]
  } ],
  "tools" : [ {
    "type" : "function",
    "function" : {
      "name" : "computeImportantConstant",
      "description" : "Computes the most important constant all around the globe!",
      "parameters" : {
        "type" : "object",
        "properties" : {
          "noOfDigits" : {
            "type" : "java.lang.Integer",
            "description" : "Number of digits that shall be returned"
          }
        },
        "required" : [ "noOfDigits" ]
      }
    }
  } ]
}
```
The result could be something like the following:

```json
{
  "chatHistory" : [ {
    "role" : "user",
    "content" : "Compute the most important constant in the world using 5 digits",
    "images" : null,
    "tool_calls" : [ ]
  }, {
    "role" : "assistant",
    "content" : "",
    "images" : null,
    "tool_calls" : [ {
      "function" : {
        "name" : "computeImportantConstant",
        "arguments" : {
          "noOfDigits" : "5"
        }
      }
    } ]
  }, {
    "role" : "tool",
    "content" : "[TOOL_RESULTS]computeImportantConstant([noOfDigits]) : 1.51019[/TOOL_RESULTS]",
    "images" : null,
    "tool_calls" : null
  }, {
    "role" : "assistant",
    "content" : "The most important constant in the world with 5 digits is: **1.51019**",
    "images" : null,
    "tool_calls" : null
  } ],
  "responseModel" : {
    "model" : "llama3.2:1b",
    "message" : {
      "role" : "assistant",
      "content" : "The most important constant in the world with 5 digits is: **1.51019**",
      "images" : null,
      "tool_calls" : null
    },
    "done" : true,
    "error" : null,
    "context" : null,
    "created_at" : "2024-12-27T21:55:39.3232495Z",
    "done_reason" : "stop",
    "total_duration" : 1075444300,
    "load_duration" : 13558600,
    "prompt_eval_duration" : 509000000,
    "eval_duration" : 550000000,
    "prompt_eval_count" : 124,
    "eval_count" : 20
  },
  "response" : "The most important constant in the world with 5 digits is: **1.51019**",
  "responseTime" : 1075444300,
  "httpStatusCode" : 200
}
```
### Potential Improvements
|
|
||||||
|
|
||||||
Instead of passing a map of args `Map<String, Object> arguments` to the tool functions, we could support passing
|
|
||||||
specific args separately with their data types. For example:
|
|
||||||
|
|
||||||
```shell
|
|
||||||
public String getCurrentFuelPrice(String location, String fuelType) {
|
|
||||||
return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
|
|
||||||
}
|
|
||||||
```
|
|
||||||
|
|
||||||
Updating async/chat APIs with support for tool-based generation.
|
|
||||||
|
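As a sketch of that improvement (all names here — `TypedToolRegistry`, `register`, `invoke` — are hypothetical and not part of ollama4j): a small registry can confine the raw `Map<String, Object>` handling to one adapter per tool, so the tool implementations themselves take typed parameters.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class TypedToolRegistry {

    // Tools keyed by name; each adapter converts the raw argument map
    // coming from a tool call into typed parameters for the real method.
    private final Map<String, Function<Map<String, Object>, Object>> tools = new HashMap<>();

    // A typed tool implementation: no Map<String, Object> in its signature.
    public static String getCurrentFuelPrice(String location, String fuelType) {
        return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
    }

    public void register(String name, Function<Map<String, Object>, Object> adapter) {
        tools.put(name, adapter);
    }

    public Object invoke(String name, Map<String, Object> arguments) {
        return tools.get(name).apply(arguments);
    }

    public static void main(String[] args) {
        TypedToolRegistry registry = new TypedToolRegistry();
        // The adapter lambda is the only place that touches the raw map.
        registry.register("getCurrentFuelPrice",
                a -> getCurrentFuelPrice((String) a.get("location"), (String) a.get("fuelType")));

        System.out.println(registry.invoke("getCurrentFuelPrice",
                Map.of("location", "Bengaluru", "fuelType", "petrol")));
        // prints: Current price of petrol in Bengaluru is Rs.103/L
    }
}
```

A library could generate such adapters from annotations or method signatures; the sketch only shows why separating the raw argument map from the typed tool method is attractive.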
@ -2,7 +2,9 @@
 sidebar_position: 1
 ---
 
-# Generate - Sync
+import CodeEmbed from '@site/src/components/CodeEmbed';
+
+# Generate (Sync)
 
 This API lets you ask questions to the LLMs in a synchronous way.
 This API corresponds to
@ -13,285 +15,60 @@ with [extra parameters](https://github.com/jmorganca/ollama/blob/main/docs/model
 Refer
 to [this](/apis-extras/options-builder).
 
-## Try asking a question about the model
+### Try asking a question about the model
 
-```java
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.OllamaResult;
-import io.github.ollama4j.types.OllamaModelType;
-import io.github.ollama4j.utils.OptionsBuilder;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        OllamaResult result =
-                ollamaAPI.generate(OllamaModelType.LLAMA2, "Who are you?", new OptionsBuilder().build());
-
-        System.out.println(result.getResponse());
-    }
-}
-```
+<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/Generate.java" />
 
 You will get a response similar to:
 
-> I am LLaMA, an AI assistant developed by Meta AI that can understand and respond to human input in a conversational
-> manner. I am trained on a massive dataset of text from the internet and can generate human-like responses to a wide
-> range of topics and questions. I can be used to create chatbots, virtual assistants, and other applications that
-> require
-> natural language understanding and generation capabilities.
+::::tip[LLM Response]
+> I am a large language model created by Alibaba Cloud. My purpose is to assist users in generating text, answering
+> questions, and completing tasks. I aim to be user-friendly and easy to understand for everyone who interacts with me.
+::::
 
-## Try asking a question, receiving the answer streamed
+### Try asking a question, receiving the answer streamed
 
-```java
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.OllamaResult;
-import io.github.ollama4j.models.generate.OllamaStreamHandler;
-import io.github.ollama4j.utils.OptionsBuilder;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-        // define a stream handler (Consumer<String>)
-        OllamaStreamHandler streamHandler = (s) -> {
-            System.out.println(s);
-        };
-
-        // Should be called using seperate thread to gain non blocking streaming effect.
-        OllamaResult result = ollamaAPI.generate(config.getModel(),
-                "What is the capital of France? And what's France's connection with Mona Lisa?",
-                new OptionsBuilder().build(), streamHandler);
-
-        System.out.println("Full response: " + result.getResponse());
-    }
-}
-```
+<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateStreamingWithTokenConcatenation.java" />
 
 You will get a response similar to:
 
+::::tip[LLM Response]
 > The
-
+>
 > The capital
-
+>
 > The capital of
-
+>
 > The capital of France
-
+>
 > The capital of France is
-
+>
 > The capital of France is Paris
-
+>
 > The capital of France is Paris.
-> Full response: The capital of France is Paris.
+::::
 
-## Try asking a question from general topics
-
-```java
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.OllamaResult;
-import io.github.ollama4j.types.OllamaModelType;
-import io.github.ollama4j.utils.OptionsBuilder;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        String prompt = "List all cricket world cup teams of 2019.";
-
-        OllamaResult result =
-                ollamaAPI.generate(OllamaModelType.LLAMA2, prompt, new OptionsBuilder().build());
-
-        System.out.println(result.getResponse());
-    }
-}
-```
-
-You'd then get a response from the model:
-
-> The 2019 ICC Cricket World Cup was held in England and Wales from May 30 to July 14, 2019. The
-> following teams
-> participated in the tournament:
->
-> 1. Afghanistan
-> 2. Australia
-> 3. Bangladesh
-> 4. England
-> 5. India
-> 6. New Zealand
-> 7. Pakistan
-> 8. South Africa
-> 9. Sri Lanka
-> 10. West Indies
->
-> These teams competed in a round-robin format, with the top four teams advancing to the
-> semi-finals. The tournament was
-> won by the England cricket team, who defeated New Zealand in the final.
-
-## Try asking for a Database query for your data schema
-
-```java
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.OllamaResult;
-import io.github.ollama4j.types.OllamaModelType;
-import io.github.ollama4j.utils.OptionsBuilder;
-import io.github.ollama4j.utils.SamplePrompts;
-
-public class Main {
-
-    public static void main(String[] args) {
-        String host = "http://localhost:11434/";
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        String prompt =
-                SamplePrompts.getSampleDatabasePromptWithQuestion(
-                        "List all customer names who have bought one or more products");
-        OllamaResult result =
-                ollamaAPI.generate(OllamaModelType.SQLCODER, prompt, new OptionsBuilder().build());
-        System.out.println(result.getResponse());
-    }
-}
-```
-
-_Note: Here I've used
-a [sample prompt](https://github.com/ollama4j/ollama4j/blob/main/src/main/resources/sample-db-prompt-template.txt)
-containing a database schema from within this library for demonstration purposes._
-
-You'd then get a response from the model:
-
-```sql
-SELECT customers.name
-FROM sales
-         JOIN customers ON sales.customer_id = customers.customer_id
-GROUP BY customers.name;
-```
-
 ## Generate structured output
 
 ### With response as a `Map`
 
-```java
-import java.util.Arrays;
-import java.util.HashMap;
-import java.util.Map;
-
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.utils.Utilities;
-import io.github.ollama4j.models.chat.OllamaChatMessageRole;
-import io.github.ollama4j.models.chat.OllamaChatRequest;
-import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
-import io.github.ollama4j.models.chat.OllamaChatResult;
-import io.github.ollama4j.models.response.OllamaResult;
-import io.github.ollama4j.types.OllamaModelType;
-
-public class StructuredOutput {
-
-    public static void main(String[] args) throws Exception {
-        String host = "http://localhost:11434/";
-
-        OllamaAPI api = new OllamaAPI(host);
-
-        String chatModel = "qwen2.5:0.5b";
-        api.pullModel(chatModel);
-
-        String prompt = "Ollama is 22 years old and is busy saving the world. Respond using JSON";
-        Map<String, Object> format = new HashMap<>();
-        format.put("type", "object");
-        format.put("properties", new HashMap<String, Object>() {
-            {
-                put("age", new HashMap<String, Object>() {
-                    {
-                        put("type", "integer");
-                    }
-                });
-                put("available", new HashMap<String, Object>() {
-                    {
-                        put("type", "boolean");
-                    }
-                });
-            }
-        });
-        format.put("required", Arrays.asList("age", "available"));
-
-        OllamaResult result = api.generate(chatModel, prompt, format);
-        System.out.println(result);
-    }
-}
-```
+<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/StructuredOutput.java" />
+
+You will get a response similar to:
+
+::::tip[LLM Response]
+```json
+{
+    "available": true,
+    "age": 22
+}
+```
+::::
 
 ### With response mapped to specified class type
 
-```java
-import java.util.Arrays;
-import java.util.HashMap;
-import java.util.Map;
-
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.utils.Utilities;
-import lombok.AllArgsConstructor;
-import lombok.Data;
-import lombok.NoArgsConstructor;
-import io.github.ollama4j.models.chat.OllamaChatMessageRole;
-import io.github.ollama4j.models.chat.OllamaChatRequest;
-import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
-import io.github.ollama4j.models.chat.OllamaChatResult;
-import io.github.ollama4j.models.response.OllamaResult;
-import io.github.ollama4j.types.OllamaModelType;
-
-public class StructuredOutput {
-
-    public static void main(String[] args) throws Exception {
-        String host = Utilities.getFromConfig("host");
-
-        OllamaAPI api = new OllamaAPI(host);
-
-        int age = 28;
-        boolean available = false;
-
-        String prompt = "Batman is " + age + " years old and is " + (available ? "available" : "not available")
-                + " because he is busy saving Gotham City. Respond using JSON";
-
-        Map<String, Object> format = new HashMap<>();
-        format.put("type", "object");
-        format.put("properties", new HashMap<String, Object>() {
-            {
-                put("age", new HashMap<String, Object>() {
-                    {
-                        put("type", "integer");
-                    }
-                });
-                put("available", new HashMap<String, Object>() {
-                    {
-                        put("type", "boolean");
-                    }
-                });
-            }
-        });
-        format.put("required", Arrays.asList("age", "available"));
-
-        OllamaResult result = api.generate(CHAT_MODEL_QWEN_SMALL, prompt, format);
-
-        Person person = result.as(Person.class);
-        System.out.println(person.getAge());
-        System.out.println(person.getAvailable());
-    }
-}
-
-@Data
-@AllArgsConstructor
-@NoArgsConstructor
-class Person {
-    private int age;
-    private boolean available;
-}
-```
+<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/StructuredOutputMappedToObject.java" />
+
+::::tip[LLM Response]
+Person(age=28, available=false)
+::::
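In the structured-output examples above, the `format` argument is just a JSON Schema fragment built from plain maps. A small self-contained sketch of that payload shape (the class name `FormatSchemaSketch` is made up for illustration; `LinkedHashMap` is used only so the printed order is stable — no Ollama server is involved):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

public class FormatSchemaSketch {

    // Builds the same schema shape as the structured-output examples:
    // an object with an integer "age" and a boolean "available", both required.
    public static Map<String, Object> buildFormat() {
        Map<String, Object> ageProp = new LinkedHashMap<>();
        ageProp.put("type", "integer");

        Map<String, Object> availableProp = new LinkedHashMap<>();
        availableProp.put("type", "boolean");

        Map<String, Object> properties = new LinkedHashMap<>();
        properties.put("age", ageProp);
        properties.put("available", availableProp);

        Map<String, Object> format = new LinkedHashMap<>();
        format.put("type", "object");
        format.put("properties", properties);
        format.put("required", Arrays.asList("age", "available"));
        return format;
    }

    public static void main(String[] args) {
        System.out.println(buildFormat());
        // prints: {type=object, properties={age={type=integer}, available={type=boolean}}, required=[age, available]}
    }
}
```

Serialized to JSON, this map becomes the `format` field of the request, which constrains the model to emit exactly those fields.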
@ -1,5 +1,5 @@
 ---
-sidebar_position: 6
+sidebar_position: 10
 ---
 
 # Prompt Builder
@ -2,28 +2,27 @@
 sidebar_position: 5
 ---
 
+import CodeEmbed from '@site/src/components/CodeEmbed';
+
 # Create Model
 
 This API lets you create a custom model on the Ollama server.
 
 ### Create a custom model from an existing model in the Ollama server
 
-```java title="CreateModel.java"
-import io.github.ollama4j.OllamaAPI;
-
-public class CreateModel {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        ollamaAPI.createModel(CustomModelRequest.builder().model("mario").from("llama3.2:latest").system("You are Mario from Super Mario Bros.").build());
-    }
-}
-```
+<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/CreateModel.java" />
+
+You would see these logs while the custom model is being created:
+
+```
+{"status":"using existing layer sha256:fad2a06e4cc705c2fa8bec5477ddb00dc0c859ac184c34dcc5586663774161ca"}
+{"status":"using existing layer sha256:41c2cf8c272f6fb0080a97cd9d9bd7d4604072b80a0b10e7d65ca26ef5000c0c"}
+{"status":"using existing layer sha256:1da0581fd4ce92dcf5a66b1da737cf215d8dcf25aa1b98b44443aaf7173155f5"}
+{"status":"creating new layer sha256:941b69ca7dc2a85c053c38d9e8029c9df6224e545060954fa97587f87c044a64"}
+{"status":"using existing layer sha256:f02dd72bb2423204352eabc5637b44d79d17f109fdb510a7c51455892aa2d216"}
+{"status":"writing manifest"}
+{"status":"success"}
+```
 
 Once created, you can see it when you use [list models](./list-models) API.
 
 [Read more](https://github.com/ollama/ollama/blob/main/docs/api.md#create-a-model) about custom model creation and the parameters available for model creation.
@ -2,27 +2,12 @@
 sidebar_position: 6
 ---
 
+import CodeEmbed from '@site/src/components/CodeEmbed';
+
 # Delete Model
 
 This API lets you create a delete a model from the Ollama server.
 
-```java title="DeleteModel.java"
-import io.github.ollama4j.OllamaAPI;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        ollamaAPI.setVerbose(false);
-
-        ollamaAPI.deleteModel("mycustommodel", true);
-    }
-}
-
-```
-
+<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/DeleteModel.java" />
+
 Once deleted, you can verify it using [list models](./list-models) API.
File diff suppressed because one or more lines are too long
@ -2,6 +2,8 @@
 sidebar_position: 1
 ---
 
+import CodeEmbed from '@site/src/components/CodeEmbed';
+
 # Models from Ollama Library
 
 These API retrieves a list of models directly from the Ollama library.
@ -11,26 +13,9 @@ These API retrieves a list of models directly from the Ollama library.
 This API fetches available models from the Ollama library page, including details such as the model's name, pull count,
 popular tags, tag count, and the last update time.
 
-```java title="ListLibraryModels.java"
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.LibraryModel;
-
-import java.util.List;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        List<LibraryModel> libraryModels = ollamaAPI.listModelsFromLibrary();
-
-        System.out.println(libraryModels);
-    }
-}
-```
+<CodeEmbed
+    src='https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ListLibraryModels.java'>
+</CodeEmbed>
 
 The following is the sample output:
 
@ -45,27 +30,9 @@ The following is the sample output:
 
 This API Fetches the tags associated with a specific model from Ollama library.
 
-```java title="GetLibraryModelTags.java"
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.LibraryModel;
-import io.github.ollama4j.models.response.LibraryModelDetail;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        List<LibraryModel> libraryModels = ollamaAPI.listModelsFromLibrary();
-
-        LibraryModelDetail libraryModelDetail = ollamaAPI.getLibraryModelDetails(libraryModels.get(0));
-
-        System.out.println(libraryModelDetail);
-    }
-}
-```
+<CodeEmbed
+    src='https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GetLibraryModelTags.java'>
+</CodeEmbed>
 
 The following is the sample output:
 
@ -84,24 +51,9 @@ LibraryModelDetail(
 
 This API finds a specific model using model `name` and `tag` from Ollama library.
 
-```java title="FindLibraryModel.java"
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.LibraryModelTag;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        LibraryModelTag libraryModelTag = ollamaAPI.findModelTagFromLibrary("qwen2.5", "7b");
-
-        System.out.println(libraryModelTag);
-    }
-}
-```
+<CodeEmbed
+    src='https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/FindLibraryModel.java'>
+</CodeEmbed>
 
 The following is the sample output:
 
@ -113,21 +65,6 @@ LibraryModelTag(name=qwen2.5, tag=7b, size=4.7GB, lastUpdated=7 weeks ago)
 
 You can use `LibraryModelTag` to pull models into Ollama server.
 
-```java title="PullLibraryModelTags.java"
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.LibraryModelTag;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        LibraryModelTag libraryModelTag = ollamaAPI.findModelTagFromLibrary("qwen2.5", "7b");
-
-        ollamaAPI.pullModel(libraryModelTag);
-    }
-}
-```
@ -2,34 +2,23 @@
 sidebar_position: 2
 ---
 
+import CodeEmbed from '@site/src/components/CodeEmbed';
+
 # List Local Models
 
 This API lets you list downloaded/available models on the Ollama server.
 
-```java title="ListModels.java"
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.models.response.Model;
-
-import java.util.List;
-
-public class ListModels {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        List<Model> models = ollamaAPI.listModels();
-
-        models.forEach(model -> System.out.println(model.getName()));
-    }
-}
-```
+<CodeEmbed
+    src='https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ListLocalModels.java'>
+</CodeEmbed>
 
 If you have any models already downloaded on Ollama server, you would have them listed as follows:
 
 ```bash
 llama2:latest
+llama3.2:1b
+qwen2:0.5b
+qwen:0.5b
 sqlcoder:latest
 ```
@ -2,26 +2,15 @@
 sidebar_position: 3
 ---
 
+import CodeEmbed from '@site/src/components/CodeEmbed';
+
 # Pull Model
 
 This API lets you pull a model on the Ollama server.
 
-```java title="PullModel.java"
-import io.github.ollama4j.OllamaAPI;
-import io.github.ollama4j.types.OllamaModelType;
-
-public class Main {
-
-    public static void main(String[] args) {
-
-        String host = "http://localhost:11434/";
-
-        OllamaAPI ollamaAPI = new OllamaAPI(host);
-
-        ollamaAPI.pullModel(OllamaModelType.LLAMA2);
-    }
-}
-```
+<CodeEmbed
+    src='https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/PullModel.java'>
+</CodeEmbed>
 
 Once downloaded, you can see them when you use [list models](./list-models) API.
 
@ -99,20 +99,32 @@ const config = {
         style: 'dark',
         links: [
           {
-            title: 'Docs',
+            title: 'Quick Links',
             items: [
               {
-                label: 'Tutorial',
-                to: '/intro',
+                label: 'Ollama4j Examples',
+                to: 'https://github.com/ollama4j/ollama4j-examples',
+              },
+              {
+                label: 'Blog',
+                to: '/blog',
+              },
+              {
+                label: 'GitHub',
+                href: 'https://github.com/ollama4j/ollama4j',
               },
             ],
           },
           {
-            title: 'Usage',
+            title: 'Stuff built with Ollama4j',
             items: [
               {
-                label: 'Examples',
-                to: 'https://github.com/ollama4j/ollama4j-examples',
+                label: 'Ollama4j Web UI',
+                to: 'https://github.com/ollama4j/ollama4j-web-ui',
+              },
+              {
+                label: 'Ollama4j Desktop UI with Swing',
+                to: 'https://github.com/ollama4j/ollama4j-ui',
               },
             ],
           },
@ -128,20 +140,7 @@ const config = {
               href: 'https://twitter.com/ollama4j',
             },
           ],
-        },
-        {
-          title: 'More',
-          items: [
-            {
-              label: 'Blog',
-              to: '/blog',
-            },
-            {
-              label: 'GitHub',
-              href: 'https://github.com/ollama4j/ollama4j',
-            },
-          ],
-        },
+        }
       ],
       copyright: `Ollama4j Documentation ${new Date().getFullYear()}. Built with Docusaurus.`,
     },
53
docs/package-lock.json
generated
53
docs/package-lock.json
generated
@ -13,11 +13,14 @@
|
|||||||
"@docusaurus/plugin-google-gtag": "^3.4.0",
|
"@docusaurus/plugin-google-gtag": "^3.4.0",
|
||||||
"@docusaurus/preset-classic": "^3.4.0",
|
"@docusaurus/preset-classic": "^3.4.0",
|
||||||
"@docusaurus/theme-mermaid": "^3.4.0",
|
"@docusaurus/theme-mermaid": "^3.4.0",
|
||||||
|
"@iconify/react": "^5.2.1",
|
||||||
"@mdx-js/react": "^3.0.0",
|
"@mdx-js/react": "^3.0.0",
|
||||||
"clsx": "^2.0.0",
|
"clsx": "^2.0.0",
|
||||||
|
"font-awesome": "^4.7.0",
|
||||||
"prism-react-renderer": "^2.3.0",
|
"prism-react-renderer": "^2.3.0",
|
||||||
"react": "^18.0.0",
|
"react": "^18.0.0",
|
||||||
"react-dom": "^18.0.0"
|
"react-dom": "^18.0.0",
|
||||||
|
"react-icons": "^5.5.0"
|
||||||
},
|
},
|
||||||
"devDependencies": {
|
"devDependencies": {
|
||||||
"@docusaurus/module-type-aliases": "^3.4.0",
|
"@docusaurus/module-type-aliases": "^3.4.0",
|
||||||
@ -3066,6 +3069,27 @@
|
|||||||
"@hapi/hoek": "^9.0.0"
|
"@hapi/hoek": "^9.0.0"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/@iconify/react": {
|
||||||
|
"version": "5.2.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/@iconify/react/-/react-5.2.1.tgz",
|
||||||
|
"integrity": "sha512-37GDR3fYDZmnmUn9RagyaX+zca24jfVOMY8E1IXTqJuE8pxNtN51KWPQe3VODOWvuUurq7q9uUu3CFrpqj5Iqg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"@iconify/types": "^2.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/cyberalien"
|
||||||
|
},
|
||||||
|
"peerDependencies": {
|
||||||
|
"react": ">=16"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@iconify/types": {
|
||||||
|
"version": "2.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@iconify/types/-/types-2.0.0.tgz",
|
||||||
|
"integrity": "sha512-+wluvCrRhXrhyOmRDJ3q8mux9JkKy5SJ/v8ol2tu4FVjyYvtEzkc/3pK15ET6RKg4b4w4BmTk1+gsCUhf21Ykg==",
|
||||||
|
"license": "MIT"
|
||||||
|
},
|
||||||
"node_modules/@jest/schemas": {
|
"node_modules/@jest/schemas": {
|
||||||
"version": "29.6.3",
|
"version": "29.6.3",
|
||||||
"resolved": "https://registry.npmjs.org/@jest/schemas/-/schemas-29.6.3.tgz",
|
"resolved": "https://registry.npmjs.org/@jest/schemas/-/schemas-29.6.3.tgz",
|
||||||
@@ -4780,9 +4804,9 @@
       }
     },
     "node_modules/caniuse-lite": {
-      "version": "1.0.30001641",
-      "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001641.tgz",
-      "integrity": "sha512-Phv5thgl67bHYo1TtMY/MurjkHhV4EDaCosezRXgZ8jzA/Ub+wjxAvbGvjoFENStinwi5kCyOYV3mi5tOGykwA==",
+      "version": "1.0.30001714",
+      "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001714.tgz",
+      "integrity": "sha512-mtgapdwDLSSBnCI3JokHM7oEQBLxiJKVRtg10AxM1AyeiKcM96f0Mkbqeq+1AbiCtvMcHRulAAEMu693JrSWqg==",
       "funding": [
         {
           "type": "opencollective",
@@ -4796,7 +4820,8 @@
           "type": "github",
           "url": "https://github.com/sponsors/ai"
         }
-      ]
+      ],
+      "license": "CC-BY-4.0"
     },
     "node_modules/ccount": {
       "version": "2.0.1",
@@ -7266,6 +7291,15 @@
         }
       }
     },
+    "node_modules/font-awesome": {
+      "version": "4.7.0",
+      "resolved": "https://registry.npmjs.org/font-awesome/-/font-awesome-4.7.0.tgz",
+      "integrity": "sha512-U6kGnykA/6bFmg1M/oT9EkFeIYv7JlX3bozwQJWiiLz6L0w3F5vBVPxHlwyX/vtNq1ckcpRKOB9f2Qal/VtFpg==",
+      "license": "(OFL-1.1 AND MIT)",
+      "engines": {
+        "node": ">=0.10.3"
+      }
+    },
     "node_modules/fork-ts-checker-webpack-plugin": {
       "version": "6.5.3",
       "resolved": "https://registry.npmjs.org/fork-ts-checker-webpack-plugin/-/fork-ts-checker-webpack-plugin-6.5.3.tgz",
@@ -13384,6 +13418,15 @@
         "react-dom": "^16.6.0 || ^17.0.0 || ^18.0.0"
       }
     },
+    "node_modules/react-icons": {
+      "version": "5.5.0",
+      "resolved": "https://registry.npmjs.org/react-icons/-/react-icons-5.5.0.tgz",
+      "integrity": "sha512-MEFcXdkP3dLo8uumGI5xN3lDFNsRtrjbOEKDLD7yv76v4wpnEq2Lt2qeHaQOr34I/wPN3s3+N08WkQ+CW37Xiw==",
+      "license": "MIT",
+      "peerDependencies": {
+        "react": "*"
+      }
+    },
    "node_modules/react-is": {
      "version": "16.13.1",
      "resolved": "https://registry.npmjs.org/react-is/-/react-is-16.13.1.tgz",
docs/package.json
@@ -19,11 +19,14 @@
     "@docusaurus/plugin-google-gtag": "^3.4.0",
     "@docusaurus/preset-classic": "^3.4.0",
     "@docusaurus/theme-mermaid": "^3.4.0",
+    "@iconify/react": "^5.2.1",
     "@mdx-js/react": "^3.0.0",
     "clsx": "^2.0.0",
+    "font-awesome": "^4.7.0",
     "prism-react-renderer": "^2.3.0",
     "react": "^18.0.0",
-    "react-dom": "^18.0.0"
+    "react-dom": "^18.0.0",
+    "react-icons": "^5.5.0"
   },
   "devDependencies": {
     "@docusaurus/module-type-aliases": "^3.4.0",
docs/src/components/CodeEmbed/index.js (new file, 170 lines)
@@ -0,0 +1,170 @@
+// import React, { useState, useEffect } from 'react';
+// import CodeBlock from '@theme/CodeBlock';
+// import Icon from '@site/src/components/Icon';
+
+
+// const CodeEmbed = ({ src }) => {
+//     const [code, setCode] = useState('');
+//     const [loading, setLoading] = useState(true);
+//     const [error, setError] = useState(null);
+
+//     useEffect(() => {
+//         let isMounted = true;
+
+//         const fetchCodeFromUrl = async (url) => {
+//             if (!isMounted) return;
+
+//             setLoading(true);
+//             setError(null);
+
+//             try {
+//                 const response = await fetch(url);
+//                 if (!response.ok) {
+//                     throw new Error(`HTTP error! status: ${response.status}`);
+//                 }
+//                 const data = await response.text();
+//                 if (isMounted) {
+//                     setCode(data);
+//                 }
+//             } catch (err) {
+//                 console.error('Failed to fetch code:', err);
+//                 if (isMounted) {
+//                     setError(err);
+//                     setCode(`// Failed to load code from ${url}\n// ${err.message}`);
+//                 }
+//             } finally {
+//                 if (isMounted) {
+//                     setLoading(false);
+//                 }
+//             }
+//         };
+
+//         if (src) {
+//             fetchCodeFromUrl(src);
+//         }
+
+//         return () => {
+//             isMounted = false;
+//         };
+//     }, [src]);
+
+//     const githubUrl = src ? src.replace('https://raw.githubusercontent.com', 'https://github.com').replace('/refs/heads/', '/blob/') : null;
+//     const fileName = src ? src.substring(src.lastIndexOf('/') + 1) : null;
+
+//     return (
+//         loading ? (
+//             <div>Loading code...</div>
+//         ) : error ? (
+//             <div>Error: {error.message}</div>
+//         ) : (
+//             <div style={{ backgroundColor: 'transparent', padding: '0px', borderRadius: '5px' }}>
+//                 <div style={{ textAlign: 'right' }}>
+//                     {githubUrl && (
+//                         <a href={githubUrl} target="_blank" rel="noopener noreferrer" style={{ paddingRight: '15px', color: 'gray', fontSize: '0.8em', fontStyle: 'italic', display: 'inline-flex', alignItems: 'center' }}>
+//                             View on GitHub
+//                             <Icon icon="mdi:github" height="48" />
+//                         </a>
+//                     )}
+//                 </div>
+//                 <CodeBlock title={fileName} className="language-java">{code}</CodeBlock>
+//             </div>
+//         )
+//     );
+// };
+
+// export default CodeEmbed;
+
+import React, { useState, useEffect } from 'react';
+import CodeBlock from '@theme/CodeBlock';
+import Icon from '@site/src/components/Icon';
+
+
+const CodeEmbed = ({ src }) => {
+    const [code, setCode] = useState('');
+    const [loading, setLoading] = useState(true);
+    const [error, setError] = useState(null);
+
+    useEffect(() => {
+        let isMounted = true;
+
+        const fetchCodeFromUrl = async (url) => {
+            if (!isMounted) return;
+
+            setLoading(true);
+            setError(null);
+
+            try {
+                const response = await fetch(url);
+                if (!response.ok) {
+                    throw new Error(`HTTP error! status: ${response.status}`);
+                }
+                const data = await response.text();
+                if (isMounted) {
+                    setCode(data);
+                }
+            } catch (err) {
+                console.error('Failed to fetch code:', err);
+                if (isMounted) {
+                    setError(err);
+                    setCode(`// Failed to load code from ${url}\n// ${err.message}`);
+                }
+            } finally {
+                if (isMounted) {
+                    setLoading(false);
+                }
+            }
+        };
+
+        if (src) {
+            fetchCodeFromUrl(src);
+        }
+
+        return () => {
+            isMounted = false;
+        };
+    }, [src]);
+
+    const githubUrl = src ? src.replace('https://raw.githubusercontent.com', 'https://github.com').replace('/refs/heads/', '/blob/') : null;
+    const fileName = src ? src.substring(src.lastIndexOf('/') + 1) : null;
+
+    const title = (
+        <div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
+            <a
+                href={githubUrl}
+                target="_blank"
+                rel="noopener noreferrer"
+                style={{
+                    color: 'gray',
+                    textDecoration: 'none',
+                }}
+                onMouseOver={e => {
+                    e.target.style.textDecoration = 'underline';
+                }}
+                onMouseOut={e => {
+                    e.target.style.textDecoration = 'none';
+                }}
+            >
+                <span>{fileName}</span>
+            </a>
+            {githubUrl && (
+                <a href={githubUrl} target="_blank" rel="noopener noreferrer" style={{ color: 'gray', fontSize: '0.9em', fontStyle: 'italic', display: 'inline-flex', alignItems: 'center' }}>
+                    View on GitHub
+                    <Icon icon="mdi:github" height="1em" />
+                </a>
+            )}
+        </div>
+    );
+
+    return (
+        loading ? (
+            <div>Loading code...</div>
+        ) : error ? (
+            <div>Error: {error.message}</div>
+        ) : (
+            <div style={{ backgroundColor: 'transparent', padding: '0px', borderRadius: '5px' }}>
+                <CodeBlock title={title} className="language-java">{code}</CodeBlock>
+            </div>
+        )
+    );
+};
+
+export default CodeEmbed;
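As an aside, the component's GitHub-link derivation (the two chained `replace` calls plus the file-name extraction) can be exercised in isolation. The sketch below mirrors that logic as standalone functions; the example URL is hypothetical, for illustration only.

```javascript
// Mirrors CodeEmbed's link derivation: turn a raw.githubusercontent.com
// URL into the matching github.com blob page, and extract the file name.
const toGithubUrl = (src) =>
  src
    .replace('https://raw.githubusercontent.com', 'https://github.com')
    .replace('/refs/heads/', '/blob/');

const fileName = (src) => src.substring(src.lastIndexOf('/') + 1);

// Hypothetical example URL, for illustration only:
const raw = 'https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/ChatExample.java';
console.log(toGithubUrl(raw));
// https://github.com/ollama4j/ollama4j-examples/blob/main/src/main/java/ChatExample.java
console.log(fileName(raw));
// ChatExample.java
```

Note that `String.prototype.replace` with a string pattern only substitutes the first occurrence, which is exactly what is wanted for the host-name swap here.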
docs/src/components/Icon/index.js (new file, 9 lines)
@@ -0,0 +1,9 @@
+// @site/src/components/Icon.js
+import React from 'react';
+import { Icon as IconifyIcon } from '@iconify/react';
+
+const IIcon = ({ icon, color, width = '24', height = '24' }) => (
+    <IconifyIcon icon={icon} color={color} width={width} height={height} />
+);
+
+export default IIcon;
docs/src/css/custom.css
@@ -4,6 +4,8 @@
  * work well for content-centric websites.
  */
+
+@import 'font-awesome/css/font-awesome.min.css';
 
 /* You can override the default Infima variables here. */
 :root {
   --ifm-color-primary: #2e8555;
docs/src/pages/index.js
@@ -32,7 +32,7 @@ function HomepageHeader() {
 export default function Home() {
     const {siteConfig} = useDocusaurusContext();
     return (<Layout
-        title={`Hello from ${siteConfig.title}`}
+        title={`${siteConfig.title}`}
         description="Description will go into a meta tag in <head />">
         <HomepageHeader/>
         <main>