Compare commits

...

39 Commits

Author SHA1 Message Date
amithkoujalgi
7ce89a3e89 Update Javadoc in Agent class to reflect changes in the interact method signature, now including an OllamaChatStreamObserver parameter for improved conversation handling. 2025-10-20 22:31:04 +05:30
Amith Koujalgi
fe43e87e1a Enhance agent documentation with detailed YAML configuration instructions and benefits. Update CodeEmbed component to support customizable language for syntax highlighting. Refactor Agent class to improve Javadoc comments and method signatures for better clarity and functionality. 2025-10-20 20:07:15 +05:30
Amith Koujalgi
d55d1c0fd9 Update Javadoc comments in Ollama and Agent classes to reflect correct method references for chat request construction and agent instantiation. 2025-10-19 14:34:21 +05:30
Amith Koujalgi
f0e5a9e172 Add documentation for the new Agent feature and update sidebar positions for Metrics and API categories. Adjust code examples in various API documentation to reflect correct paths and improve clarity. Enhance the Agent class with an equals and hash code method for better functionality. 2025-10-19 14:03:10 +05:30
amithkoujalgi
866c08f590 Remove SampleAgent class and associated YAML configuration file, streamlining the project by eliminating example implementations and their dependencies. 2025-10-19 11:22:16 +05:30
amithkoujalgi
bec634dd37 Refactor Agent class to include request timeout configuration and enhance interactive input display. Remove commented-out code for clarity. Update SampleAgent to utilize YAML configuration for agent instantiation. 2025-10-18 20:19:43 +05:30
amithkoujalgi
cbf65fef48 Add YAML support for Agent configuration and enhance Agent class to load tools from YAML. Introduce custom prompt functionality and refactor constructor to accept additional parameters. Update SampleAgent to demonstrate YAML loading. 2025-10-13 14:34:03 +05:30
amithkoujalgi
6df57c4a23 Enhance SampleAgent with improved Javadoc comments for tool specifications and descriptions, clarifying usage and parameters for weather, calculator, and hotel booking tools. 2025-10-11 00:00:42 +05:30
amithkoujalgi
da6d20d118 Add default target to Makefile, enhance Ollama class to use tools, and introduce Agent and SampleAgent classes for interactive tool usage. Update Javadoc generation message and improve error handling in endpoint callers. 2025-10-10 23:56:31 +05:30
Amith Koujalgi
64c629775a Refactor OllamaChatRequest and OllamaGenerateRequest to remove builder classes, implement builder-like methods directly in the request classes, and enhance request handling with additional options and image support. Update integration tests to reflect these changes.
2025-10-07 18:33:22 +05:30
Amith Koujalgi
46f2d62fed Merge pull request #208 from ollama4j/dependabot/npm_and_yarn/docs/react-dom-19.2.0
Bump react-dom from 19.1.1 to 19.2.0 in /docs
2025-10-06 23:28:31 +05:30
dependabot[bot]
b91943066e Bump react-dom from 19.1.1 to 19.2.0 in /docs
Bumps [react-dom](https://github.com/facebook/react/tree/HEAD/packages/react-dom) from 19.1.1 to 19.2.0.
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.2.0/packages/react-dom)

---
updated-dependencies:
- dependency-name: react-dom
  dependency-version: 19.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-06 17:57:36 +00:00
Amith Koujalgi
58d73637bb Merge pull request #207 from ollama4j/dependabot/npm_and_yarn/docs/react-19.2.0
Bump react from 19.1.1 to 19.2.0 in /docs
2025-10-06 23:26:19 +05:30
Amith Koujalgi
0ffaac65d4 Add new logo 2025-10-06 23:25:10 +05:30
Amith Koujalgi
4ce9c4c191 Add new logo 2025-10-06 23:23:42 +05:30
dependabot[bot]
4681b1986f Bump react from 19.1.1 to 19.2.0 in /docs
Bumps [react](https://github.com/facebook/react/tree/HEAD/packages/react) from 19.1.1 to 19.2.0.
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.2.0/packages/react)

---
updated-dependencies:
- dependency-name: react
  dependency-version: 19.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-06 00:36:12 +00:00
Amith Koujalgi
89d42fd469 Merge pull request #206 from ollama4j/docs
Update metrics.md
2025-10-01 01:24:33 +05:30
amithkoujalgi
8a903f695e Update metrics.md 2025-10-01 01:24:03 +05:30
amithkoujalgi
3a20af25f1 Merge branch 'main' of https://github.com/ollama4j/ollama4j 2025-10-01 01:16:50 +05:30
Amith Koujalgi
24046b6660 Merge pull request #205 from ollama4j/refactor
Add metrics documentation for Ollama4j library
2025-10-01 01:15:45 +05:30
Amith Koujalgi
42a0034728 Merge pull request #203 from ollama4j/dependabot/npm_and_yarn/docs/docusaurus/preset-classic-3.9.1
Bump @docusaurus/preset-classic from 3.9.0 to 3.9.1 in /docs
2025-09-29 23:34:32 +05:30
dependabot[bot]
d8d660be8d Bump @docusaurus/preset-classic from 3.9.0 to 3.9.1 in /docs
Bumps [@docusaurus/preset-classic](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-preset-classic) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/facebook/docusaurus/releases)
- [Changelog](https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/docusaurus/commits/v3.9.1/packages/docusaurus-preset-classic)

---
updated-dependencies:
- dependency-name: "@docusaurus/preset-classic"
  dependency-version: 3.9.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 18:02:34 +00:00
Amith Koujalgi
c5a2d583c7 Merge pull request #202 from ollama4j/dependabot/npm_and_yarn/docs/docusaurus/theme-mermaid-3.9.1
Bump @docusaurus/theme-mermaid from 3.9.0 to 3.9.1 in /docs
2025-09-29 23:30:55 +05:30
dependabot[bot]
cd656264cf Bump @docusaurus/theme-mermaid from 3.9.0 to 3.9.1 in /docs
Bumps [@docusaurus/theme-mermaid](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-theme-mermaid) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/facebook/docusaurus/releases)
- [Changelog](https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/docusaurus/commits/v3.9.1/packages/docusaurus-theme-mermaid)

---
updated-dependencies:
- dependency-name: "@docusaurus/theme-mermaid"
  dependency-version: 3.9.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 14:24:20 +00:00
Amith Koujalgi
272c8204c7 Merge pull request #201 from ollama4j/dependabot/maven/org.jacoco-jacoco-maven-plugin-0.8.13
Bump org.jacoco:jacoco-maven-plugin from 0.8.7 to 0.8.13
2025-09-29 19:05:43 +05:30
Amith Koujalgi
23c7321b63 Merge pull request #200 from ollama4j/dependabot/maven/org.apache.maven.plugins-maven-compiler-plugin-3.14.1
Bump org.apache.maven.plugins:maven-compiler-plugin from 3.14.0 to 3.14.1
2025-09-29 19:05:27 +05:30
Amith Koujalgi
e24a38f89f Merge pull request #199 from ollama4j/dependabot/npm_and_yarn/docs/docusaurus/plugin-google-gtag-3.9.1
Bump @docusaurus/plugin-google-gtag from 3.9.0 to 3.9.1 in /docs
2025-09-29 19:05:15 +05:30
Amith Koujalgi
5847cfc94c Merge pull request #198 from ollama4j/dependabot/maven/io.github.git-commit-id-git-commit-id-maven-plugin-9.0.2
Bump io.github.git-commit-id:git-commit-id-maven-plugin from 9.0.1 to 9.0.2
2025-09-29 19:05:02 +05:30
Amith Koujalgi
05d5958307 Merge pull request #197 from ollama4j/dependabot/maven/org.mockito-mockito-core-5.20.0
Bump org.mockito:mockito-core from 4.1.0 to 5.20.0
2025-09-29 19:04:48 +05:30
Amith Koujalgi
ffa81cb7df Merge pull request #196 from ollama4j/dependabot/npm_and_yarn/docs/docusaurus/module-type-aliases-3.9.1
Bump @docusaurus/module-type-aliases from 3.9.0 to 3.9.1 in /docs
2025-09-29 19:04:22 +05:30
Amith Koujalgi
bd231e639d Merge pull request #195 from ollama4j/dependabot/maven/com.diffplug.spotless-spotless-maven-plugin-3.0.0
Bump com.diffplug.spotless:spotless-maven-plugin from 2.46.1 to 3.0.0
2025-09-29 19:04:01 +05:30
Amith Koujalgi
73a0a48eab Merge pull request #194 from ollama4j/refactor
Refactor OllamaAPI to Ollama class and update documentation
2025-09-29 14:12:34 +05:30
dependabot[bot]
b347faff83 Bump org.jacoco:jacoco-maven-plugin from 0.8.7 to 0.8.13
Bumps [org.jacoco:jacoco-maven-plugin](https://github.com/jacoco/jacoco) from 0.8.7 to 0.8.13.
- [Release notes](https://github.com/jacoco/jacoco/releases)
- [Commits](https://github.com/jacoco/jacoco/compare/v0.8.7...v0.8.13)

---
updated-dependencies:
- dependency-name: org.jacoco:jacoco-maven-plugin
  dependency-version: 0.8.13
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:39:08 +00:00
dependabot[bot]
16e14ddd1c Bump org.apache.maven.plugins:maven-compiler-plugin
Bumps [org.apache.maven.plugins:maven-compiler-plugin](https://github.com/apache/maven-compiler-plugin) from 3.14.0 to 3.14.1.
- [Release notes](https://github.com/apache/maven-compiler-plugin/releases)
- [Commits](https://github.com/apache/maven-compiler-plugin/compare/maven-compiler-plugin-3.14.0...maven-compiler-plugin-3.14.1)

---
updated-dependencies:
- dependency-name: org.apache.maven.plugins:maven-compiler-plugin
  dependency-version: 3.14.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:39:07 +00:00
dependabot[bot]
2117d42f60 Bump @docusaurus/plugin-google-gtag from 3.9.0 to 3.9.1 in /docs
Bumps [@docusaurus/plugin-google-gtag](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/facebook/docusaurus/releases)
- [Changelog](https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/docusaurus/commits/v3.9.1/packages/docusaurus-plugin-google-gtag)

---
updated-dependencies:
- dependency-name: "@docusaurus/plugin-google-gtag"
  dependency-version: 3.9.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:59 +00:00
dependabot[bot]
79b8dbaefd Bump io.github.git-commit-id:git-commit-id-maven-plugin
Bumps [io.github.git-commit-id:git-commit-id-maven-plugin](https://github.com/git-commit-id/git-commit-id-maven-plugin) from 9.0.1 to 9.0.2.
- [Release notes](https://github.com/git-commit-id/git-commit-id-maven-plugin/releases)
- [Commits](https://github.com/git-commit-id/git-commit-id-maven-plugin/compare/v9.0.1...v9.0.2)

---
updated-dependencies:
- dependency-name: io.github.git-commit-id:git-commit-id-maven-plugin
  dependency-version: 9.0.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:48 +00:00
dependabot[bot]
04f5c28052 Bump org.mockito:mockito-core from 4.1.0 to 5.20.0
Bumps [org.mockito:mockito-core](https://github.com/mockito/mockito) from 4.1.0 to 5.20.0.
- [Release notes](https://github.com/mockito/mockito/releases)
- [Commits](https://github.com/mockito/mockito/compare/v4.1.0...v5.20.0)

---
updated-dependencies:
- dependency-name: org.mockito:mockito-core
  dependency-version: 5.20.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:48 +00:00
dependabot[bot]
a4da036389 Bump @docusaurus/module-type-aliases from 3.9.0 to 3.9.1 in /docs
Bumps [@docusaurus/module-type-aliases](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/facebook/docusaurus/releases)
- [Changelog](https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/docusaurus/commits/v3.9.1/packages/docusaurus-module-type-aliases)

---
updated-dependencies:
- dependency-name: "@docusaurus/module-type-aliases"
  dependency-version: 3.9.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:34 +00:00
dependabot[bot]
a73cf015d6 Bump com.diffplug.spotless:spotless-maven-plugin from 2.46.1 to 3.0.0
Bumps [com.diffplug.spotless:spotless-maven-plugin](https://github.com/diffplug/spotless) from 2.46.1 to 3.0.0.
- [Release notes](https://github.com/diffplug/spotless/releases)
- [Changelog](https://github.com/diffplug/spotless/blob/main/CHANGES.md)
- [Commits](https://github.com/diffplug/spotless/compare/maven/2.46.1...lib/3.0.0)

---
updated-dependencies:
- dependency-name: com.diffplug.spotless:spotless-maven-plugin
  dependency-version: 3.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:19 +00:00
33 changed files with 1257 additions and 836 deletions


@@ -1,3 +1,7 @@
# Default target
.PHONY: all
all: dev build
dev:
@echo "Setting up dev environment..."
@command -v pre-commit >/dev/null 2>&1 || { echo "Error: pre-commit is not installed. Please install it first."; exit 1; }
@@ -43,7 +47,7 @@ doxygen:
@doxygen Doxyfile
javadoc:
@echo "\033[0;34mGenerating Javadocs into '$(javadocfolder)'...\033[0m"
@echo "\033[0;34mGenerating Javadocs...\033[0m"
@mvn clean javadoc:javadoc
@if [ -f "target/reports/apidocs/index.html" ]; then \
echo "\033[0;32mJavadocs generated in target/reports/apidocs/index.html\033[0m"; \


@@ -1,5 +1,5 @@
<div align="center">
<img src='https://raw.githubusercontent.com/ollama4j/ollama4j/65a9d526150da8fcd98e2af6a164f055572bf722/ollama4j.jpeg' width='100' alt="ollama4j-icon">
<img src='https://raw.githubusercontent.com/ollama4j/ollama4j/refs/heads/main/ollama4j-new.jpeg' width='200' alt="ollama4j-icon">
### Ollama4j

docs/docs/agent.md Normal file

@@ -0,0 +1,60 @@
---
sidebar_position: 4
title: Agents
---
import CodeEmbed from '@site/src/components/CodeEmbed';
# Agents
Build powerful, flexible agents—backed by LLMs and tools—in a few minutes.
Ollama4j's agent system lets you bring together the best of LLM reasoning and external tool use through a simple, declarative YAML configuration. No framework bloat, no complicated setup—just describe your agent, plug in your logic, and go.
---
**Why use agents in Ollama4j?**
- **Effortless Customization:** Instantly adjust your agent's persona, reasoning strategies, or domain by tweaking YAML. No need to touch your compiled Java code.
- **Easy Extensibility:** Want new capabilities? Just add or change tools and logic classes—no framework glue or plumbing required.
- **Fast Experimentation:** Mix and match models, instructions, and tools—prototype sophisticated behaviors or orchestrators in minutes.
- **Clean Separation:** Keep business logic (Java) and agent personality/configuration (YAML) separate for maintainability and clarity.
---
## Define an Agent in YAML
Specify everything about your agent—what LLM it uses, its “personality,” and all callable tools—in a single YAML file.
**Agent YAML keys:**
| Field | Description |
|-------------------------|-----------------------------------------------------------------------------------------------------------------------|
| `name` | Name of your agent. |
| `host` | The base URL for your Ollama server (e.g., `http://localhost:11434`). |
| `model` | The LLM backing your agent (e.g., `llama2`, `mistral`, `mixtral`, etc). |
| `customPrompt` | _(optional)_ System prompt—instructions or persona for your agent. |
| `tools` | List of tools the agent can use. Each tool entry describes the name, function, and parameters. |
| `toolFunctionFQCN` | Fully qualified Java class name implementing the tool logic. Must be present on classpath. |
| `requestTimeoutSeconds` | _(optional)_ How long (seconds) to wait for agent replies. |
YAML makes it effortless to configure and tweak your agent's powers and behavior—no code changes needed!
**Example agent YAML:**
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/resources/agent.yaml" language='yaml'/>
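For orientation, a minimal sketch of what such a file might contain. The top-level field names come from the table above, but every value, the tool name, and the class name below are hypothetical, and the exact schema under `tools` may differ from the embedded example:

```yaml
name: weather-assistant                 # hypothetical agent
host: http://localhost:11434
model: mistral
customPrompt: You are a concise assistant that answers weather questions.
requestTimeoutSeconds: 120
tools:
  - name: get-weather                   # hypothetical tool
    description: Returns the current weather for a given city.
    # Hypothetical class; must be on the classpath.
    toolFunctionFQCN: com.example.tools.WeatherToolFunction
    parameters:
      city:
        type: string
        description: Name of the city to look up.
        required: true
```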
---
## Instantiating and Running Agents in Java
Once your agent is described in YAML, bringing it to life in Java takes only a couple of lines:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/AgentExample.java"/>
- **No boilerplate.** Just load and start chatting or calling tools.
- The API takes care of wiring up LLMs, tool invocation, and instruction handling.
Ready to build your own AI-powered assistant? Just write your YAML, implement the tool logic in Java, and go!
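Under the hood, the loading step amounts to "parse the YAML, then hand the fields to the agent". As a self-contained illustration of that flow—plain Java with no YAML library, handling only flat `key: value` lines, and using `AgentConfigSketch` as a made-up name rather than the actual ollama4j API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: parses a flat "key: value" subset of an agent config.
// The real Agent class uses a proper YAML parser; this only illustrates
// the config-to-object flow.
public class AgentConfigSketch {

    public static Map<String, String> parse(String yaml) {
        Map<String, String> config = new LinkedHashMap<>();
        for (String line : yaml.split("\n")) {
            int colon = line.indexOf(':');
            if (colon > 0 && !line.trim().startsWith("#")) {
                // Split on the first colon only, so URL values survive intact.
                config.put(line.substring(0, colon).trim(),
                           line.substring(colon + 1).trim());
            }
        }
        return config;
    }

    public static void main(String[] args) {
        String yaml = "name: demo-agent\nhost: http://localhost:11434\nmodel: mistral";
        Map<String, String> config = parse(yaml);
        // These fields would feed the agent's constructor.
        System.out.println(config.get("name") + " -> " + config.get("model"));
        // prints: demo-agent -> mistral
    }
}
```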


@@ -1,6 +1,6 @@
{
"label": "APIs - Extras",
"position": 4,
"label": "Extras",
"position": 5,
"link": {
"type": "generated-index",
"description": "Details of APIs to handle bunch of extra stuff."


@@ -1,5 +1,5 @@
{
"label": "APIs - Generate",
"label": "Generate",
"position": 3,
"link": {
"type": "generated-index",


@@ -66,11 +66,11 @@ To use a method as a tool within a chat call, follow these steps:
Let's try an example. Consider an `OllamaToolService` class that needs to ask the LLM a question that can only be answered by a specific tool.
This tool is implemented within a `GlobalConstantGenerator` class. Following is the code that exposes an annotated method as a tool:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/annotated/GlobalConstantGenerator.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/annotated/GlobalConstantGenerator.java"/>
The annotated method can then be used as a tool in the chat session:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/annotated/AnnotatedToolCallingExample.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/AnnotatedToolCallingExample.java"/>
Running the above would produce a response similar to:


@@ -63,7 +63,7 @@ You will get a response similar to:
### Using a simple Console Output Stream Handler
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ConsoleOutputStreamHandlerExample.java" />
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/ChatWithConsoleHandlerExample.java" />
### With a Stream Handler to receive the tokens as they are generated


@@ -19,11 +19,11 @@ You can use this feature to receive both the thinking and the response as separa
You will get a response similar to:
:::tip[Thinking Tokens]
User asks "Who are you?" It's a request for identity. As ChatGPT, we should explain that I'm an AI developed by OpenAI, etc. Provide friendly explanation.
:::
:::tip[Response Tokens]
I'm ChatGPT, a large language model created by OpenAI. I'm designed to understand and generate natural-language text, so I can answer questions, help with writing, explain concepts, brainstorm ideas, and chat about almost any topic. I don't have a personal life or consciousness—I'm a tool that processes input and produces responses based on patterns in the data I was trained on. If you have any questions about how I work or what I can do, feel free to ask!
:::
### Generate response and receive the thinking and response tokens streamed
@@ -34,7 +34,7 @@ You will get a response similar to:
:::tip[Thinking Tokens]
<TypewriterTextarea
textContent={`User asks "Who are you?" It's a request for identity. As ChatGPT, we should explain that I'm an AI developed by OpenAI, etc. Provide friendly explanation.`}
typingSpeed={10}
pauseBetweenSentences={1200}
height="auto"
@@ -45,7 +45,7 @@ style={{ whiteSpace: 'pre-line' }}
:::tip[Response Tokens]
<TypewriterTextarea
textContent={`I'm ChatGPT, a large language model created by OpenAI. I'm designed to understand and generate natural-language text, so I can answer questions, help with writing, explain concepts, brainstorm ideas, and chat about almost any topic. I don't have a personal life or consciousness—I'm a tool that processes input and produces responses based on patterns in the data I was trained on. If you have any questions about how I work or what I can do, feel free to ask!`}
typingSpeed={10}
pauseBetweenSentences={1200}
height="auto"


@@ -3,6 +3,7 @@ sidebar_position: 4
---
import CodeEmbed from '@site/src/components/CodeEmbed';
import TypewriterTextarea from '@site/src/components/TypewriterTextarea';
# Generate with Images
@@ -17,13 +18,11 @@ recommended.
:::
## Synchronous mode
If you have this image downloaded and you pass the path to the downloaded image to the following code:
![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageFile.java" />
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageFileSimple.java" />
You will get a response similar to:
@@ -32,30 +31,22 @@ This image features a white boat with brown cushions, where a dog is sitting on
be enjoying its time outdoors, perhaps on a lake.
:::
# Generate with Image URLs
This API lets you ask questions along with the image files to the LLMs.
This API corresponds to
the [completion](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) API.
:::note
Executing this on Ollama server running in CPU-mode will take longer to generate response. Hence, GPU-mode is
recommended.
:::
## Ask (Sync)
Passing the link of this image the following code:
If you want the response to be streamed, you can use the following code:
![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageURL.java" />
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageFileStreaming.java" />
You will get a response similar to:
:::tip[LLM Response]
This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
be enjoying its time outdoors, perhaps on a lake.
:::tip[Response Tokens]
<TypewriterTextarea
textContent={`This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to be enjoying its time outdoors, perhaps on a lake.`}
typingSpeed={10}
pauseBetweenSentences={1200}
height="auto"
width="100%"
style={{ whiteSpace: 'pre-line' }}
/>
:::


@@ -36,19 +36,19 @@ We can create static functions as our tools.
This function takes the arguments `location` and `fuelType` and performs an operation with these arguments and returns
fuel price value.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/FuelPriceTool.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolfunctions/FuelPriceToolFunction.java"/>
This function takes the argument `city` and performs an operation with the argument and returns the weather for a
location.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/WeatherTool.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolfunctions/WeatherToolFunction.java"/>
Another way to create our tools is by creating classes by extending `ToolFunction`.
This function takes the argument `employee-name` and performs an operation with the argument and returns employee
details.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/tools/DBQueryFunction.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolfunctions/EmployeeFinderToolFunction.java"/>
### Define Tool Specifications
@@ -57,21 +57,21 @@ Lets define a sample tool specification called **Fuel Price Tool** for getting t
- Specify the function `name`, `description`, and `required` properties (`location` and `fuelType`).
- Associate the `getCurrentFuelPrice` function you defined earlier.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/FuelPriceToolSpec.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolspecs/FuelPriceToolSpec.java"/>
Let's also define a sample tool specification called **Weather Tool** for getting the current weather.
- Specify the function `name`, `description`, and `required` property (`city`).
- Associate the `getCurrentWeather` function you defined earlier.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/WeatherToolSpec.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolspecs/WeatherToolSpec.java"/>
Let's also define a sample tool specification called **DBQueryFunction** for getting the employee details from the database.
- Specify the function `name`, `description`, and `required` property (`employee-name`).
- Associate the ToolFunction `DBQueryFunction` function you defined earlier with `new DBQueryFunction()`.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/toolcalling/toolspecs/DatabaseQueryToolSpec.java"/>
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/tools/toolspecs/EmployeeFinderToolSpec.java"/>
Now put it all together by registering the tools and prompting with tools.
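The registration step described above pairs a tool specification (name, description, required parameters) with a function that receives the arguments the model supplies. A self-contained sketch of that shape—the `ToolFunction` interface and registry here are simplified stand-ins, not the actual ollama4j types:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Simplified stand-ins for the tool-calling shape described above;
// not the actual ollama4j ToolFunction/registry types.
public class ToolRegistrySketch {

    // A tool takes the arguments the model supplies and returns a result.
    interface ToolFunction extends Function<Map<String, Object>, Object> {}

    private final Map<String, ToolFunction> tools = new HashMap<>();

    public void register(String name, ToolFunction fn) {
        tools.put(name, fn);
    }

    // Invoked when the model emits a tool call with this name.
    public Object invoke(String name, Map<String, Object> args) {
        return tools.get(name).apply(args);
    }

    public static void main(String[] args) {
        ToolRegistrySketch registry = new ToolRegistrySketch();
        registry.register("get-fuel-price",
                a -> "Fuel price in " + a.get("location")
                        + " for " + a.get("fuelType") + ": 1.99");

        Map<String, Object> callArgs = new HashMap<>();
        callArgs.put("location", "Bengaluru");
        callArgs.put("fuelType", "PETROL");
        System.out.println(registry.invoke("get-fuel-price", callArgs));
        // prints: Fuel price in Bengaluru for PETROL: 1.99
    }
}
```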


@@ -1,5 +1,5 @@
{
"label": "APIs - Manage Models",
"label": "Manage Models",
"position": 2,
"link": {
"type": "generated-index",


@@ -15,13 +15,13 @@ This API lets you create a custom model on the Ollama server.
You would see these logs while the custom model is being created:
```
{"status":"using existing layer sha256:fad2a06e4cc705c2fa8bec5477ddb00dc0c859ac184c34dcc5586663774161ca"}
{"status":"using existing layer sha256:41c2cf8c272f6fb0080a97cd9d9bd7d4604072b80a0b10e7d65ca26ef5000c0c"}
{"status":"using existing layer sha256:1da0581fd4ce92dcf5a66b1da737cf215d8dcf25aa1b98b44443aaf7173155f5"}
{"status":"creating new layer sha256:941b69ca7dc2a85c053c38d9e8029c9df6224e545060954fa97587f87c044a64"}
{"status":"using existing layer sha256:f02dd72bb2423204352eabc5637b44d79d17f109fdb510a7c51455892aa2d216"}
{"status":"writing manifest"}
{"status":"success"}
using existing layer sha256:fad2a06e4cc705c2fa8bec5477ddb00dc0c859ac184c34dcc5586663774161ca
using existing layer sha256:41c2cf8c272f6fb0080a97cd9d9bd7d4604072b80a0b10e7d65ca26ef5000c0c
using existing layer sha256:1da0581fd4ce92dcf5a66b1da737cf215d8dcf25aa1b98b44443aaf7173155f5
creating new layer sha256:941b69ca7dc2a85c053c38d9e8029c9df6224e545060954fa97587f87c044a64
using existing layer sha256:f02dd72bb2423204352eabc5637b44d79d17f109fdb510a7c51455892aa2d216
writing manifest
success
```
Once created, you can see it when you use [list models](./list-models) API.


@@ -1,5 +1,5 @@
---
sidebar_position: 5
sidebar_position: 6
title: Metrics
---
@@ -55,7 +55,7 @@ metrics via `/metrics` endpoint:
</dependency>
```
Here is a sample code snippet demonstrating how to retrieve and print metrics:
Here is a sample code snippet demonstrating how to retrieve and print metrics on Grafana:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/MetricsExample.java" />
@@ -64,8 +64,27 @@ at: http://localhost:8080/metrics
## Integrating with Monitoring Tools
To integrate Ollama4j metrics with external monitoring systems, you can export the metrics endpoint and configure your
monitoring tool to scrape or collect the data. Refer to the [integration guide](../integration/monitoring.md) for
detailed instructions.
### Grafana
For more information on customizing and extending metrics, see the [API documentation](../api/metrics.md).
Use the following sample `docker-compose` file to host a basic Grafana container.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/docker/docker-compose.yml" />
And run:
```shell
docker-compose -f path/to/your/docker-compose.yml up
```
This starts Grafana at http://localhost:3000.
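Grafana typically reads application metrics through a Prometheus data source. Assuming the app exposes metrics at `localhost:8080/metrics` as mentioned earlier, a minimal Prometheus scrape configuration might look like this (the job name and interval below are arbitrary, and the compose file's actual wiring may differ):

```yaml
scrape_configs:
  - job_name: ollama4j-app          # arbitrary label
    scrape_interval: 15s
    static_configs:
      - targets: ["localhost:8080"] # app exposing /metrics
```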
[//]: # (To integrate Ollama4j metrics with external monitoring systems, you can export the metrics endpoint and configure your)
[//]: # (monitoring tool to scrape or collect the data. Refer to the [integration guide]&#40;../integration/monitoring.md&#41; for)
[//]: # (detailed instructions.)
[//]: # ()
[//]: # (For more information on customizing and extending metrics, see the [API documentation]&#40;../api/metrics.md&#41;.)

docs/package-lock.json (generated, 656 lines changed): file diff suppressed because it is too large.


@@ -16,21 +16,21 @@
"dependencies": {
"@docsearch/js": "^4.1.0",
"@docusaurus/core": "^3.9.0",
-"@docusaurus/plugin-google-gtag": "^3.8.1",
-"@docusaurus/preset-classic": "^3.9.0",
-"@docusaurus/theme-mermaid": "^3.9.0",
+"@docusaurus/plugin-google-gtag": "^3.9.1",
+"@docusaurus/preset-classic": "^3.9.1",
+"@docusaurus/theme-mermaid": "^3.9.1",
"@iconify/react": "^6.0.2",
"@mdx-js/react": "^3.1.1",
"clsx": "^2.1.1",
"font-awesome": "^4.7.0",
"prism-react-renderer": "^2.4.1",
-"react": "^19.1.1",
-"react-dom": "^19.1.1",
+"react": "^19.2.0",
+"react-dom": "^19.2.0",
"react-icons": "^5.5.0",
"react-image-gallery": "^1.4.0"
},
"devDependencies": {
-"@docusaurus/module-type-aliases": "^3.8.1",
+"@docusaurus/module-type-aliases": "^3.9.1",
"@docusaurus/types": "^3.4.0"
},
"browserslist": {


@@ -1,84 +1,14 @@
// import React, { useState, useEffect } from 'react';
// import CodeBlock from '@theme/CodeBlock';
// import Icon from '@site/src/components/Icon';
// const CodeEmbed = ({ src }) => {
// const [code, setCode] = useState('');
// const [loading, setLoading] = useState(true);
// const [error, setError] = useState(null);
// useEffect(() => {
// let isMounted = true;
// const fetchCodeFromUrl = async (url) => {
// if (!isMounted) return;
// setLoading(true);
// setError(null);
// try {
// const response = await fetch(url);
// if (!response.ok) {
// throw new Error(`HTTP error! status: ${response.status}`);
// }
// const data = await response.text();
// if (isMounted) {
// setCode(data);
// }
// } catch (err) {
// console.error('Failed to fetch code:', err);
// if (isMounted) {
// setError(err);
// setCode(`// Failed to load code from ${url}\n// ${err.message}`);
// }
// } finally {
// if (isMounted) {
// setLoading(false);
// }
// }
// };
// if (src) {
// fetchCodeFromUrl(src);
// }
// return () => {
// isMounted = false;
// };
// }, [src]);
// const githubUrl = src ? src.replace('https://raw.githubusercontent.com', 'https://github.com').replace('/refs/heads/', '/blob/') : null;
// const fileName = src ? src.substring(src.lastIndexOf('/') + 1) : null;
// return (
// loading ? (
// <div>Loading code...</div>
// ) : error ? (
// <div>Error: {error.message}</div>
// ) : (
// <div style={{ backgroundColor: 'transparent', padding: '0px', borderRadius: '5px' }}>
// <div style={{ textAlign: 'right' }}>
// {githubUrl && (
// <a href={githubUrl} target="_blank" rel="noopener noreferrer" style={{ paddingRight: '15px', color: 'gray', fontSize: '0.8em', fontStyle: 'italic', display: 'inline-flex', alignItems: 'center' }}>
// View on GitHub
// <Icon icon="mdi:github" height="48" />
// </a>
// )}
// </div>
// <CodeBlock title={fileName} className="language-java">{code}</CodeBlock>
// </div>
// )
// );
// };
// export default CodeEmbed;
-import React, { useState, useEffect } from 'react';
+import React, {useState, useEffect} from 'react';
import CodeBlock from '@theme/CodeBlock';
import Icon from '@site/src/components/Icon';
-const CodeEmbed = ({ src }) => {
+/**
+ * CodeEmbed component to display code fetched from a URL in a CodeBlock.
+ * @param {object} props
+ * @param {string} props.src - Source URL to fetch the code from.
+ * @param {string} [props.language='java'] - Language for syntax highlighting in CodeBlock.
+ */
+const CodeEmbed = ({src, language = 'java'}) => {
const [code, setCode] = useState('');
const [loading, setLoading] = useState(true);
const [error, setError] = useState(null);
@@ -127,7 +57,7 @@ const CodeEmbed = ({ src }) => {
const fileName = src ? src.substring(src.lastIndexOf('/') + 1) : null;
const title = (
-<div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
+<div style={{display: 'flex', justifyContent: 'space-between', alignItems: 'center'}}>
<a
href={githubUrl}
target="_blank"
@@ -146,9 +76,15 @@ const CodeEmbed = ({ src }) => {
<span>{fileName}</span>
</a>
{githubUrl && (
-<a href={githubUrl} target="_blank" rel="noopener noreferrer" style={{ color: 'gray', fontSize: '0.9em', fontStyle: 'italic', display: 'inline-flex', alignItems: 'center' }}>
+<a href={githubUrl} target="_blank" rel="noopener noreferrer" style={{
+    color: 'gray',
+    fontSize: '0.9em',
+    fontStyle: 'italic',
+    display: 'inline-flex',
+    alignItems: 'center'
+}}>
View on GitHub
-<Icon icon="mdi:github" height="1em" />
+<Icon icon="mdi:github" height="1em"/>
</a>
)}
</div>
@@ -160,8 +96,8 @@ const CodeEmbed = ({ src }) => {
) : error ? (
<div>Error: {error.message}</div>
) : (
-<div style={{ backgroundColor: 'transparent', padding: '0px', borderRadius: '5px' }}>
-<CodeBlock title={title} className="language-java">{code}</CodeBlock>
+<div style={{backgroundColor: 'transparent', padding: '0px', borderRadius: '5px'}}>
+<CodeBlock title={title} language={language}>{code}</CodeBlock>
</div>
)
);

ollama4j-new.jpeg (new binary file, 67 KiB; binary diff not shown)

pom.xml (21 lines changed)

@@ -150,7 +150,7 @@
<plugin>
<groupId>io.github.git-commit-id</groupId>
<artifactId>git-commit-id-maven-plugin</artifactId>
-<version>9.0.1</version>
+<version>9.0.2</version>
<executions>
<execution>
<goals>
@@ -167,7 +167,7 @@
<plugin>
<groupId>com.diffplug.spotless</groupId>
<artifactId>spotless-maven-plugin</artifactId>
-<version>2.46.1</version>
+<version>3.0.0</version>
<configuration>
<formats>
<!-- you can define as many formats as you want, each is independent -->
@@ -232,7 +232,7 @@
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
-<version>3.14.0</version>
+<version>3.14.1</version>
</plugin>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
@@ -259,6 +259,11 @@
<artifactId>jackson-databind</artifactId>
<version>2.20.0</version>
</dependency>
+<dependency>
+    <groupId>com.fasterxml.jackson.dataformat</groupId>
+    <artifactId>jackson-dataformat-yaml</artifactId>
+    <version>2.20.0</version>
+</dependency>
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
@@ -275,7 +280,6 @@
<artifactId>slf4j-api</artifactId>
<version>2.0.17</version>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
@@ -285,7 +289,7 @@
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
-<version>4.1.0</version>
+<version>5.20.0</version>
<scope>test</scope>
</dependency>
<dependency>
@@ -294,7 +298,6 @@
<version>20250517</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>ollama</artifactId>
@@ -307,14 +310,12 @@
<version>1.21.3</version>
<scope>test</scope>
</dependency>
<!-- Prometheus metrics dependencies -->
<dependency>
<groupId>io.prometheus</groupId>
<artifactId>simpleclient</artifactId>
<version>0.16.0</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
@@ -371,7 +372,7 @@
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
-<version>0.8.11</version>
+<version>0.8.13</version>
<executions>
<execution>
<goals>
@@ -482,7 +483,7 @@
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
-<version>0.8.7</version>
+<version>0.8.13</version>
<executions>
<execution>
<goals>


@@ -70,10 +70,14 @@ public class Ollama {
*/
@Setter private long requestTimeoutSeconds = 10;
/** The read timeout in seconds for image URLs. */
/**
* The read timeout in seconds for image URLs.
*/
@Setter private int imageURLReadTimeoutSeconds = 10;
/** The connect timeout in seconds for image URLs. */
/**
* The connect timeout in seconds for image URLs.
*/
@Setter private int imageURLConnectTimeoutSeconds = 10;
/**
@@ -280,9 +284,9 @@ public class Ollama {
/**
* Handles retry backoff for pullModel.
*
* @param modelName the name of the model being pulled
* @param currentRetry the current retry attempt (zero-based)
* @param maxRetries the maximum number of retries allowed
* @param modelName the name of the model being pulled
* @param currentRetry the current retry attempt (zero-based)
* @param maxRetries the maximum number of retries allowed
* @param baseDelayMillis the base delay in milliseconds for exponential backoff
* @throws InterruptedException if the thread is interrupted during sleep
*/
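The javadoc above describes exponential backoff for `pullModel` retries. As a standalone sketch of the delay logic (method and parameter names mirror the javadoc, but this is not the library's internal implementation):

```java
public class BackoffSketch {
    // Exponential backoff: baseDelayMillis * 2^currentRetry (zero-based retry index).
    static long backoffDelayMillis(long baseDelayMillis, int currentRetry) {
        return baseDelayMillis * (1L << currentRetry);
    }

    // Sleeps for the computed delay; InterruptedException propagates as the javadoc states.
    static void handleRetryBackoff(int currentRetry, int maxRetries, long baseDelayMillis)
            throws InterruptedException {
        if (currentRetry >= maxRetries) {
            throw new IllegalStateException("Maximum retries exceeded");
        }
        Thread.sleep(backoffDelayMillis(baseDelayMillis, currentRetry));
    }

    public static void main(String[] args) {
        // Retries 0..2 with a 100 ms base delay wait 100, 200, then 400 ms.
        for (int retry = 0; retry < 3; retry++) {
            System.out.println("retry " + retry + " -> " + backoffDelayMillis(100, retry) + " ms");
        }
    }
}
```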
@@ -376,7 +380,7 @@ public class Ollama {
* Returns true if the response indicates a successful pull.
*
* @param modelPullResponse the response from the model pull
* @param modelName the name of the model
* @param modelName the name of the model
* @return true if the pull was successful, false otherwise
* @throws OllamaException if the response contains an error
*/
@@ -601,7 +605,7 @@ public class Ollama {
/**
* Deletes a model from the Ollama server.
*
* @param modelName the name of the model to be deleted
* @param modelName the name of the model to be deleted
* @param ignoreIfNotPresent ignore errors if the specified model is not present on the Ollama server
* @throws OllamaException if the response indicates an error status
*/
@@ -758,7 +762,7 @@ public class Ollama {
* Generates a response from a model using the specified parameters and stream observer.
* If {@code streamObserver} is provided, streaming is enabled; otherwise, a synchronous call is made.
*
* @param request the generation request
* @param request the generation request
* @param streamObserver the stream observer for streaming responses, or null for synchronous
* @return the result of the generation
* @throws OllamaException if the request fails
@@ -801,6 +805,7 @@ public class Ollama {
chatRequest.setMessages(msgs);
msgs.add(ocm);
OllamaChatTokenHandler hdlr = null;
chatRequest.setUseTools(true);
chatRequest.setTools(request.getTools());
if (streamObserver != null) {
chatRequest.setStream(true);
@@ -823,10 +828,10 @@ public class Ollama {
/**
* Generates a response from a model asynchronously, returning a streamer for results.
*
* @param model the model name
* @param model the model name
* @param prompt the prompt to send
* @param raw whether to use raw mode
* @param think whether to use "think" mode
* @param raw whether to use raw mode
* @param think whether to use "think" mode
* @return an OllamaAsyncResultStreamer for streaming results
* @throws OllamaException if the request fails
*/
@@ -857,13 +862,13 @@ public class Ollama {
/**
* Sends a chat request to a model using an {@link OllamaChatRequest} and sets up streaming response.
-* This can be constructed using an {@link OllamaChatRequestBuilder}.
+* This can be constructed using an {@link OllamaChatRequest#builder()}.
*
* <p>Note: the OllamaChatRequestModel#getStream() property is not implemented.
*
* @param request request object to be sent to the server
* @param request request object to be sent to the server
* @param tokenHandler callback handler to handle the last token from stream (caution: the
* previous tokens from stream will not be concatenated)
* previous tokens from stream will not be concatenated)
* @return {@link OllamaChatResult}
* @throws OllamaException if the response indicates an error status
*/
@@ -877,7 +882,7 @@ public class Ollama {
// only add tools if tools flag is set
if (request.isUseTools()) {
// add all registered tools to request
-request.setTools(toolRegistry.getRegisteredTools());
+request.getTools().addAll(toolRegistry.getRegisteredTools());
}
if (tokenHandler != null) {
@@ -958,12 +963,16 @@ public class Ollama {
* Registers multiple tools in the tool registry.
*
* @param tools a list of {@link Tools.Tool} objects to register. Each tool contains its
* specification and function.
* specification and function.
*/
public void registerTools(List<Tools.Tool> tools) {
toolRegistry.addTools(tools);
}
public List<Tools.Tool> getRegisteredTools() {
return toolRegistry.getRegisteredTools();
}
/**
* Deregisters all tools from the tool registry. This method removes all registered tools,
* effectively clearing the registry.
@@ -979,7 +988,7 @@ public class Ollama {
* and recursively registers annotated tools from all the providers specified in the annotation.
*
* @throws OllamaException if the caller's class is not annotated with {@link
* OllamaToolService} or if reflection-based instantiation or invocation fails
* OllamaToolService} or if reflection-based instantiation or invocation fails
*/
public void registerAnnotatedTools() throws OllamaException {
try {
@@ -1127,7 +1136,7 @@ public class Ollama {
* This method synchronously calls the Ollama API. If a stream handler is provided,
* the request will be streamed; otherwise, a regular synchronous request will be made.
*
* @param ollamaRequestModel the request model containing necessary parameters for the Ollama API request
* @param ollamaRequestModel the request model containing necessary parameters for the Ollama API request
* @param thinkingStreamHandler the stream handler for "thinking" tokens, or null if not used
* @param responseStreamHandler the stream handler to process streaming responses, or null for non-streaming requests
* @return the result of the Ollama API request


@@ -0,0 +1,318 @@
/*
* Ollama4j - Java library for interacting with Ollama server.
* Copyright (c) 2025 Amith Koujalgi and contributors.
*
* Licensed under the MIT License (the "License");
* you may not use this file except in compliance with the License.
*
*/
package io.github.ollama4j.agent;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
import io.github.ollama4j.Ollama;
import io.github.ollama4j.exceptions.OllamaException;
import io.github.ollama4j.impl.ConsoleOutputGenerateTokenHandler;
import io.github.ollama4j.models.chat.*;
import io.github.ollama4j.tools.ToolFunction;
import io.github.ollama4j.tools.Tools;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;
import lombok.*;
/**
* The {@code Agent} class represents an AI assistant capable of interacting with the Ollama API
* server.
*
* <p>It supports the use of tools (interchangeable code components), persistent chat history, and
* interactive as well as pre-scripted chat sessions.
*
* <h2>Usage</h2>
*
* <ul>
* <li>Instantiate an Agent via {@link #load(String)} for YAML-based configuration.
* <li>Handle conversation turns via {@link #interact(String, OllamaChatStreamObserver)}.
* <li>Use {@link #runInteractive()} for an interactive console-based session.
* </ul>
*/
public class Agent {
/**
* The agent's display name
*/
private final String name;
/**
* List of supported tools for this agent
*/
private final List<Tools.Tool> tools;
/**
* Ollama client instance for communication with the API
*/
private final Ollama ollamaClient;
/**
* The model name used for chat completions
*/
private final String model;
/**
* Persists chat message history across rounds
*/
private final List<OllamaChatMessage> chatHistory;
/**
* Optional custom system prompt for the agent
*/
private final String customPrompt;
/**
* Constructs a new Agent.
*
* @param name The agent's given name.
* @param ollamaClient The Ollama API client instance to use.
* @param model The model name to use for chat completion.
* @param customPrompt A custom prompt to prepend to all conversations (may be null).
* @param tools List of available tools for function calling.
*/
public Agent(
String name,
Ollama ollamaClient,
String model,
String customPrompt,
List<Tools.Tool> tools) {
this.name = name;
this.ollamaClient = ollamaClient;
this.chatHistory = new ArrayList<>();
this.tools = tools;
this.model = model;
this.customPrompt = customPrompt;
}
/**
* Loads and constructs an Agent from a YAML configuration file (classpath or filesystem).
*
* <p>The YAML should define the agent, the model, and the desired tool functions (using their
* fully qualified class names for auto-discovery).
*
* @param yamlPathOrResource Path or classpath resource name of the YAML file.
* @return New Agent instance loaded according to the YAML definition.
* @throws RuntimeException if the YAML cannot be read or agent cannot be constructed.
*/
public static Agent load(String yamlPathOrResource) {
try {
ObjectMapper mapper = new ObjectMapper(new YAMLFactory());
InputStream input =
Agent.class.getClassLoader().getResourceAsStream(yamlPathOrResource);
if (input == null) {
java.nio.file.Path filePath = java.nio.file.Paths.get(yamlPathOrResource);
if (java.nio.file.Files.exists(filePath)) {
input = java.nio.file.Files.newInputStream(filePath);
} else {
throw new RuntimeException(
yamlPathOrResource + " not found in classpath or file system");
}
}
AgentSpec agentSpec = mapper.readValue(input, AgentSpec.class);
List<AgentToolSpec> tools = agentSpec.getTools();
for (AgentToolSpec tool : tools) {
String fqcn = tool.getToolFunctionFQCN();
if (fqcn != null && !fqcn.isEmpty()) {
try {
Class<?> clazz = Class.forName(fqcn);
Object instance = clazz.getDeclaredConstructor().newInstance();
if (instance instanceof ToolFunction) {
tool.setToolFunctionInstance((ToolFunction) instance);
} else {
throw new RuntimeException(
"Class does not implement ToolFunction: " + fqcn);
}
} catch (Exception e) {
throw new RuntimeException(
"Failed to instantiate tool function: " + fqcn, e);
}
}
}
List<Tools.Tool> agentTools = new ArrayList<>();
for (AgentToolSpec a : tools) {
Tools.Tool t = new Tools.Tool();
t.setToolFunction(a.getToolFunctionInstance());
Tools.ToolSpec ts = new Tools.ToolSpec();
ts.setName(a.getName());
ts.setDescription(a.getDescription());
ts.setParameters(a.getParameters());
t.setToolSpec(ts);
agentTools.add(t);
}
Ollama ollama = new Ollama(agentSpec.getHost());
ollama.setRequestTimeoutSeconds(120);
return new Agent(
agentSpec.getName(),
ollama,
agentSpec.getModel(),
agentSpec.getCustomPrompt(),
agentTools);
} catch (Exception e) {
throw new RuntimeException("Failed to load agent from YAML", e);
}
}
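`load()` above resolves each tool's fully qualified class name reflectively and type-checks the instance. The same pattern can be shown in isolation (using a local stand-in interface rather than the library's `ToolFunction`):

```java
import java.util.Map;

public class FqcnLoaderSketch {
    // Local stand-in for the library's ToolFunction interface.
    public interface ToolFunction {
        Object apply(Map<String, Object> args);
    }

    // Example tool with the required no-arg constructor.
    public static class EchoTool implements ToolFunction {
        @Override
        public Object apply(Map<String, Object> args) {
            return args.get("text");
        }
    }

    // Mirrors Agent.load(): resolve the FQCN, instantiate, and verify the type.
    static ToolFunction instantiate(String fqcn) {
        try {
            Class<?> clazz = Class.forName(fqcn);
            Object instance = clazz.getDeclaredConstructor().newInstance();
            if (!(instance instanceof ToolFunction)) {
                throw new RuntimeException("Class does not implement ToolFunction: " + fqcn);
            }
            return (ToolFunction) instance;
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException("Failed to instantiate tool function: " + fqcn, e);
        }
    }

    public static void main(String[] args) {
        ToolFunction tool = instantiate("FqcnLoaderSketch$EchoTool");
        System.out.println(tool.apply(Map.of("text", "hello"))); // prints "hello"
    }
}
```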
/**
* Facilitates a single round of chat for the agent:
*
* <ul>
* <li>Builds/promotes the system prompt on the first turn if necessary
* <li>Adds the user's input to chat history
* <li>Submits the chat turn to the Ollama model (with tool/function support)
* <li>Updates internal chat history in accordance with the Ollama chat result
* </ul>
*
* @param userInput The user's message or question for the agent.
* @param chatTokenHandler Stream observer that receives response and thinking tokens as they arrive; may be null for a non-streaming call.
* @return The model's response as a string.
* @throws OllamaException If there is a problem with the Ollama API.
*/
public String interact(String userInput, OllamaChatStreamObserver chatTokenHandler)
throws OllamaException {
// Build a concise and readable description of available tools
String availableToolsDescription =
tools.isEmpty()
? ""
: tools.stream()
.map(
t ->
String.format(
"- %s: %s",
t.getToolSpec().getName(),
t.getToolSpec().getDescription() != null
? t.getToolSpec().getDescription()
: "No description"))
.reduce((a, b) -> a + "\n" + b)
.map(desc -> "\nYou have access to the following tools:\n" + desc)
.orElse("");
// Add system prompt if chatHistory is empty
if (chatHistory.isEmpty()) {
String systemPrompt =
String.format(
"You are a helpful AI assistant named %s. Your actions are limited to"
+ " using the available tools. %s%s",
name,
(customPrompt != null ? customPrompt : ""),
availableToolsDescription);
chatHistory.add(new OllamaChatMessage(OllamaChatMessageRole.SYSTEM, systemPrompt));
}
// Add the user input as a message before sending request
chatHistory.add(new OllamaChatMessage(OllamaChatMessageRole.USER, userInput));
OllamaChatRequest request =
OllamaChatRequest.builder()
.withTools(tools)
.withUseTools(true)
.withModel(model)
.withMessages(chatHistory)
.build();
OllamaChatResult response = ollamaClient.chat(request, chatTokenHandler);
// Update chat history for continuity
chatHistory.clear();
chatHistory.addAll(response.getChatHistory());
return response.getResponseModel().getMessage().getResponse();
}
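The tool-description and system-prompt construction at the top of `interact()` can be exercised on its own. A minimal sketch with a local record standing in for `Tools.ToolSpec`:

```java
import java.util.List;
import java.util.stream.Collectors;

public class PromptSketch {
    // Local stand-in for Tools.ToolSpec, holding only what the prompt needs.
    record ToolSpec(String name, String description) {}

    // Mirrors interact(): one "- name: description" line per tool, with a fallback
    // for missing descriptions, prefixed by the "available tools" header.
    static String describeTools(List<ToolSpec> tools) {
        if (tools.isEmpty()) {
            return "";
        }
        String body = tools.stream()
                .map(t -> String.format("- %s: %s",
                        t.name(),
                        t.description() != null ? t.description() : "No description"))
                .collect(Collectors.joining("\n"));
        return "\nYou have access to the following tools:\n" + body;
    }

    public static void main(String[] args) {
        System.out.println(describeTools(List.of(
                new ToolSpec("weather", "Gets current weather"),
                new ToolSpec("calculator", null))));
    }
}
```

Since this suffix is appended to the system message only when `chatHistory` is empty, it is sent exactly once per session.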
/**
* Launches an endless interactive console session with the agent, echoing user input and the
* agent's response using the provided chat model and tools.
*
* <p>Type {@code exit} to break the loop and terminate the session.
*
* @throws OllamaException if any errors occur talking to the Ollama API.
*/
public void runInteractive() throws OllamaException {
Scanner sc = new Scanner(System.in);
while (true) {
System.out.print("\n[You]: ");
String input = sc.nextLine();
if ("exit".equalsIgnoreCase(input)) break;
this.interact(
input,
new OllamaChatStreamObserver(
new ConsoleOutputGenerateTokenHandler(),
new ConsoleOutputGenerateTokenHandler()));
}
}
/**
* Bean describing an agent as definable from YAML.
*
* <ul>
* <li>{@code name}: Agent display name
* <li>{@code description}: Freeform description
* <li>{@code tools}: List of tools/functions to enable
* <li>{@code host}: Target Ollama host address
* <li>{@code model}: Name of Ollama model to use
* <li>{@code customPrompt}: Agent's custom base prompt
* <li>{@code requestTimeoutSeconds}: Timeout for requests
* </ul>
*/
@Data
public static class AgentSpec {
private String name;
private String description;
private List<AgentToolSpec> tools;
private String host;
private String model;
private String customPrompt;
private int requestTimeoutSeconds;
}
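Given the `AgentSpec` fields above (and `toolFunctionFQCN` from `AgentToolSpec` below), a YAML file consumable by `Agent#load(String)` might look like the following. The field names come from the beans; the host, model, and class-name values are purely illustrative:

```yaml
name: WeatherAgent
description: Answers weather questions using registered tools
host: http://localhost:11434
model: llama3.2
customPrompt: Always answer concisely.
requestTimeoutSeconds: 120
tools:
  - name: get-weather
    description: Fetches the current weather for a city
    toolFunctionFQCN: com.example.tools.WeatherToolFunction   # hypothetical implementation class
```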
/**
* Subclass extension of {@link Tools.ToolSpec}, which allows associating a tool with a function
* implementation (via FQCN).
*/
@Data
@Setter
@Getter
@EqualsAndHashCode(callSuper = false)
private static class AgentToolSpec extends Tools.ToolSpec {
/**
* Fully qualified class name of the tool's {@link ToolFunction} implementation
*/
private String toolFunctionFQCN = null;
/**
* Instance of the {@link ToolFunction} to invoke
*/
private ToolFunction toolFunctionInstance = null;
}
/**
* Bean for describing a tool function parameter for use in agent YAML definitions.
*/
@Data
public class AgentToolParameter {
/**
* The parameter's type (e.g., string, number, etc.)
*/
private String type;
/**
* Description of the parameter
*/
private String description;
/**
* Whether this parameter is required
*/
private boolean required;
/**
* Enum values (if any) that this parameter may take; _enum used because 'enum' is reserved
*/
private List<String> _enum; // `enum` is a reserved keyword, so use _enum or similar
}
}


@@ -11,6 +11,9 @@ package io.github.ollama4j.models.chat;
import io.github.ollama4j.models.request.OllamaCommonRequest;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OllamaRequestBody;
import io.github.ollama4j.utils.Options;
import java.io.File;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import lombok.Getter;
@@ -20,8 +23,8 @@ import lombok.Setter;
* Defines a Request to use against the ollama /api/chat endpoint.
*
* @see <a href=
* "https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion">Generate
* Chat Completion</a>
* "https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion">Generate
* Chat Completion</a>
*/
@Getter
@Setter
@@ -36,11 +39,15 @@ public class OllamaChatRequest extends OllamaCommonRequest implements OllamaRequ
/**
* Controls whether tools are automatically executed.
*
* <p>If set to {@code true} (the default), tools will be automatically used/applied by the
* library. If set to {@code false}, tool calls will be returned to the client for manual
* <p>
* If set to {@code true} (the default), tools will be automatically
* used/applied by the
* library. If set to {@code false}, tool calls will be returned to the client
* for manual
* handling.
*
* <p>Disabling this should be an explicit operation.
* <p>
* Disabling this should be an explicit operation.
*/
private boolean useTools = true;
@@ -57,7 +64,116 @@ public class OllamaChatRequest extends OllamaCommonRequest implements OllamaRequ
if (!(o instanceof OllamaChatRequest)) {
return false;
}
return this.toString().equals(o.toString());
}
// --- Builder-like fluent API methods ---
public static OllamaChatRequest builder() {
OllamaChatRequest req = new OllamaChatRequest();
req.setMessages(new ArrayList<>());
return req;
}
public OllamaChatRequest withModel(String model) {
this.setModel(model);
return this;
}
public OllamaChatRequest withMessage(OllamaChatMessageRole role, String content) {
return withMessage(role, content, Collections.emptyList());
}
public OllamaChatRequest withMessage(
OllamaChatMessageRole role, String content, List<OllamaChatToolCalls> toolCalls) {
if (this.messages == null || this.messages == Collections.EMPTY_LIST) {
this.messages = new ArrayList<>();
}
this.messages.add(new OllamaChatMessage(role, content, null, toolCalls, null));
return this;
}
public OllamaChatRequest withMessage(
OllamaChatMessageRole role,
String content,
List<OllamaChatToolCalls> toolCalls,
List<File> images) {
if (this.messages == null || this.messages == Collections.EMPTY_LIST) {
this.messages = new ArrayList<>();
}
List<byte[]> imagesAsBytes = new ArrayList<>();
if (images != null) {
for (File image : images) {
try {
imagesAsBytes.add(java.nio.file.Files.readAllBytes(image.toPath()));
} catch (java.io.IOException e) {
throw new RuntimeException(
"Failed to read image file: " + image.getAbsolutePath(), e);
}
}
}
this.messages.add(new OllamaChatMessage(role, content, null, toolCalls, imagesAsBytes));
return this;
}
public OllamaChatRequest withMessages(List<OllamaChatMessage> messages) {
this.setMessages(messages);
return this;
}
public OllamaChatRequest withOptions(Options options) {
if (options != null) {
this.setOptions(options.getOptionsMap());
}
return this;
}
public OllamaChatRequest withGetJsonResponse() {
this.setFormat("json");
return this;
}
public OllamaChatRequest withTemplate(String template) {
this.setTemplate(template);
return this;
}
public OllamaChatRequest withStreaming() {
this.setStream(true);
return this;
}
public OllamaChatRequest withKeepAlive(String keepAlive) {
this.setKeepAlive(keepAlive);
return this;
}
public OllamaChatRequest withThinking(boolean think) {
this.setThink(think);
return this;
}
public OllamaChatRequest withUseTools(boolean useTools) {
this.setUseTools(useTools);
return this;
}
public OllamaChatRequest withTools(List<Tools.Tool> tools) {
this.setTools(tools);
return this;
}
public OllamaChatRequest build() {
return this;
}
public void reset() {
// Only clear the messages, keep model and think as is
if (this.messages == null || this.messages == Collections.EMPTY_LIST) {
this.messages = new ArrayList<>();
} else {
this.messages.clear();
}
}
}
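Note that this fluent API is builder-*like* rather than a classic builder: `builder()` returns a fresh `OllamaChatRequest`, each `withX` mutates it, and `build()` returns the same instance. A compact stand-in class (not the library type) illustrates the pattern and its implication:

```java
import java.util.ArrayList;
import java.util.List;

public class FluentRequestSketch {
    // Minimal stand-in mirroring OllamaChatRequest's fluent style.
    static class ChatRequest {
        String model;
        boolean useTools;
        final List<String> messages = new ArrayList<>();

        static ChatRequest builder() { return new ChatRequest(); }
        ChatRequest withModel(String model) { this.model = model; return this; }
        ChatRequest withUseTools(boolean useTools) { this.useTools = useTools; return this; }
        ChatRequest withMessage(String role, String content) {
            messages.add(role + ": " + content);
            return this;
        }
        // build() returns the same mutated instance, as in the class above.
        ChatRequest build() { return this; }
    }

    public static void main(String[] args) {
        ChatRequest req = ChatRequest.builder()
                .withModel("llama3.2")
                .withUseTools(true)
                .withMessage("user", "Hello!")
                .build();
        System.out.println(req.model + " / " + req.messages);
    }
}
```

Because `build()` does not copy, mutating the request after `build()` (or calling `reset()`) affects the "built" object too; reuse across calls should go through `reset()` as the class intends.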


@@ -1,176 +0,0 @@
/*
* Ollama4j - Java library for interacting with Ollama server.
* Copyright (c) 2025 Amith Koujalgi and contributors.
*
* Licensed under the MIT License (the "License");
* you may not use this file except in compliance with the License.
*
*/
package io.github.ollama4j.models.chat;
import io.github.ollama4j.utils.Options;
import io.github.ollama4j.utils.Utils;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
import lombok.Setter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/** Helper class for creating {@link OllamaChatRequest} objects using the builder-pattern. */
public class OllamaChatRequestBuilder {
private static final Logger LOG = LoggerFactory.getLogger(OllamaChatRequestBuilder.class);
private int imageURLConnectTimeoutSeconds = 10;
private int imageURLReadTimeoutSeconds = 10;
private OllamaChatRequest request;
@Setter private boolean useTools = true;
private OllamaChatRequestBuilder() {
request = new OllamaChatRequest();
request.setMessages(new ArrayList<>());
}
public static OllamaChatRequestBuilder builder() {
return new OllamaChatRequestBuilder();
}
public OllamaChatRequestBuilder withImageURLConnectTimeoutSeconds(
int imageURLConnectTimeoutSeconds) {
this.imageURLConnectTimeoutSeconds = imageURLConnectTimeoutSeconds;
return this;
}
public OllamaChatRequestBuilder withImageURLReadTimeoutSeconds(int imageURLReadTimeoutSeconds) {
this.imageURLReadTimeoutSeconds = imageURLReadTimeoutSeconds;
return this;
}
public OllamaChatRequestBuilder withModel(String model) {
request.setModel(model);
return this;
}
public void reset() {
request = new OllamaChatRequest(request.getModel(), request.isThink(), new ArrayList<>());
}
public OllamaChatRequestBuilder withMessage(OllamaChatMessageRole role, String content) {
return withMessage(role, content, Collections.emptyList());
}
public OllamaChatRequestBuilder withMessage(
OllamaChatMessageRole role, String content, List<OllamaChatToolCalls> toolCalls) {
List<OllamaChatMessage> messages = this.request.getMessages();
messages.add(new OllamaChatMessage(role, content, null, toolCalls, null));
return this;
}
public OllamaChatRequestBuilder withMessage(
OllamaChatMessageRole role,
String content,
List<OllamaChatToolCalls> toolCalls,
List<File> images) {
List<OllamaChatMessage> messages = this.request.getMessages();
List<byte[]> binaryImages =
images.stream()
.map(
file -> {
try {
return Files.readAllBytes(file.toPath());
} catch (IOException e) {
LOG.warn(
"File '{}' could not be accessed, will not add to"
+ " message!",
file.toPath(),
e);
return new byte[0];
}
})
.collect(Collectors.toList());
messages.add(new OllamaChatMessage(role, content, null, toolCalls, binaryImages));
return this;
}
public OllamaChatRequestBuilder withMessage(
OllamaChatMessageRole role,
String content,
List<OllamaChatToolCalls> toolCalls,
String... imageUrls)
throws IOException, InterruptedException {
List<OllamaChatMessage> messages = this.request.getMessages();
List<byte[]> binaryImages = null;
if (imageUrls.length > 0) {
binaryImages = new ArrayList<>();
for (String imageUrl : imageUrls) {
try {
binaryImages.add(
Utils.loadImageBytesFromUrl(
imageUrl,
imageURLConnectTimeoutSeconds,
imageURLReadTimeoutSeconds));
} catch (InterruptedException e) {
LOG.error("Failed to load image from URL: '{}'. Cause: {}", imageUrl, e);
Thread.currentThread().interrupt();
throw new InterruptedException(
"Interrupted while loading image from URL: " + imageUrl);
} catch (IOException e) {
LOG.error(
"IOException occurred while loading image from URL '{}'. Cause: {}",
imageUrl,
e.getMessage(),
e);
throw new IOException(
"IOException while loading image from URL: " + imageUrl, e);
}
}
}
messages.add(new OllamaChatMessage(role, content, null, toolCalls, binaryImages));
return this;
}
public OllamaChatRequestBuilder withMessages(List<OllamaChatMessage> messages) {
request.setMessages(messages);
return this;
}
public OllamaChatRequestBuilder withOptions(Options options) {
this.request.setOptions(options.getOptionsMap());
return this;
}
public OllamaChatRequestBuilder withGetJsonResponse() {
this.request.setFormat("json");
return this;
}
public OllamaChatRequestBuilder withTemplate(String template) {
this.request.setTemplate(template);
return this;
}
public OllamaChatRequestBuilder withStreaming() {
this.request.setStream(true);
return this;
}
public OllamaChatRequestBuilder withKeepAlive(String keepAlive) {
this.request.setKeepAlive(keepAlive);
return this;
}
public OllamaChatRequestBuilder withThinking(boolean think) {
this.request.setThink(think);
return this;
}
public OllamaChatRequest build() {
request.setUseTools(useTools);
return request;
}
}


@@ -11,7 +11,14 @@ package io.github.ollama4j.models.generate;
import io.github.ollama4j.models.request.OllamaCommonRequest;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OllamaRequestBody;
import io.github.ollama4j.utils.Options;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;
import java.util.Map;
import lombok.Getter;
import lombok.Setter;
@@ -41,6 +48,100 @@ public class OllamaGenerateRequest extends OllamaCommonRequest implements Ollama
this.images = images;
}
// --- Builder-style methods ---
public static OllamaGenerateRequest builder() {
return new OllamaGenerateRequest();
}
public OllamaGenerateRequest withPrompt(String prompt) {
this.setPrompt(prompt);
return this;
}
public OllamaGenerateRequest withTools(List<Tools.Tool> tools) {
this.setTools(tools);
return this;
}
public OllamaGenerateRequest withModel(String model) {
this.setModel(model);
return this;
}
public OllamaGenerateRequest withGetJsonResponse() {
this.setFormat("json");
return this;
}
public OllamaGenerateRequest withOptions(Options options) {
this.setOptions(options.getOptionsMap());
return this;
}
public OllamaGenerateRequest withTemplate(String template) {
this.setTemplate(template);
return this;
}
public OllamaGenerateRequest withStreaming(boolean streaming) {
this.setStream(streaming);
return this;
}
public OllamaGenerateRequest withKeepAlive(String keepAlive) {
this.setKeepAlive(keepAlive);
return this;
}
public OllamaGenerateRequest withRaw(boolean raw) {
this.setRaw(raw);
return this;
}
public OllamaGenerateRequest withThink(boolean think) {
this.setThink(think);
return this;
}
public OllamaGenerateRequest withUseTools(boolean useTools) {
this.setUseTools(useTools);
return this;
}
public OllamaGenerateRequest withFormat(Map<String, Object> format) {
this.setFormat(format);
return this;
}
public OllamaGenerateRequest withSystem(String system) {
this.setSystem(system);
return this;
}
public OllamaGenerateRequest withContext(String context) {
this.setContext(context);
return this;
}
public OllamaGenerateRequest withImagesBase64(List<String> images) {
this.setImages(images);
return this;
}
public OllamaGenerateRequest withImages(List<File> imageFiles) throws IOException {
List<String> images = new ArrayList<>();
for (File imageFile : imageFiles) {
images.add(Base64.getEncoder().encodeToString(Files.readAllBytes(imageFile.toPath())));
}
this.setImages(images);
return this;
}
public OllamaGenerateRequest build() {
return this;
}
@Override
public boolean equals(Object o) {
if (!(o instanceof OllamaGenerateRequest)) {

View File
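
The change above folds the former separate builder into `OllamaGenerateRequest` itself: a static `builder()` hands back a fresh instance, each `withX` method mutates and returns `this`, and `build()` returns the same object rather than a copy. A minimal sketch of this "request as its own builder" idiom, with illustrative field names only:

```java
// Sketch of the merged request/builder idiom adopted above.
// Field and method names are illustrative, not ollama4j's real ones.
public class Request {
    private String model;
    private String prompt;

    public static Request builder() {
        return new Request();
    }

    public Request withModel(String model) {
        this.model = model;
        return this;
    }

    public Request withPrompt(String prompt) {
        this.prompt = prompt;
        return this;
    }

    // build() simply returns this: no copy is made, so the "builder"
    // and the built request are the same mutable object.
    public Request build() {
        return this;
    }

    public String getModel() { return model; }
    public String getPrompt() { return prompt; }
}
```

Because `build()` returns `this`, builder state is shared with the built request, which can matter when a builder instance is reused across several `build()` calls.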

@@ -1,121 +0,0 @@
/*
* Ollama4j - Java library for interacting with Ollama server.
* Copyright (c) 2025 Amith Koujalgi and contributors.
*
* Licensed under the MIT License (the "License");
* you may not use this file except in compliance with the License.
*
*/
package io.github.ollama4j.models.generate;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.Options;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;
/** Helper class for creating {@link OllamaGenerateRequest} objects using the builder-pattern. */
public class OllamaGenerateRequestBuilder {
private OllamaGenerateRequestBuilder() {
request = new OllamaGenerateRequest();
}
private OllamaGenerateRequest request;
public static OllamaGenerateRequestBuilder builder() {
return new OllamaGenerateRequestBuilder();
}
public OllamaGenerateRequest build() {
return request;
}
public OllamaGenerateRequestBuilder withPrompt(String prompt) {
request.setPrompt(prompt);
return this;
}
public OllamaGenerateRequestBuilder withTools(List<Tools.Tool> tools) {
request.setTools(tools);
return this;
}
public OllamaGenerateRequestBuilder withModel(String model) {
request.setModel(model);
return this;
}
public OllamaGenerateRequestBuilder withGetJsonResponse() {
this.request.setFormat("json");
return this;
}
public OllamaGenerateRequestBuilder withOptions(Options options) {
this.request.setOptions(options.getOptionsMap());
return this;
}
public OllamaGenerateRequestBuilder withTemplate(String template) {
this.request.setTemplate(template);
return this;
}
public OllamaGenerateRequestBuilder withStreaming(boolean streaming) {
this.request.setStream(streaming);
return this;
}
public OllamaGenerateRequestBuilder withKeepAlive(String keepAlive) {
this.request.setKeepAlive(keepAlive);
return this;
}
public OllamaGenerateRequestBuilder withRaw(boolean raw) {
this.request.setRaw(raw);
return this;
}
public OllamaGenerateRequestBuilder withThink(boolean think) {
this.request.setThink(think);
return this;
}
public OllamaGenerateRequestBuilder withUseTools(boolean useTools) {
this.request.setUseTools(useTools);
return this;
}
public OllamaGenerateRequestBuilder withFormat(java.util.Map<String, Object> format) {
this.request.setFormat(format);
return this;
}
public OllamaGenerateRequestBuilder withSystem(String system) {
this.request.setSystem(system);
return this;
}
public OllamaGenerateRequestBuilder withContext(String context) {
this.request.setContext(context);
return this;
}
public OllamaGenerateRequestBuilder withImagesBase64(java.util.List<String> images) {
this.request.setImages(images);
return this;
}
public OllamaGenerateRequestBuilder withImages(java.util.List<File> imageFiles)
throws IOException {
java.util.List<String> images = new ArrayList<>();
for (File imageFile : imageFiles) {
images.add(Base64.getEncoder().encodeToString(Files.readAllBytes(imageFile.toPath())));
}
this.request.setImages(images);
return this;
}
}

View File

@@ -96,7 +96,6 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
getRequestBuilderDefault(uri).POST(body.getBodyPublisher());
HttpRequest request = requestBuilder.build();
LOG.debug("Asking model: {}", body);
System.out.println("Asking model: " + Utils.toJSON(body));
HttpResponse<InputStream> response =
httpClient.send(request, HttpResponse.BodyHandlers.ofInputStream());
@@ -142,7 +141,6 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
responseBuffer);
if (statusCode != 200) {
LOG.error("Status code: {}", statusCode);
System.out.println(responseBuffer);
throw new OllamaException(responseBuffer.toString());
}
if (wantedToolsForStream != null && ollamaChatResponseModel != null) {

View File

@@ -136,18 +136,21 @@ public class OllamaGenerateEndpointCaller extends OllamaEndpointCaller {
thinkingBuffer.toString(),
endTime - startTime,
statusCode);
ollamaResult.setModel(ollamaGenerateResponseModel.getModel());
ollamaResult.setCreatedAt(ollamaGenerateResponseModel.getCreatedAt());
ollamaResult.setDone(ollamaGenerateResponseModel.isDone());
ollamaResult.setDoneReason(ollamaGenerateResponseModel.getDoneReason());
ollamaResult.setContext(ollamaGenerateResponseModel.getContext());
ollamaResult.setTotalDuration(ollamaGenerateResponseModel.getTotalDuration());
ollamaResult.setLoadDuration(ollamaGenerateResponseModel.getLoadDuration());
ollamaResult.setPromptEvalCount(ollamaGenerateResponseModel.getPromptEvalCount());
ollamaResult.setPromptEvalDuration(ollamaGenerateResponseModel.getPromptEvalDuration());
ollamaResult.setEvalCount(ollamaGenerateResponseModel.getEvalCount());
ollamaResult.setEvalDuration(ollamaGenerateResponseModel.getEvalDuration());
if (ollamaGenerateResponseModel != null) {
ollamaResult.setModel(ollamaGenerateResponseModel.getModel());
ollamaResult.setCreatedAt(ollamaGenerateResponseModel.getCreatedAt());
ollamaResult.setDone(ollamaGenerateResponseModel.isDone());
ollamaResult.setDoneReason(ollamaGenerateResponseModel.getDoneReason());
ollamaResult.setContext(ollamaGenerateResponseModel.getContext());
ollamaResult.setTotalDuration(ollamaGenerateResponseModel.getTotalDuration());
ollamaResult.setLoadDuration(ollamaGenerateResponseModel.getLoadDuration());
ollamaResult.setPromptEvalCount(ollamaGenerateResponseModel.getPromptEvalCount());
ollamaResult.setPromptEvalDuration(
ollamaGenerateResponseModel.getPromptEvalDuration());
ollamaResult.setEvalCount(ollamaGenerateResponseModel.getEvalCount());
ollamaResult.setEvalDuration(ollamaGenerateResponseModel.getEvalDuration());
}
LOG.debug("Model plain response: {}", ollamaGenerateResponseModel);
LOG.debug("Model response: {}", ollamaResult);
return ollamaResult;
}
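
The hunk above wraps the field-copying block in a null check so that a missing response model no longer risks a `NullPointerException`. A tiny self-contained sketch of the same guard (all type and field names hypothetical):

```java
// Sketch of the null-guard applied above: copy fields from the parsed
// response only when parsing actually produced one.
class ResponseModel {
    String model = "llama3";
    long evalCount = 42;
}

class Result {
    String model;
    long evalCount;
}

public class NullGuardDemo {
    static Result toResult(ResponseModel rm) {
        Result r = new Result();
        if (rm != null) { // rm may be null when no response body was parsed
            r.model = rm.model;
            r.evalCount = rm.evalCount;
        }
        return r; // empty-but-valid result instead of an NPE
    }
}
```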

View File

@@ -11,7 +11,11 @@ package io.github.ollama4j.tools;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
@@ -116,4 +120,53 @@ public class Tools {
@JsonIgnore private boolean required;
}
public static List<Tool> fromJSONFile(String filePath, Map<String, ToolFunction> functionMap) {
try {
ObjectMapper mapper = new ObjectMapper();
List<Map<String, Object>> rawTools =
mapper.readValue(new File(filePath), new TypeReference<>() {});
List<Tool> tools = new ArrayList<>();
for (Map<String, Object> rawTool : rawTools) {
String json = mapper.writeValueAsString(rawTool);
Tool tool = mapper.readValue(json, Tool.class);
ToolFunction function = functionMap.get(tool.getToolSpec().getName());
if (function != null) {
tool.setToolFunction(function);
}
tools.add(tool);
}
return tools;
} catch (Exception e) {
throw new RuntimeException("Failed to load tools from file: " + filePath, e);
}
}
public static List<Tool> fromYAMLFile(String filePath, Map<String, ToolFunction> functionMap) {
try {
ObjectMapper mapper = new ObjectMapper(new YAMLFactory());
List<Map<String, Object>> rawTools =
mapper.readValue(new File(filePath), new TypeReference<>() {});
List<Tool> tools = new ArrayList<>();
for (Map<String, Object> rawTool : rawTools) {
String yaml = mapper.writeValueAsString(rawTool);
Tool tool = mapper.readValue(yaml, Tool.class);
String toolName = tool.getToolSpec().getName();
ToolFunction function = functionMap.get(toolName);
if (function != null) {
tool.setToolFunction(function);
}
tools.add(tool);
}
return tools;
} catch (Exception e) {
throw new RuntimeException("Failed to load tools from YAML file: " + filePath, e);
}
}
}
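
`fromYAMLFile` above round-trips each list entry through the YAML mapper into a `Tool`, then wires up its `ToolFunction` by name from the supplied map. A hypothetical configuration file might look like this (field names beyond `toolSpec`/`name` are illustrative; the real schema is whatever Jackson derives from the `Tool` class):

```yaml
# Hypothetical tools.yaml consumed by Tools.fromYAMLFile.
- toolSpec:
    name: get-weather
    description: Fetches the current weather for a city
- toolSpec:
    name: calculator
    description: Evaluates a basic arithmetic expression
```

Loaded with something like `Tools.fromYAMLFile("tools.yaml", Map.of("get-weather", weatherFn, "calculator", calcFn))`. Note that entries whose names have no entry in the function map are still returned, just without a `ToolFunction` attached.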

View File

@@ -18,7 +18,6 @@ import io.github.ollama4j.models.chat.*;
import io.github.ollama4j.models.embed.OllamaEmbedRequest;
import io.github.ollama4j.models.embed.OllamaEmbedResult;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateRequestBuilder;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
import io.github.ollama4j.models.response.Model;
import io.github.ollama4j.models.response.ModelDetail;
@@ -272,7 +271,7 @@ class OllamaIntegrationTest {
format.put("required", List.of("isNoon"));
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(TOOLS_MODEL)
.withPrompt(prompt)
.withFormat(format)
@@ -299,7 +298,7 @@ class OllamaIntegrationTest {
boolean raw = false;
boolean thinking = false;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt(
"What is the capital of France? And what's France's connection with"
@@ -327,7 +326,7 @@ class OllamaIntegrationTest {
api.pullModel(GENERAL_PURPOSE_MODEL);
boolean raw = false;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt(
"What is the capital of France? And what's France's connection with"
@@ -357,8 +356,7 @@ class OllamaIntegrationTest {
void shouldGenerateWithCustomOptions() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.SYSTEM,
@@ -390,8 +388,7 @@ class OllamaIntegrationTest {
String expectedResponse = "Bhai";
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.SYSTEM,
@@ -429,8 +426,7 @@ class OllamaIntegrationTest {
@Order(10)
void shouldChatWithHistory() throws Exception {
api.pullModel(THINKING_TOOL_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(THINKING_TOOL_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(THINKING_TOOL_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
@@ -481,8 +477,7 @@ class OllamaIntegrationTest {
void shouldChatWithExplicitTool() throws OllamaException {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerTool(EmployeeFinderToolSpec.getSpecification());
@@ -534,8 +529,7 @@ class OllamaIntegrationTest {
void shouldChatWithExplicitToolAndUseTools() throws OllamaException {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerTool(EmployeeFinderToolSpec.getSpecification());
@@ -579,8 +573,7 @@ class OllamaIntegrationTest {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerTool(EmployeeFinderToolSpec.getSpecification());
@@ -633,8 +626,7 @@ class OllamaIntegrationTest {
void shouldChatWithAnnotatedToolSingleParam() throws OllamaException {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerAnnotatedTools();
@@ -680,8 +672,7 @@ class OllamaIntegrationTest {
void shouldChatWithAnnotatedToolMultipleParams() throws OllamaException {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerAnnotatedTools(new AnnotatedTool());
@@ -712,8 +703,7 @@ class OllamaIntegrationTest {
void shouldChatWithStream() throws OllamaException {
api.deregisterTools();
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -739,8 +729,7 @@ class OllamaIntegrationTest {
@Order(15)
void shouldChatWithThinkingAndStream() throws OllamaException {
api.pullModel(THINKING_TOOL_MODEL_2);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(THINKING_TOOL_MODEL_2);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(THINKING_TOOL_MODEL_2);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -758,32 +747,6 @@ class OllamaIntegrationTest {
assertNotNull(chatResult.getResponseModel().getMessage().getResponse());
}
/**
* Tests chat API with an image input from a URL.
*
* <p>Scenario: Sends a user message with an image URL and verifies the assistant's response.
* Usage: chat, vision model, image from URL, no tools, no thinking, no streaming.
*/
@Test
@Order(10)
void shouldChatWithImageFromURL() throws OllamaException, IOException, InterruptedException {
api.pullModel(VISION_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(VISION_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
"What's in the picture?",
Collections.emptyList(),
"https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg")
.build();
api.registerAnnotatedTools(new OllamaIntegrationTest());
OllamaChatResult chatResult = api.chat(requestModel, null);
assertNotNull(chatResult);
}
/**
* Tests chat API with an image input from a file and multi-turn history.
*
@@ -795,8 +758,7 @@ class OllamaIntegrationTest {
@Order(10)
void shouldChatWithImageFromFileAndHistory() throws OllamaException {
api.pullModel(VISION_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(VISION_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(VISION_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -832,7 +794,7 @@ class OllamaIntegrationTest {
api.pullModel(VISION_MODEL);
try {
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(VISION_MODEL)
.withPrompt("What is in this image?")
.withRaw(false)
@@ -865,7 +827,7 @@ class OllamaIntegrationTest {
void shouldGenerateWithImageFilesAndResponseStreamed() throws OllamaException, IOException {
api.pullModel(VISION_MODEL);
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(VISION_MODEL)
.withPrompt("What is in this image?")
.withRaw(false)
@@ -900,7 +862,7 @@ class OllamaIntegrationTest {
boolean think = true;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(THINKING_TOOL_MODEL)
.withPrompt("Who are you?")
.withRaw(raw)
@@ -929,7 +891,7 @@ class OllamaIntegrationTest {
api.pullModel(THINKING_TOOL_MODEL);
boolean raw = false;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(THINKING_TOOL_MODEL)
.withPrompt("Who are you?")
.withRaw(raw)
@@ -967,7 +929,7 @@ class OllamaIntegrationTest {
boolean raw = true;
boolean thinking = false;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("What is 2+2?")
.withRaw(raw)
@@ -995,7 +957,7 @@ class OllamaIntegrationTest {
api.pullModel(GENERAL_PURPOSE_MODEL);
boolean raw = true;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("What is the largest planet in our solar system?")
.withRaw(raw)
@@ -1028,7 +990,7 @@ class OllamaIntegrationTest {
// 'response' tokens
boolean raw = true;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(THINKING_TOOL_MODEL)
.withPrompt(
"Count 1 to 5. Just give me the numbers and do not give any other"
@@ -1093,7 +1055,7 @@ class OllamaIntegrationTest {
format.put("required", List.of("cities"));
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(TOOLS_MODEL)
.withPrompt(prompt)
.withFormat(format)
@@ -1119,8 +1081,7 @@ class OllamaIntegrationTest {
@Order(26)
void shouldChatWithThinkingNoStream() throws OllamaException {
api.pullModel(THINKING_TOOL_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(THINKING_TOOL_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(THINKING_TOOL_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -1149,8 +1110,7 @@ class OllamaIntegrationTest {
void shouldChatWithCustomOptionsAndStreaming() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -1184,8 +1144,7 @@ class OllamaIntegrationTest {
api.registerTool(EmployeeFinderToolSpec.getSpecification());
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(THINKING_TOOL_MODEL_2);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(THINKING_TOOL_MODEL_2);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -1219,8 +1178,7 @@ class OllamaIntegrationTest {
File image1 = getImageFileFromClasspath("emoji-smile.jpeg");
File image2 = getImageFileFromClasspath("roses.jpg");
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(VISION_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(VISION_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -1247,7 +1205,7 @@ class OllamaIntegrationTest {
void shouldHandleNonExistentModel() {
String nonExistentModel = "this-model-does-not-exist:latest";
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(nonExistentModel)
.withPrompt("Hello")
.withRaw(false)
@@ -1274,8 +1232,7 @@ class OllamaIntegrationTest {
api.pullModel(GENERAL_PURPOSE_MODEL);
List<OllamaChatToolCalls> tools = Collections.emptyList();
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(OllamaChatMessageRole.USER, " ", tools) // whitespace only
.build();
@@ -1298,7 +1255,7 @@ class OllamaIntegrationTest {
void shouldGenerateWithExtremeParameters() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("Generate a random word")
.withRaw(false)
@@ -1351,8 +1308,7 @@ class OllamaIntegrationTest {
void shouldChatWithKeepAlive() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(OllamaChatMessageRole.USER, "Hello, how are you?")
.withKeepAlive("5m") // Keep model loaded for 5 minutes
@@ -1376,7 +1332,7 @@ class OllamaIntegrationTest {
void shouldGenerateWithAdvancedOptions() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("Write a detailed explanation of machine learning")
.withRaw(false)
@@ -1421,8 +1377,8 @@ class OllamaIntegrationTest {
new Thread(
() -> {
try {
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder()
OllamaChatRequest builder =
OllamaChatRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(

View File

@@ -13,7 +13,6 @@ import static org.junit.jupiter.api.Assertions.*;
import io.github.ollama4j.Ollama;
import io.github.ollama4j.exceptions.OllamaException;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateRequestBuilder;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.samples.AnnotatedTool;
@@ -205,7 +204,7 @@ public class WithAuth {
format.put("required", List.of("isNoon"));
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(model)
.withPrompt(prompt)
.withRaw(false)

View File

@@ -19,7 +19,6 @@ import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.embed.OllamaEmbedRequest;
import io.github.ollama4j.models.embed.OllamaEmbedResult;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateRequestBuilder;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
import io.github.ollama4j.models.request.CustomModelRequest;
import io.github.ollama4j.models.response.ModelDetail;
@@ -158,7 +157,7 @@ class TestMockedAPIs {
OllamaGenerateStreamObserver observer = new OllamaGenerateStreamObserver(null, null);
try {
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(model)
.withPrompt(prompt)
.withRaw(false)
@@ -180,7 +179,7 @@ class TestMockedAPIs {
String prompt = "some prompt text";
try {
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(model)
.withPrompt(prompt)
.withRaw(false)
@@ -206,7 +205,7 @@ class TestMockedAPIs {
String prompt = "some prompt text";
try {
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(model)
.withPrompt(prompt)
.withRaw(false)

View File

@@ -12,15 +12,14 @@ import static org.junit.jupiter.api.Assertions.*;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import org.junit.jupiter.api.Test;
class TestOllamaChatRequestBuilder {
@Test
void testResetClearsMessagesButKeepsModelAndThink() {
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder()
OllamaChatRequest builder =
OllamaChatRequest.builder()
.withModel("my-model")
.withThinking(true)
.withMessage(OllamaChatMessageRole.USER, "first");

View File

@@ -13,7 +13,6 @@ import static org.junit.jupiter.api.Assertions.assertThrowsExactly;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.utils.OptionsBuilder;
import java.io.File;
import java.util.Collections;
@@ -24,11 +23,11 @@ import org.junit.jupiter.api.Test;
public class TestChatRequestSerialization extends AbstractSerializationTest<OllamaChatRequest> {
private OllamaChatRequestBuilder builder;
private OllamaChatRequest builder;
@BeforeEach
public void init() {
builder = OllamaChatRequestBuilder.builder().withModel("DummyModel");
builder = OllamaChatRequest.builder().withModel("DummyModel");
}
@Test

View File

@@ -11,7 +11,6 @@ package io.github.ollama4j.unittests.jackson;
import static org.junit.jupiter.api.Assertions.assertEquals;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateRequestBuilder;
import io.github.ollama4j.utils.OptionsBuilder;
import org.json.JSONObject;
import org.junit.jupiter.api.BeforeEach;
@@ -19,16 +18,17 @@ import org.junit.jupiter.api.Test;
class TestGenerateRequestSerialization extends AbstractSerializationTest<OllamaGenerateRequest> {
private OllamaGenerateRequestBuilder builder;
private OllamaGenerateRequest builder;
@BeforeEach
public void init() {
builder = OllamaGenerateRequestBuilder.builder().withModel("Dummy Model");
builder = OllamaGenerateRequest.builder().withModel("Dummy Model");
}
@Test
public void testRequestOnlyMandatoryFields() {
OllamaGenerateRequest req = builder.withPrompt("Some prompt").build();
OllamaGenerateRequest req =
builder.withPrompt("Some prompt").withModel("Dummy Model").build();
String jsonRequest = serialize(req);
assertEqualsAfterUnmarshalling(deserialize(jsonRequest, OllamaGenerateRequest.class), req);
@@ -38,7 +38,10 @@ class TestGenerateRequestSerialization extends AbstractSerializationTest<OllamaG
public void testRequestWithOptions() {
OptionsBuilder b = new OptionsBuilder();
OllamaGenerateRequest req =
builder.withPrompt("Some prompt").withOptions(b.setMirostat(1).build()).build();
builder.withPrompt("Some prompt")
.withOptions(b.setMirostat(1).build())
.withModel("Dummy Model")
.build();
String jsonRequest = serialize(req);
OllamaGenerateRequest deserializeRequest =
@@ -49,7 +52,11 @@ class TestGenerateRequestSerialization extends AbstractSerializationTest<OllamaG
@Test
public void testWithJsonFormat() {
OllamaGenerateRequest req = builder.withPrompt("Some prompt").withGetJsonResponse().build();
OllamaGenerateRequest req =
builder.withPrompt("Some prompt")
.withGetJsonResponse()
.withModel("Dummy Model")
.build();
String jsonRequest = serialize(req);
System.out.println(jsonRequest);