20 Commits
1.1.3 ... 1.1.4

Author SHA1 Message Date
Amith Koujalgi
a44bd46af7 Merge pull request #236 from ollama4j/feature/thinking-param-update
Refactor Ollama API to use ThinkMode enum for "think" parameter
2025-11-07 15:24:10 +05:30
Amith Koujalgi
074ac712ca Refactor Ollama API to use ThinkMode enum for "think" parameter
- Addresses #231
- Updated the Ollama class and related methods to replace the boolean "think" parameter with the ThinkMode enum, giving clearer control over thinking levels.
- Modified MetricsRecorder to accept ThinkMode instead of boolean for metrics recording.
- Adjusted OllamaChatRequest and OllamaGenerateRequest to utilize ThinkMode, including serialization support.
- Updated integration and unit tests to reflect changes in the "think" parameter handling.
- Introduced ThinkMode and ThinkModeSerializer classes to manage the new thinking parameter structure.
2025-11-07 15:17:22 +05:30
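The boolean-to-enum change described above can be sketched in isolation. The `fromLegacyBoolean` helper below is hypothetical (not part of ollama4j); it only illustrates how former boolean "think" callers map onto the two base modes, with the string levels added on top:

```java
public class ThinkParamMigration {
    // Mirrors the ThinkMode values introduced in this release.
    enum ThinkMode { DISABLED, ENABLED, LOW, MEDIUM, HIGH }

    // Hypothetical helper: how an old boolean "think" flag maps onto the enum.
    static ThinkMode fromLegacyBoolean(boolean think) {
        return think ? ThinkMode.ENABLED : ThinkMode.DISABLED;
    }

    public static void main(String[] args) {
        System.out.println(fromLegacyBoolean(true));  // ENABLED
        System.out.println(fromLegacyBoolean(false)); // DISABLED
    }
}
```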
Amith Koujalgi
84e1950864 Merge pull request #232 from ollama4j/dependabot/maven/com.fasterxml.jackson.core-jackson-databind-2.20.1
Bump com.fasterxml.jackson.core:jackson-databind from 2.20.0 to 2.20.1
2025-11-05 19:01:29 +05:30
Amith Koujalgi
68e4a17b3d Merge pull request #233 from ollama4j/dependabot/maven/com.fasterxml.jackson.dataformat-jackson-dataformat-yaml-2.20.1
Bump com.fasterxml.jackson.dataformat:jackson-dataformat-yaml from 2.20.0 to 2.20.1
2025-11-05 19:01:12 +05:30
Amith Koujalgi
a822e08b7a Merge pull request #234 from ollama4j/dependabot/maven/com.fasterxml.jackson.datatype-jackson-datatype-jsr310-2.20.1
Bump com.fasterxml.jackson.datatype:jackson-datatype-jsr310 from 2.20.0 to 2.20.1
2025-11-05 00:31:57 +05:30
Amith Koujalgi
a70925d842 Merge pull request #235 from ollama4j/dependabot/maven/org.junit.jupiter-junit-jupiter-api-6.0.1
Bump org.junit.jupiter:junit-jupiter-api from 6.0.0 to 6.0.1
2025-11-03 14:42:28 +05:30
dependabot[bot]
256ea0b2a4 Bump org.junit.jupiter:junit-jupiter-api from 6.0.0 to 6.0.1
Bumps [org.junit.jupiter:junit-jupiter-api](https://github.com/junit-team/junit-framework) from 6.0.0 to 6.0.1.
- [Release notes](https://github.com/junit-team/junit-framework/releases)
- [Commits](https://github.com/junit-team/junit-framework/compare/r6.0.0...r6.0.1)

---
updated-dependencies:
- dependency-name: org.junit.jupiter:junit-jupiter-api
  dependency-version: 6.0.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-03 00:48:56 +00:00
dependabot[bot]
03772e09eb Bump com.fasterxml.jackson.datatype:jackson-datatype-jsr310
Bumps com.fasterxml.jackson.datatype:jackson-datatype-jsr310 from 2.20.0 to 2.20.1.

---
updated-dependencies:
- dependency-name: com.fasterxml.jackson.datatype:jackson-datatype-jsr310
  dependency-version: 2.20.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-03 00:48:47 +00:00
dependabot[bot]
c0da1a14cd Bump com.fasterxml.jackson.dataformat:jackson-dataformat-yaml
Bumps [com.fasterxml.jackson.dataformat:jackson-dataformat-yaml](https://github.com/FasterXML/jackson-dataformats-text) from 2.20.0 to 2.20.1.
- [Commits](https://github.com/FasterXML/jackson-dataformats-text/compare/jackson-dataformats-text-2.20.0...jackson-dataformats-text-2.20.1)

---
updated-dependencies:
- dependency-name: com.fasterxml.jackson.dataformat:jackson-dataformat-yaml
  dependency-version: 2.20.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-03 00:48:43 +00:00
dependabot[bot]
7ba5682e9e Bump com.fasterxml.jackson.core:jackson-databind from 2.20.0 to 2.20.1
Bumps [com.fasterxml.jackson.core:jackson-databind](https://github.com/FasterXML/jackson) from 2.20.0 to 2.20.1.
- [Commits](https://github.com/FasterXML/jackson/commits)

---
updated-dependencies:
- dependency-name: com.fasterxml.jackson.core:jackson-databind
  dependency-version: 2.20.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-03 00:48:32 +00:00
Amith Koujalgi
13ccba40f7 Merge pull request #228 from ollama4j/dependabot/npm_and_yarn/docs/docsearch/js-4.2.0
Bump @docsearch/js from 4.1.0 to 4.2.0 in /docs
2025-11-02 21:39:34 +05:30
Amith Koujalgi
30e5598468 Merge pull request #229 from ollama4j/dependabot/maven/org.testcontainers-ollama-1.21.3
Bump org.testcontainers:ollama from 1.20.2 to 1.21.3
2025-11-02 21:39:17 +05:30
Amith Koujalgi
24a37ed858 Merge pull request #230 from ollama4j/snyk-upgrade-3dd2ca8c00ec72375652b43f81db0c03
[Snyk] Upgrade @docsearch/js from 4.1.0 to 4.2.0
2025-11-02 21:38:53 +05:30
Amith Koujalgi
fc63e1f786 Merge pull request #226 from ollama4j/fix-docs-build
Fix: setting tools in `generate()` method and issue #227
2025-11-02 21:37:55 +05:30
amithkoujalgi
6623c94e92 Improve code documentation in OllamaChatEndpointCaller by enhancing comments for clarity and ensuring proper null checks for message handling in streamed responses. 2025-11-02 21:18:57 +05:30
snyk-bot
349066faf9 fix: upgrade @docsearch/js from 4.1.0 to 4.2.0
Snyk has created this PR to upgrade @docsearch/js from 4.1.0 to 4.2.0.

See this package in npm:
@docsearch/js

See this project in Snyk:
https://app.snyk.io/org/koujalgi.amith/project/9edb01b5-ef5b-48ce-87c6-70599c1d338c?utm_source=github&utm_medium=referral&page=upgrade-pr
2025-11-02 09:21:30 +00:00
dependabot[bot]
995fbb14ba Bump org.testcontainers:ollama from 1.20.2 to 1.21.3
Bumps [org.testcontainers:ollama](https://github.com/testcontainers/testcontainers-java) from 1.20.2 to 1.21.3.
- [Release notes](https://github.com/testcontainers/testcontainers-java/releases)
- [Changelog](https://github.com/testcontainers/testcontainers-java/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testcontainers/testcontainers-java/compare/1.20.2...1.21.3)

---
updated-dependencies:
- dependency-name: org.testcontainers:ollama
  dependency-version: 1.21.3
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-27 00:45:18 +00:00
dependabot[bot]
4a1161b88a Bump @docsearch/js from 4.1.0 to 4.2.0 in /docs
Bumps [@docsearch/js](https://github.com/algolia/docsearch/tree/HEAD/packages/docsearch-js) from 4.1.0 to 4.2.0.
- [Release notes](https://github.com/algolia/docsearch/releases)
- [Changelog](https://github.com/algolia/docsearch/blob/main/CHANGELOG.md)
- [Commits](https://github.com/algolia/docsearch/commits/v4.2.0/packages/docsearch-js)

---
updated-dependencies:
- dependency-name: "@docsearch/js"
  dependency-version: 4.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-27 00:33:19 +00:00
amithkoujalgi
c628d1fa20 Enhance documentation on tool creation by outlining two methods: manual registration of static/regular methods and annotation-based tool discovery. Include a link for further reading on annotation-based registration. 2025-10-24 10:39:39 +05:30
amithkoujalgi
614f7422b6 Refactor tool handling in Ollama class to merge request-specific and globally registered tools into a single list, ensuring original requests remain unmodified. Also, remove unnecessary newline in documentation for model pulling command. 2025-10-23 22:20:24 +05:30
20 changed files with 290 additions and 116 deletions


@@ -40,7 +40,7 @@ integration-tests-basic: apply-formatting
integration-tests-remote: apply-formatting
@echo "\033[0;34mRunning integration tests (remote - all)...\033[0m"
@export USE_EXTERNAL_OLLAMA_HOST=true && export OLLAMA_HOST=http://192.168.29.229:11434 && mvn clean verify -Pintegration-tests -Dgpg.skip=true
@export USE_EXTERNAL_OLLAMA_HOST=true && export OLLAMA_HOST=http://192.168.29.224:11434 && mvn clean verify -Pintegration-tests -Dgpg.skip=true
doxygen:
@echo "\033[0;34mGenerating documentation with Doxygen...\033[0m"


@@ -209,7 +209,6 @@ To download/pull the model into your Ollama server, run the following command in
```shell
ollama pull mistral
```
You can list the models available on your model server by running the following command in your terminal.


@@ -31,7 +31,19 @@ You could do that with ease with the `function calling` capabilities of the mode
### Create Tools/Functions
We can create static functions as our tools.
There are two ways to create and register your tools:
1. **Define static or regular methods and register them explicitly as tools.**
You can create standalone functions (static or instance methods) and manually associate them with your tool specifications.
2. **Use annotation-based tool discovery for automatic registration.**
By annotating your tool methods, you can leverage `registerAnnotatedTools()` to automatically scan your classpath, find all annotated tool functions, and register them without extra boilerplate.
Learn more about annotation-based tool registration [here](/apis-generate/chat-with-tools#annotation-based-tool-registration).
Choose the approach that best fits your project: manual for precise control, or annotation-based for easier scaling.
Let's start by exploring the first approach: manually defining and registering your tools/functions.
This function takes the arguments `location` and `fuelType`, performs an operation with them, and returns the fuel price value.
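A minimal, self-contained sketch of such a function follows; the class name, argument handling, and the hard-coded price are illustrative assumptions, not the library's API:

```java
import java.util.Map;

public class FuelPriceTool {
    // Illustrative tool function: takes "location" and "fuelType" arguments
    // and returns a fuel price string. A real tool would query a pricing service.
    public static String getFuelPrice(Map<String, Object> arguments) {
        String location = String.valueOf(arguments.get("location"));
        String fuelType = String.valueOf(arguments.get("fuelType"));
        return "Current price of " + fuelType + " in " + location + " is 103.5 INR";
    }

    public static void main(String[] args) {
        System.out.println(getFuelPrice(Map.of("location", "Bengaluru", "fuelType", "petrol")));
    }
}
```

The same signature style (a `Map` of named arguments in, a `String` out) is what makes a plain static method easy to register against a tool specification.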


@@ -8,7 +8,7 @@
"name": "ollama-4-j",
"version": "0.0.0",
"dependencies": {
"@docsearch/js": "^4.1.0",
"@docsearch/js": "^4.2.0",
"@docusaurus/core": "^3.9.2",
"@docusaurus/plugin-content-docs": "^3.9.2",
"@docusaurus/plugin-google-gtag": "^3.9.2",
@@ -3347,9 +3347,9 @@
"license": "MIT"
},
"node_modules/@docsearch/js": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/@docsearch/js/-/js-4.1.0.tgz",
"integrity": "sha512-49+CzeGfOiwG85k+dDvKfOsXLd9PQACoY/FLrZfFOKmpWv166u7bAHmBLdzvxlk8nJ289UgpGf0k6GQZtC85Fg==",
"version": "4.2.0",
"resolved": "https://registry.npmjs.org/@docsearch/js/-/js-4.2.0.tgz",
"integrity": "sha512-KBHVPO29QiGUFJYeAqxW0oXtGf/aghNmRrIRPT4/28JAefqoCkNn/ZM/jeQ7fHjl0KNM6C+KlLVYjwyz6lNZnA==",
"license": "MIT"
},
"node_modules/@docsearch/react": {


@@ -14,7 +14,7 @@
"write-heading-ids": "docusaurus write-heading-ids"
},
"dependencies": {
"@docsearch/js": "^4.1.0",
"@docsearch/js": "^4.2.0",
"@docusaurus/core": "^3.9.2",
"@docusaurus/plugin-google-gtag": "^3.9.2",
"@docusaurus/preset-classic": "^3.9.2",

pom.xml

@@ -258,17 +258,17 @@
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.20.0</version>
<version>2.20.1</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-yaml</artifactId>
<version>2.20.0</version>
<version>2.20.1</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
<version>2.20.0</version>
<version>2.20.1</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
@@ -284,7 +284,7 @@
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>6.0.0</version>
<version>6.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
@@ -302,7 +302,7 @@
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>ollama</artifactId>
<version>1.20.2</version>
<version>1.21.3</version>
<scope>test</scope>
</dependency>
<dependency>


@@ -183,7 +183,16 @@ public class Ollama {
throw new OllamaException("Ping failed", e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -232,7 +241,16 @@ public class Ollama {
throw new OllamaException("ps failed", e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -277,7 +295,16 @@ public class Ollama {
throw new OllamaException(e.getMessage(), e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -371,7 +398,16 @@ public class Ollama {
throw new OllamaException(e.getMessage(), e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -446,7 +482,16 @@ public class Ollama {
throw new OllamaException(e.getMessage(), e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -534,7 +579,16 @@ public class Ollama {
throw new OllamaException(e.getMessage(), e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -598,7 +652,16 @@ public class Ollama {
throw new OllamaException(e.getMessage(), e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -650,7 +713,16 @@ public class Ollama {
throw new OllamaException(statusCode + " - " + out, e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -712,7 +784,16 @@ public class Ollama {
throw new OllamaException(statusCode + " - " + out, e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -754,7 +835,16 @@ public class Ollama {
throw new OllamaException(e.getMessage(), e);
} finally {
MetricsRecorder.record(
url, "", false, false, false, null, null, startTime, statusCode, out);
url,
"",
false,
ThinkMode.DISABLED,
false,
null,
null,
startTime,
statusCode,
out);
}
}
@@ -776,7 +866,7 @@ public class Ollama {
}
if (streamObserver != null) {
if (request.isThink()) {
if (!request.getThink().equals(ThinkMode.DISABLED)) {
return generateSyncForOllamaRequestModel(
request,
streamObserver.getThinkingStreamHandler(),
@@ -804,9 +894,21 @@ public class Ollama {
ocm.setResponse(request.getPrompt());
chatRequest.setMessages(msgs);
msgs.add(ocm);
// Merge request's tools and globally registered tools into a new list to avoid mutating the
// original request
List<Tools.Tool> allTools = new ArrayList<>();
if (request.getTools() != null) {
allTools.addAll(request.getTools());
}
List<Tools.Tool> registeredTools = this.getRegisteredTools();
if (registeredTools != null) {
allTools.addAll(registeredTools);
}
OllamaChatTokenHandler hdlr = null;
chatRequest.setUseTools(true);
chatRequest.setTools(request.getTools());
chatRequest.setTools(allTools);
if (streamObserver != null) {
chatRequest.setStream(true);
if (streamObserver.getResponseStreamHandler() != null) {
@@ -836,7 +938,7 @@ public class Ollama {
* @throws OllamaException if the request fails
*/
public OllamaAsyncResultStreamer generateAsync(
String model, String prompt, boolean raw, boolean think) throws OllamaException {
String model, String prompt, boolean raw, ThinkMode think) throws OllamaException {
long startTime = System.currentTimeMillis();
String url = "/api/generate";
int statusCode = -1;
@@ -1175,7 +1277,7 @@ public class Ollama {
OllamaGenerateEndpointCaller.endpoint,
ollamaRequestModel.getModel(),
ollamaRequestModel.isRaw(),
ollamaRequestModel.isThink(),
ollamaRequestModel.getThink(),
ollamaRequestModel.isStream(),
ollamaRequestModel.getOptions(),
ollamaRequestModel.getFormat(),


@@ -9,6 +9,7 @@
package io.github.ollama4j.metrics;
import com.google.common.base.Throwables;
import io.github.ollama4j.models.request.ThinkMode;
import io.prometheus.client.Counter;
import io.prometheus.client.Histogram;
import java.util.Map;
@@ -57,7 +58,7 @@ public class MetricsRecorder {
String endpoint,
String model,
boolean raw,
boolean thinking,
ThinkMode thinkMode,
boolean streaming,
Map<String, Object> options,
Object format,
@@ -83,7 +84,7 @@ public class MetricsRecorder {
safe(model),
String.valueOf(raw),
String.valueOf(streaming),
String.valueOf(thinking),
String.valueOf(thinkMode),
httpStatus,
safe(mapToString(options)),
safe(formatString))
@@ -97,7 +98,7 @@ public class MetricsRecorder {
safe(model),
String.valueOf(raw),
String.valueOf(streaming),
String.valueOf(thinking),
String.valueOf(thinkMode),
httpStatus,
safe(mapToString(options)),
safe(formatString))


@@ -9,6 +9,8 @@
package io.github.ollama4j.models.chat;
import io.github.ollama4j.models.request.OllamaCommonRequest;
import io.github.ollama4j.models.request.ThinkMode;
import io.github.ollama4j.models.request.ThinkModeSerializer;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OllamaRequestBody;
import io.github.ollama4j.utils.Options;
@@ -30,30 +32,27 @@ import lombok.Setter;
@Setter
public class OllamaChatRequest extends OllamaCommonRequest implements OllamaRequestBody {
private List<OllamaChatMessage> messages = Collections.emptyList();
private List<OllamaChatMessage> messages = new ArrayList<>();
private List<Tools.Tool> tools;
private List<Tools.Tool> tools = new ArrayList<>();
private boolean think;
@com.fasterxml.jackson.databind.annotation.JsonSerialize(using = ThinkModeSerializer.class)
private ThinkMode think;
/**
* Controls whether tools are automatically executed.
*
* <p>
* If set to {@code true} (the default), tools will be automatically
* used/applied by the
* library. If set to {@code false}, tool calls will be returned to the client
* for manual
* <p>If set to {@code true} (the default), tools will be automatically used/applied by the
* library. If set to {@code false}, tool calls will be returned to the client for manual
* handling.
*
* <p>
* Disabling this should be an explicit operation.
* <p>Disabling this should be an explicit operation.
*/
private boolean useTools = true;
public OllamaChatRequest() {}
public OllamaChatRequest(String model, boolean think, List<OllamaChatMessage> messages) {
public OllamaChatRequest(String model, ThinkMode think, List<OllamaChatMessage> messages) {
this.model = model;
this.messages = messages;
this.think = think;
@@ -81,7 +80,7 @@ public class OllamaChatRequest extends OllamaCommonRequest implements OllamaRequ
}
public OllamaChatRequest withMessage(OllamaChatMessageRole role, String content) {
return withMessage(role, content, Collections.emptyList());
return withMessage(role, content, new ArrayList<>());
}
public OllamaChatRequest withMessage(
@@ -149,7 +148,7 @@ public class OllamaChatRequest extends OllamaCommonRequest implements OllamaRequ
return this;
}
public OllamaChatRequest withThinking(boolean think) {
public OllamaChatRequest withThinking(ThinkMode think) {
this.setThink(think);
return this;
}


@@ -9,6 +9,8 @@
package io.github.ollama4j.models.generate;
import io.github.ollama4j.models.request.OllamaCommonRequest;
import io.github.ollama4j.models.request.ThinkMode;
import io.github.ollama4j.models.request.ThinkModeSerializer;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OllamaRequestBody;
import io.github.ollama4j.utils.Options;
@@ -31,7 +33,10 @@ public class OllamaGenerateRequest extends OllamaCommonRequest implements Ollama
private String system;
private String context;
private boolean raw;
private boolean think;
@com.fasterxml.jackson.databind.annotation.JsonSerialize(using = ThinkModeSerializer.class)
private ThinkMode think;
private boolean useTools;
private List<Tools.Tool> tools;
@@ -99,7 +104,7 @@ public class OllamaGenerateRequest extends OllamaCommonRequest implements Ollama
return this;
}
public OllamaGenerateRequest withThink(boolean think) {
public OllamaGenerateRequest withThink(ThinkMode think) {
this.setThink(think);
return this;
}


@@ -44,10 +44,13 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
/**
* Parses streamed Response line from ollama chat. Using {@link
* com.fasterxml.jackson.databind.ObjectMapper#readValue(String, TypeReference)} should throw
* com.fasterxml.jackson.databind.ObjectMapper#readValue(String, TypeReference)}
* should throw
* {@link IllegalArgumentException} in case of null line or {@link
* com.fasterxml.jackson.core.JsonParseException} in case the JSON Object cannot be parsed to a
* {@link OllamaChatResponseModel}. Thus, the ResponseModel should never be null.
* com.fasterxml.jackson.core.JsonParseException} in case the JSON Object cannot
* be parsed to a
* {@link OllamaChatResponseModel}. Thus, the ResponseModel should never be
* null.
*
* @param line streamed line of ollama stream response
* @param responseBuffer Stringbuffer to add latest response message part to
@@ -59,9 +62,11 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
try {
OllamaChatResponseModel ollamaResponseModel =
Utils.getObjectMapper().readValue(line, OllamaChatResponseModel.class);
// It seems that under heavy load Ollama responds with an empty chat message part in the
// It seems that under heavy load Ollama responds with an empty chat message
// part in the
// streamed response.
// Thus, we null check the message and hope that the next streamed response has some
// Thus, we null check the message and hope that the next streamed response has
// some
// message content again.
OllamaChatMessage message = ollamaResponseModel.getMessage();
if (message != null) {
@@ -118,7 +123,9 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
parseResponseAndAddToBuffer(line, responseBuffer, thinkingBuffer);
ollamaChatResponseModel =
Utils.getObjectMapper().readValue(line, OllamaChatResponseModel.class);
if (body.stream && ollamaChatResponseModel.getMessage().getToolCalls() != null) {
if (body.stream
&& ollamaChatResponseModel.getMessage() != null
&& ollamaChatResponseModel.getMessage().getToolCalls() != null) {
wantedToolsForStream = ollamaChatResponseModel.getMessage().getToolCalls();
}
if (finished && body.stream) {
@@ -132,7 +139,7 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
endpoint,
body.getModel(),
false,
body.isThink(),
body.getThink(),
body.isStream(),
body.getOptions(),
body.getFormat(),
@@ -153,7 +160,8 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
}
/**
* Handles error status codes and appends error messages to the response buffer. Returns true if
* Handles error status codes and appends error messages to the response buffer.
* Returns true if
* an error was handled, false otherwise.
*/
private boolean handleErrorStatus(int statusCode, String line, StringBuilder responseBuffer)


@@ -0,0 +1,31 @@
/*
* Ollama4j - Java library for interacting with Ollama server.
* Copyright (c) 2025 Amith Koujalgi and contributors.
*
* Licensed under the MIT License (the "License");
* you may not use this file except in compliance with the License.
*
*/
package io.github.ollama4j.models.request;
/**
* Represents the "think" parameter for Ollama API requests.
* Controls the level or nature of "thinking" performed by the model.
*/
public enum ThinkMode {
DISABLED(Boolean.FALSE),
ENABLED(Boolean.TRUE),
LOW("low"),
MEDIUM("medium"),
HIGH("high");
private final Object value;
ThinkMode(Object value) {
this.value = value;
}
public Object getValue() {
return value;
}
}
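Because each constant carries either a Boolean or a String, serialization has to branch on the value type. This stdlib-only sketch (the `toJsonValue` helper is hypothetical; the library itself uses a Jackson serializer) shows the JSON fragment each mode produces:

```java
public class ThinkModeJsonDemo {
    // Re-declaration of the enum above, with an illustrative JSON helper.
    enum ThinkMode {
        DISABLED(Boolean.FALSE), ENABLED(Boolean.TRUE), LOW("low"), MEDIUM("medium"), HIGH("high");

        private final Object value;
        ThinkMode(Object value) { this.value = value; }

        // Hypothetical helper mirroring the serializer's branching: boolean
        // values are written raw, string levels as quoted JSON strings.
        String toJsonValue() {
            return (value instanceof Boolean) ? value.toString() : "\"" + value + "\"";
        }
    }

    public static void main(String[] args) {
        System.out.println(ThinkMode.DISABLED.toJsonValue()); // false
        System.out.println(ThinkMode.HIGH.toJsonValue());     // "high"
    }
}
```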


@@ -0,0 +1,29 @@
/*
* Ollama4j - Java library for interacting with Ollama server.
* Copyright (c) 2025 Amith Koujalgi and contributors.
*
* Licensed under the MIT License (the "License");
* you may not use this file except in compliance with the License.
*
*/
package io.github.ollama4j.models.request;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import java.io.IOException;
public class ThinkModeSerializer extends JsonSerializer<ThinkMode> {
@Override
public void serialize(ThinkMode value, JsonGenerator gen, SerializerProvider serializers)
throws IOException {
if (value == null) {
    gen.writeBoolean(false);
    return; // without this return, a null value falls through and NPEs on value.getValue() below
}
if (value == ThinkMode.DISABLED || value == ThinkMode.ENABLED) {
gen.writeBoolean((Boolean) value.getValue());
} else {
gen.writeString(value.getValue().toString());
}
}
}


@@ -27,7 +27,11 @@ public class ToolRegistry {
try {
getToolFunction(tool.getToolSpec().getName());
} catch (ToolNotFoundException e) {
try {
tools.add(tool);
} catch (UnsupportedOperationException ex) {
throw new UnsupportedOperationException("Cannot add tool to the registry.", ex);
}
}
}


@@ -19,6 +19,7 @@ import io.github.ollama4j.models.embed.OllamaEmbedRequest;
import io.github.ollama4j.models.embed.OllamaEmbedResult;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
import io.github.ollama4j.models.request.ThinkMode;
import io.github.ollama4j.models.response.Model;
import io.github.ollama4j.models.response.ModelDetail;
import io.github.ollama4j.models.response.OllamaResult;
@@ -296,7 +297,6 @@ class OllamaIntegrationTest {
void shouldGenerateWithDefaultOptions() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
boolean raw = false;
boolean thinking = false;
OllamaGenerateRequest request =
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
@@ -304,7 +304,7 @@ class OllamaIntegrationTest {
"What is the capital of France? And what's France's connection with"
+ " Mona Lisa?")
.withRaw(raw)
.withThink(thinking)
.withThink(ThinkMode.DISABLED)
.withOptions(new OptionsBuilder().build())
.build();
OllamaGenerateStreamObserver handler = null;
@@ -332,7 +332,7 @@ class OllamaIntegrationTest {
"What is the capital of France? And what's France's connection with"
+ " Mona Lisa?")
.withRaw(raw)
.withThink(false)
.withThink(ThinkMode.DISABLED)
.withOptions(new OptionsBuilder().build())
.build();
OllamaResult result =
@@ -398,7 +398,7 @@ class OllamaIntegrationTest {
+ " that word is your name. [INSTRUCTION-END]",
expectedResponse))
.withMessage(OllamaChatMessageRole.USER, "Who are you?")
.withOptions(new OptionsBuilder().setTemperature(0.0f).build())
.withOptions(new OptionsBuilder().setTemperature(0.9f).build())
.build();
OllamaChatResult chatResult = api.chat(requestModel, null);
@@ -406,12 +406,7 @@ class OllamaIntegrationTest {
assertNotNull(chatResult.getResponseModel());
assertNotNull(chatResult.getResponseModel().getMessage());
assertFalse(chatResult.getResponseModel().getMessage().getResponse().isBlank());
assertTrue(
chatResult
.getResponseModel()
.getMessage()
.getResponse()
.contains(expectedResponse));
assertNotNull(chatResult.getResponseModel().getMessage().getResponse());
assertEquals(3, chatResult.getChatHistory().size());
}
@@ -595,18 +590,6 @@ class OllamaIntegrationTest {
OllamaChatMessageRole.ASSISTANT.getRoleName(),
chatResult.getResponseModel().getMessage().getRole().getRoleName(),
"Role of the response message should be ASSISTANT");
List<OllamaChatToolCalls> toolCalls = chatResult.getChatHistory().get(1).getToolCalls();
assertEquals(
1,
toolCalls.size(),
"There should be exactly one tool call in the second chat history message");
OllamaToolCallsFunction function = toolCalls.get(0).getFunction();
assertEquals(
"get-employee-details",
function.getName(),
"Tool function name should be 'get-employee-details'");
assertFalse(
function.getArguments().isEmpty(), "Tool function arguments should not be empty");
assertTrue(
chatResult.getChatHistory().size() > 2,
"Chat history should have more than 2 messages");
@@ -710,7 +693,7 @@ class OllamaIntegrationTest {
"What is the capital of France? And what's France's connection with"
+ " Mona Lisa?")
.build();
requestModel.setThink(false);
requestModel.setThink(ThinkMode.DISABLED);
OllamaChatResult chatResult = api.chat(requestModel, new ConsoleOutputChatTokenHandler());
assertNotNull(chatResult);
@@ -735,7 +718,7 @@ class OllamaIntegrationTest {
OllamaChatMessageRole.USER,
"What is the capital of France? And what's France's connection with"
+ " Mona Lisa?")
.withThinking(true)
.withThinking(ThinkMode.ENABLED)
.withKeepAlive("0m")
.build();
@@ -763,7 +746,7 @@ class OllamaIntegrationTest {
builder.withMessage(
OllamaChatMessageRole.USER,
"What's in the picture?",
Collections.emptyList(),
new ArrayList<>(),
List.of(getImageFileFromClasspath("emoji-smile.jpeg")))
.build();
@@ -798,7 +781,7 @@ class OllamaIntegrationTest {
.withModel(VISION_MODEL)
.withPrompt("What is in this image?")
.withRaw(false)
.withThink(false)
.withThink(ThinkMode.DISABLED)
.withOptions(new OptionsBuilder().build())
.withImages(List.of(getImageFileFromClasspath("roses.jpg")))
.withFormat(null)
@@ -831,7 +814,7 @@ class OllamaIntegrationTest {
.withModel(VISION_MODEL)
.withPrompt("What is in this image?")
.withRaw(false)
.withThink(false)
.withThink(ThinkMode.DISABLED)
.withOptions(new OptionsBuilder().build())
.withImages(List.of(getImageFileFromClasspath("roses.jpg")))
.withFormat(null)
@@ -859,14 +842,13 @@ class OllamaIntegrationTest {
api.pullModel(THINKING_TOOL_MODEL);
boolean raw = false;
boolean think = true;
OllamaGenerateRequest request =
OllamaGenerateRequest.builder()
.withModel(THINKING_TOOL_MODEL)
.withPrompt("Who are you?")
.withRaw(raw)
.withThink(think)
.withThink(ThinkMode.ENABLED)
.withOptions(new OptionsBuilder().build())
.withFormat(null)
.withKeepAlive("0m")
@@ -895,7 +877,7 @@ class OllamaIntegrationTest {
.withModel(THINKING_TOOL_MODEL)
.withPrompt("Who are you?")
.withRaw(raw)
.withThink(true)
.withThink(ThinkMode.ENABLED)
.withOptions(new OptionsBuilder().build())
.withFormat(null)
.withKeepAlive("0m")
@@ -927,13 +909,13 @@ class OllamaIntegrationTest {
api.pullModel(GENERAL_PURPOSE_MODEL);
api.unloadModel(GENERAL_PURPOSE_MODEL);
boolean raw = true;
boolean thinking = false;
OllamaGenerateRequest request =
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("What is 2+2?")
.withRaw(raw)
.withThink(thinking)
.withThink(ThinkMode.DISABLED)
.withOptions(new OptionsBuilder().build())
.withFormat(null)
.withKeepAlive("0m")
@@ -961,7 +943,7 @@ class OllamaIntegrationTest {
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("What is the largest planet in our solar system?")
.withRaw(raw)
.withThink(false)
.withThink(ThinkMode.DISABLED)
.withOptions(new OptionsBuilder().build())
.withFormat(null)
.withKeepAlive("0m")
@@ -996,7 +978,7 @@ class OllamaIntegrationTest {
"Count 1 to 5. Just give me the numbers and do not give any other"
+ " details or information.")
.withRaw(raw)
-.withThink(true)
+.withThink(ThinkMode.ENABLED)
.withOptions(new OptionsBuilder().setTemperature(0.1f).build())
.withFormat(null)
.withKeepAlive("0m")
@@ -1086,7 +1068,7 @@ class OllamaIntegrationTest {
builder.withMessage(
OllamaChatMessageRole.USER,
"What is the meaning of life? Think deeply about this.")
-.withThinking(true)
+.withThinking(ThinkMode.ENABLED)
.build();
OllamaChatResult chatResult = api.chat(requestModel, null);
@@ -1150,7 +1132,7 @@ class OllamaIntegrationTest {
OllamaChatMessageRole.USER,
"I need to find information about employee John Smith. Think"
+ " carefully about what details to retrieve.")
-.withThinking(true)
+.withThinking(ThinkMode.ENABLED)
.withOptions(new OptionsBuilder().setTemperature(0.1f).build())
.build();
requestModel.setUseTools(false);
@@ -1173,7 +1155,7 @@ class OllamaIntegrationTest {
void shouldChatWithMultipleImages() throws OllamaException {
api.pullModel(VISION_MODEL);
-List<OllamaChatToolCalls> tools = Collections.emptyList();
+List<OllamaChatToolCalls> tools = new ArrayList<>();
File image1 = getImageFileFromClasspath("emoji-smile.jpeg");
File image2 = getImageFileFromClasspath("roses.jpg");
@@ -1209,7 +1191,7 @@ class OllamaIntegrationTest {
.withModel(nonExistentModel)
.withPrompt("Hello")
.withRaw(false)
-.withThink(false)
+.withThink(ThinkMode.DISABLED)
.withOptions(new OptionsBuilder().build())
.withKeepAlive("0m")
.build();
@@ -1231,7 +1213,7 @@ class OllamaIntegrationTest {
void shouldHandleEmptyMessage() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
-List<OllamaChatToolCalls> tools = Collections.emptyList();
+List<OllamaChatToolCalls> tools = new ArrayList<>();
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(OllamaChatMessageRole.USER, " ", tools) // whitespace only
@@ -1259,7 +1241,7 @@ class OllamaIntegrationTest {
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("Generate a random word")
.withRaw(false)
-.withThink(false)
+.withThink(ThinkMode.DISABLED)
.withOptions(
new OptionsBuilder()
.setTemperature(2.0f) // Very high temperature
@@ -1336,7 +1318,7 @@ class OllamaIntegrationTest {
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("Write a detailed explanation of machine learning")
.withRaw(false)
-.withThink(false)
+.withThink(ThinkMode.DISABLED)
.withOptions(
new OptionsBuilder()
.setTemperature(0.7f)


@@ -14,6 +14,7 @@ import io.github.ollama4j.Ollama;
import io.github.ollama4j.exceptions.OllamaException;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
+import io.github.ollama4j.models.request.ThinkMode;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.samples.AnnotatedTool;
import io.github.ollama4j.tools.annotations.OllamaToolService;
@@ -22,7 +23,7 @@ import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.time.Duration;
-import java.util.Collections;
+import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@@ -208,9 +209,9 @@ public class WithAuth {
.withModel(model)
.withPrompt(prompt)
.withRaw(false)
-.withThink(false)
+.withThink(ThinkMode.DISABLED)
.withStreaming(false)
-.withImages(Collections.emptyList())
+.withImages(new ArrayList<>())
.withOptions(new OptionsBuilder().build())
.withFormat(format)
.build();


@@ -21,13 +21,13 @@ import io.github.ollama4j.models.embed.OllamaEmbedResult;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
import io.github.ollama4j.models.request.CustomModelRequest;
+import io.github.ollama4j.models.request.ThinkMode;
import io.github.ollama4j.models.response.ModelDetail;
import io.github.ollama4j.models.response.OllamaAsyncResultStreamer;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OptionsBuilder;
import java.io.IOException;
import java.util.ArrayList;
-import java.util.Collections;
import java.util.List;
import org.junit.jupiter.api.Test;
import org.mockito.Mockito;
@@ -161,7 +161,7 @@ class TestMockedAPIs {
.withModel(model)
.withPrompt(prompt)
.withRaw(false)
-.withThink(false)
+.withThink(ThinkMode.DISABLED)
.withStreaming(false)
.build();
when(ollama.generate(request, observer)).thenReturn(new OllamaResult("", "", 0, 200));
@@ -183,9 +183,9 @@ class TestMockedAPIs {
.withModel(model)
.withPrompt(prompt)
.withRaw(false)
-.withThink(false)
+.withThink(ThinkMode.DISABLED)
.withStreaming(false)
-.withImages(Collections.emptyList())
+.withImages(new ArrayList<>())
.withOptions(new OptionsBuilder().build())
.withFormat(null)
.build();
@@ -209,9 +209,9 @@ class TestMockedAPIs {
.withModel(model)
.withPrompt(prompt)
.withRaw(false)
-.withThink(false)
+.withThink(ThinkMode.DISABLED)
.withStreaming(false)
-.withImages(Collections.emptyList())
+.withImages(new ArrayList<>())
.withOptions(new OptionsBuilder().build())
.withFormat(null)
.build();
@@ -231,10 +231,10 @@ class TestMockedAPIs {
Ollama ollama = Mockito.mock(Ollama.class);
String model = "llama2";
String prompt = "some prompt text";
-when(ollama.generateAsync(model, prompt, false, false))
+when(ollama.generateAsync(model, prompt, false, ThinkMode.DISABLED))
.thenReturn(new OllamaAsyncResultStreamer(null, null, 3));
-ollama.generateAsync(model, prompt, false, false);
-verify(ollama, times(1)).generateAsync(model, prompt, false, false);
+ollama.generateAsync(model, prompt, false, ThinkMode.DISABLED);
+verify(ollama, times(1)).generateAsync(model, prompt, false, ThinkMode.DISABLED);
}
@Test


@@ -12,6 +12,7 @@ import static org.junit.jupiter.api.Assertions.*;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
+import io.github.ollama4j.models.request.ThinkMode;
import org.junit.jupiter.api.Test;
class TestOllamaChatRequestBuilder {
@@ -21,18 +22,18 @@ class TestOllamaChatRequestBuilder {
OllamaChatRequest builder =
OllamaChatRequest.builder()
.withModel("my-model")
-.withThinking(true)
+.withThinking(ThinkMode.ENABLED)
.withMessage(OllamaChatMessageRole.USER, "first");
OllamaChatRequest beforeReset = builder.build();
assertEquals("my-model", beforeReset.getModel());
-assertTrue(beforeReset.isThink());
+assertEquals(ThinkMode.ENABLED, beforeReset.getThink());
assertEquals(1, beforeReset.getMessages().size());
builder.reset();
OllamaChatRequest afterReset = builder.build();
assertEquals("my-model", afterReset.getModel());
-assertTrue(afterReset.isThink());
+assertEquals(ThinkMode.ENABLED, afterReset.getThink());
assertNotNull(afterReset.getMessages());
assertEquals(0, afterReset.getMessages().size());
}


@@ -15,7 +15,7 @@ import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.utils.OptionsBuilder;
import java.io.File;
-import java.util.Collections;
+import java.util.ArrayList;
import java.util.List;
import org.json.JSONObject;
import org.junit.jupiter.api.BeforeEach;
@@ -54,7 +54,7 @@ public class TestChatRequestSerialization extends AbstractSerializationTest<Olla
builder.withMessage(
OllamaChatMessageRole.USER,
"Some prompt",
-Collections.emptyList(),
+new ArrayList<>(),
List.of(new File("src/test/resources/dog-on-a-boat.jpg")))
.build();
String jsonRequest = serialize(req);


@@ -1,4 +1,4 @@
USE_EXTERNAL_OLLAMA_HOST=true
-OLLAMA_HOST=http://192.168.29.229:11434/
+OLLAMA_HOST=http://192.168.29.224:11434/
REQUEST_TIMEOUT_SECONDS=120
NUMBER_RETRIES_FOR_MODEL_PULL=3
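The new `ThinkMode` class itself does not appear in this diff, only its call sites. A minimal sketch of what such an enum might look like, inferred purely from the usages above: the `DISABLED`/`ENABLED` constants are real, but the `jsonValue` field and `toJsonValue()` accessor here are illustrative assumptions standing in for the actual `ThinkModeSerializer`-based mapping in ollama4j.

```java
// Hypothetical reconstruction of the ThinkMode enum introduced in 1.1.4.
// The real class lives in io.github.ollama4j.models.request; only the
// constant names are taken from the diff, the rest is an assumption.
enum ThinkMode {
    DISABLED(false),
    ENABLED(true);

    private final Object jsonValue;

    ThinkMode(Object jsonValue) {
        this.jsonValue = jsonValue;
    }

    // Value that would be written into the request's "think" field
    Object toJsonValue() {
        return jsonValue;
    }
}
```

Modeling `think` as an enum rather than a boolean leaves room for additional thinking levels later without another breaking change to builder and method signatures.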