Compare commits

...

31 Commits

Author SHA1 Message Date
snyk-bot
ceb1e6f338 fix: upgrade org.projectlombok:lombok from 1.18.40 to 1.18.42
Snyk has created this PR to upgrade org.projectlombok:lombok from 1.18.40 to 1.18.42.

See this package in maven:
org.projectlombok:lombok

See this project in Snyk:
https://app.snyk.io/org/koujalgi.amith/project/e4176cf5-c6db-4650-af21-3778aa308d33?utm_source=github&utm_medium=referral&page=upgrade-pr
2025-10-11 08:36:43 +00:00
Amith Koujalgi
64c629775a Refactor OllamaChatRequest and OllamaGenerateRequest to remove builder classes, implement builder-like methods directly in the request classes, and enhance request handling with additional options and image support. Update integration tests to reflect these changes.
2025-10-07 18:33:22 +05:30
Amith Koujalgi
46f2d62fed Merge pull request #208 from ollama4j/dependabot/npm_and_yarn/docs/react-dom-19.2.0
Bump react-dom from 19.1.1 to 19.2.0 in /docs
2025-10-06 23:28:31 +05:30
dependabot[bot]
b91943066e Bump react-dom from 19.1.1 to 19.2.0 in /docs
Bumps [react-dom](https://github.com/facebook/react/tree/HEAD/packages/react-dom) from 19.1.1 to 19.2.0.
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.2.0/packages/react-dom)

---
updated-dependencies:
- dependency-name: react-dom
  dependency-version: 19.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-06 17:57:36 +00:00
Amith Koujalgi
58d73637bb Merge pull request #207 from ollama4j/dependabot/npm_and_yarn/docs/react-19.2.0
Bump react from 19.1.1 to 19.2.0 in /docs
2025-10-06 23:26:19 +05:30
Amith Koujalgi
0ffaac65d4 Add new logo 2025-10-06 23:25:10 +05:30
Amith Koujalgi
4ce9c4c191 Add new logo 2025-10-06 23:23:42 +05:30
dependabot[bot]
4681b1986f Bump react from 19.1.1 to 19.2.0 in /docs
Bumps [react](https://github.com/facebook/react/tree/HEAD/packages/react) from 19.1.1 to 19.2.0.
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.2.0/packages/react)

---
updated-dependencies:
- dependency-name: react
  dependency-version: 19.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-06 00:36:12 +00:00
Amith Koujalgi
89d42fd469 Merge pull request #206 from ollama4j/docs
Update metrics.md
2025-10-01 01:24:33 +05:30
amithkoujalgi
8a903f695e Update metrics.md 2025-10-01 01:24:03 +05:30
amithkoujalgi
3a20af25f1 Merge branch 'main' of https://github.com/ollama4j/ollama4j 2025-10-01 01:16:50 +05:30
Amith Koujalgi
24046b6660 Merge pull request #205 from ollama4j/refactor
Add metrics documentation for Ollama4j library
2025-10-01 01:15:45 +05:30
Amith Koujalgi
42a0034728 Merge pull request #203 from ollama4j/dependabot/npm_and_yarn/docs/docusaurus/preset-classic-3.9.1
Bump @docusaurus/preset-classic from 3.9.0 to 3.9.1 in /docs
2025-09-29 23:34:32 +05:30
dependabot[bot]
d8d660be8d Bump @docusaurus/preset-classic from 3.9.0 to 3.9.1 in /docs
Bumps [@docusaurus/preset-classic](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-preset-classic) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/facebook/docusaurus/releases)
- [Changelog](https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/docusaurus/commits/v3.9.1/packages/docusaurus-preset-classic)

---
updated-dependencies:
- dependency-name: "@docusaurus/preset-classic"
  dependency-version: 3.9.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 18:02:34 +00:00
Amith Koujalgi
c5a2d583c7 Merge pull request #202 from ollama4j/dependabot/npm_and_yarn/docs/docusaurus/theme-mermaid-3.9.1
Bump @docusaurus/theme-mermaid from 3.9.0 to 3.9.1 in /docs
2025-09-29 23:30:55 +05:30
dependabot[bot]
cd656264cf Bump @docusaurus/theme-mermaid from 3.9.0 to 3.9.1 in /docs
Bumps [@docusaurus/theme-mermaid](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-theme-mermaid) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/facebook/docusaurus/releases)
- [Changelog](https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/docusaurus/commits/v3.9.1/packages/docusaurus-theme-mermaid)

---
updated-dependencies:
- dependency-name: "@docusaurus/theme-mermaid"
  dependency-version: 3.9.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 14:24:20 +00:00
Amith Koujalgi
272c8204c7 Merge pull request #201 from ollama4j/dependabot/maven/org.jacoco-jacoco-maven-plugin-0.8.13
Bump org.jacoco:jacoco-maven-plugin from 0.8.7 to 0.8.13
2025-09-29 19:05:43 +05:30
Amith Koujalgi
23c7321b63 Merge pull request #200 from ollama4j/dependabot/maven/org.apache.maven.plugins-maven-compiler-plugin-3.14.1
Bump org.apache.maven.plugins:maven-compiler-plugin from 3.14.0 to 3.14.1
2025-09-29 19:05:27 +05:30
Amith Koujalgi
e24a38f89f Merge pull request #199 from ollama4j/dependabot/npm_and_yarn/docs/docusaurus/plugin-google-gtag-3.9.1
Bump @docusaurus/plugin-google-gtag from 3.9.0 to 3.9.1 in /docs
2025-09-29 19:05:15 +05:30
Amith Koujalgi
5847cfc94c Merge pull request #198 from ollama4j/dependabot/maven/io.github.git-commit-id-git-commit-id-maven-plugin-9.0.2
Bump io.github.git-commit-id:git-commit-id-maven-plugin from 9.0.1 to 9.0.2
2025-09-29 19:05:02 +05:30
Amith Koujalgi
05d5958307 Merge pull request #197 from ollama4j/dependabot/maven/org.mockito-mockito-core-5.20.0
Bump org.mockito:mockito-core from 4.1.0 to 5.20.0
2025-09-29 19:04:48 +05:30
Amith Koujalgi
ffa81cb7df Merge pull request #196 from ollama4j/dependabot/npm_and_yarn/docs/docusaurus/module-type-aliases-3.9.1
Bump @docusaurus/module-type-aliases from 3.9.0 to 3.9.1 in /docs
2025-09-29 19:04:22 +05:30
Amith Koujalgi
bd231e639d Merge pull request #195 from ollama4j/dependabot/maven/com.diffplug.spotless-spotless-maven-plugin-3.0.0
Bump com.diffplug.spotless:spotless-maven-plugin from 2.46.1 to 3.0.0
2025-09-29 19:04:01 +05:30
Amith Koujalgi
73a0a48eab Merge pull request #194 from ollama4j/refactor
Refactor OllamaAPI to Ollama class and update documentation
2025-09-29 14:12:34 +05:30
dependabot[bot]
b347faff83 Bump org.jacoco:jacoco-maven-plugin from 0.8.7 to 0.8.13
Bumps [org.jacoco:jacoco-maven-plugin](https://github.com/jacoco/jacoco) from 0.8.7 to 0.8.13.
- [Release notes](https://github.com/jacoco/jacoco/releases)
- [Commits](https://github.com/jacoco/jacoco/compare/v0.8.7...v0.8.13)

---
updated-dependencies:
- dependency-name: org.jacoco:jacoco-maven-plugin
  dependency-version: 0.8.13
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:39:08 +00:00
dependabot[bot]
16e14ddd1c Bump org.apache.maven.plugins:maven-compiler-plugin
Bumps [org.apache.maven.plugins:maven-compiler-plugin](https://github.com/apache/maven-compiler-plugin) from 3.14.0 to 3.14.1.
- [Release notes](https://github.com/apache/maven-compiler-plugin/releases)
- [Commits](https://github.com/apache/maven-compiler-plugin/compare/maven-compiler-plugin-3.14.0...maven-compiler-plugin-3.14.1)

---
updated-dependencies:
- dependency-name: org.apache.maven.plugins:maven-compiler-plugin
  dependency-version: 3.14.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:39:07 +00:00
dependabot[bot]
2117d42f60 Bump @docusaurus/plugin-google-gtag from 3.9.0 to 3.9.1 in /docs
Bumps [@docusaurus/plugin-google-gtag](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/facebook/docusaurus/releases)
- [Changelog](https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/docusaurus/commits/v3.9.1/packages/docusaurus-plugin-google-gtag)

---
updated-dependencies:
- dependency-name: "@docusaurus/plugin-google-gtag"
  dependency-version: 3.9.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:59 +00:00
dependabot[bot]
79b8dbaefd Bump io.github.git-commit-id:git-commit-id-maven-plugin
Bumps [io.github.git-commit-id:git-commit-id-maven-plugin](https://github.com/git-commit-id/git-commit-id-maven-plugin) from 9.0.1 to 9.0.2.
- [Release notes](https://github.com/git-commit-id/git-commit-id-maven-plugin/releases)
- [Commits](https://github.com/git-commit-id/git-commit-id-maven-plugin/compare/v9.0.1...v9.0.2)

---
updated-dependencies:
- dependency-name: io.github.git-commit-id:git-commit-id-maven-plugin
  dependency-version: 9.0.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:48 +00:00
dependabot[bot]
04f5c28052 Bump org.mockito:mockito-core from 4.1.0 to 5.20.0
Bumps [org.mockito:mockito-core](https://github.com/mockito/mockito) from 4.1.0 to 5.20.0.
- [Release notes](https://github.com/mockito/mockito/releases)
- [Commits](https://github.com/mockito/mockito/compare/v4.1.0...v5.20.0)

---
updated-dependencies:
- dependency-name: org.mockito:mockito-core
  dependency-version: 5.20.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:48 +00:00
dependabot[bot]
a4da036389 Bump @docusaurus/module-type-aliases from 3.9.0 to 3.9.1 in /docs
Bumps [@docusaurus/module-type-aliases](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/facebook/docusaurus/releases)
- [Changelog](https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/docusaurus/commits/v3.9.1/packages/docusaurus-module-type-aliases)

---
updated-dependencies:
- dependency-name: "@docusaurus/module-type-aliases"
  dependency-version: 3.9.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:34 +00:00
dependabot[bot]
a73cf015d6 Bump com.diffplug.spotless:spotless-maven-plugin from 2.46.1 to 3.0.0
Bumps [com.diffplug.spotless:spotless-maven-plugin](https://github.com/diffplug/spotless) from 2.46.1 to 3.0.0.
- [Release notes](https://github.com/diffplug/spotless/releases)
- [Changelog](https://github.com/diffplug/spotless/blob/main/CHANGES.md)
- [Commits](https://github.com/diffplug/spotless/compare/maven/2.46.1...lib/3.0.0)

---
updated-dependencies:
- dependency-name: com.diffplug.spotless:spotless-maven-plugin
  dependency-version: 3.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 00:38:19 +00:00
18 changed files with 742 additions and 687 deletions

View File

@@ -1,5 +1,5 @@
<div align="center">
<img src='https://raw.githubusercontent.com/ollama4j/ollama4j/65a9d526150da8fcd98e2af6a164f055572bf722/ollama4j.jpeg' width='100' alt="ollama4j-icon">
<img src='https://raw.githubusercontent.com/ollama4j/ollama4j/refs/heads/main/ollama4j-new.jpeg' width='200' alt="ollama4j-icon">
### Ollama4j

View File

@@ -55,7 +55,7 @@ metrics via `/metrics` endpoint:
</dependency>
```
Here is a sample code snippet demonstrating how to retrieve and print metrics:
Here is a sample code snippet demonstrating how to retrieve and print metrics on Grafana:
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/MetricsExample.java" />
@@ -64,8 +64,27 @@ at: http://localhost:8080/metrics
## Integrating with Monitoring Tools
To integrate Ollama4j metrics with external monitoring systems, you can export the metrics endpoint and configure your
monitoring tool to scrape or collect the data. Refer to the [integration guide](../integration/monitoring.md) for
detailed instructions.
### Grafana
For more information on customizing and extending metrics, see the [API documentation](../api/metrics.md).
Use the following sample `docker-compose` file to host a basic Grafana container.
<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/docker/docker-compose.yml" />
And run:
```shell
docker-compose -f path/to/your/docker-compose.yml up
```
This starts Grafana at http://localhost:3000
[//]: # (To integrate Ollama4j metrics with external monitoring systems, you can export the metrics endpoint and configure your)
[//]: # (monitoring tool to scrape or collect the data. Refer to the [integration guide]&#40;../integration/monitoring.md&#41; for)
[//]: # (detailed instructions.)
[//]: # ()
[//]: # (For more information on customizing and extending metrics, see the [API documentation]&#40;../api/metrics.md&#41;.)
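The MetricsExample.java and docker-compose.yml referenced by the CodeEmbed tags above live in the ollama4j-examples repository and are not part of this diff. Purely as a hedged illustration, the sketch below shows one way to fetch and print whatever the exposed endpoint returns, assuming it is served as plain text at http://localhost:8080/metrics as the hunk header above indicates; it is not the example the documentation embeds.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class MetricsScrapeSketch {
    public static void main(String[] args) throws Exception {
        // Assumption: the Ollama4j metrics endpoint is exposed at this URL (see the hunk above).
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/metrics"))
                .GET()
                .build();
        // The endpoint returns plain-text metrics; print them for inspection.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```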

docs/package-lock.json (generated): 656 lines changed; file diff suppressed because it is too large.

View File

@@ -16,21 +16,21 @@
"dependencies": {
"@docsearch/js": "^4.1.0",
"@docusaurus/core": "^3.9.0",
"@docusaurus/plugin-google-gtag": "^3.8.1",
"@docusaurus/preset-classic": "^3.9.0",
"@docusaurus/theme-mermaid": "^3.9.0",
"@docusaurus/plugin-google-gtag": "^3.9.1",
"@docusaurus/preset-classic": "^3.9.1",
"@docusaurus/theme-mermaid": "^3.9.1",
"@iconify/react": "^6.0.2",
"@mdx-js/react": "^3.1.1",
"clsx": "^2.1.1",
"font-awesome": "^4.7.0",
"prism-react-renderer": "^2.4.1",
"react": "^19.1.1",
"react-dom": "^19.1.1",
"react": "^19.2.0",
"react-dom": "^19.2.0",
"react-icons": "^5.5.0",
"react-image-gallery": "^1.4.0"
},
"devDependencies": {
"@docusaurus/module-type-aliases": "^3.8.1",
"@docusaurus/module-type-aliases": "^3.9.1",
"@docusaurus/types": "^3.4.0"
},
"browserslist": {

ollama4j-new.jpeg (new binary file, 67 KiB): binary file not shown.

pom.xml (14 lines changed)
View File

@@ -19,7 +19,7 @@
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven-surefire-plugin.version>3.5.4</maven-surefire-plugin.version>
<maven-failsafe-plugin.version>3.5.4</maven-failsafe-plugin.version>
<lombok.version>1.18.40</lombok.version>
<lombok.version>1.18.42</lombok.version>
</properties>
<developers>
@@ -150,7 +150,7 @@
<plugin>
<groupId>io.github.git-commit-id</groupId>
<artifactId>git-commit-id-maven-plugin</artifactId>
<version>9.0.1</version>
<version>9.0.2</version>
<executions>
<execution>
<goals>
@@ -167,7 +167,7 @@
<plugin>
<groupId>com.diffplug.spotless</groupId>
<artifactId>spotless-maven-plugin</artifactId>
<version>2.46.1</version>
<version>3.0.0</version>
<configuration>
<formats>
<!-- you can define as many formats as you want, each is independent -->
@@ -232,7 +232,7 @@
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.14.0</version>
<version>3.14.1</version>
</plugin>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
@@ -285,7 +285,7 @@
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>4.1.0</version>
<version>5.20.0</version>
<scope>test</scope>
</dependency>
<dependency>
@@ -371,7 +371,7 @@
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>0.8.11</version>
<version>0.8.13</version>
<executions>
<execution>
<goals>
@@ -482,7 +482,7 @@
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>0.8.7</version>
<version>0.8.13</version>
<executions>
<execution>
<goals>

View File

@@ -70,10 +70,14 @@ public class Ollama {
*/
@Setter private long requestTimeoutSeconds = 10;
/** The read timeout in seconds for image URLs. */
/**
* The read timeout in seconds for image URLs.
*/
@Setter private int imageURLReadTimeoutSeconds = 10;
/** The connect timeout in seconds for image URLs. */
/**
* The connect timeout in seconds for image URLs.
*/
@Setter private int imageURLConnectTimeoutSeconds = 10;
/**
@@ -280,9 +284,9 @@ public class Ollama {
/**
* Handles retry backoff for pullModel.
*
* @param modelName the name of the model being pulled
* @param currentRetry the current retry attempt (zero-based)
* @param maxRetries the maximum number of retries allowed
* @param modelName the name of the model being pulled
* @param currentRetry the current retry attempt (zero-based)
* @param maxRetries the maximum number of retries allowed
* @param baseDelayMillis the base delay in milliseconds for exponential backoff
* @throws InterruptedException if the thread is interrupted during sleep
*/
@@ -376,7 +380,7 @@ public class Ollama {
* Returns true if the response indicates a successful pull.
*
* @param modelPullResponse the response from the model pull
* @param modelName the name of the model
* @param modelName the name of the model
* @return true if the pull was successful, false otherwise
* @throws OllamaException if the response contains an error
*/
@@ -601,7 +605,7 @@ public class Ollama {
/**
* Deletes a model from the Ollama server.
*
* @param modelName the name of the model to be deleted
* @param modelName the name of the model to be deleted
* @param ignoreIfNotPresent ignore errors if the specified model is not present on the Ollama server
* @throws OllamaException if the response indicates an error status
*/
@@ -758,7 +762,7 @@ public class Ollama {
* Generates a response from a model using the specified parameters and stream observer.
* If {@code streamObserver} is provided, streaming is enabled; otherwise, a synchronous call is made.
*
* @param request the generation request
* @param request the generation request
* @param streamObserver the stream observer for streaming responses, or null for synchronous
* @return the result of the generation
* @throws OllamaException if the request fails
@@ -823,10 +827,10 @@ public class Ollama {
/**
* Generates a response from a model asynchronously, returning a streamer for results.
*
* @param model the model name
* @param model the model name
* @param prompt the prompt to send
* @param raw whether to use raw mode
* @param think whether to use "think" mode
* @param raw whether to use raw mode
* @param think whether to use "think" mode
* @return an OllamaAsyncResultStreamer for streaming results
* @throws OllamaException if the request fails
*/
@@ -861,9 +865,9 @@ public class Ollama {
*
* <p>Note: the OllamaChatRequestModel#getStream() property is not implemented.
*
* @param request request object to be sent to the server
* @param request request object to be sent to the server
* @param tokenHandler callback handler to handle the last token from stream (caution: the
* previous tokens from stream will not be concatenated)
* previous tokens from stream will not be concatenated)
* @return {@link OllamaChatResult}
* @throws OllamaException if the response indicates an error status
*/
@@ -958,12 +962,16 @@ public class Ollama {
* Registers multiple tools in the tool registry.
*
* @param tools a list of {@link Tools.Tool} objects to register. Each tool contains its
* specification and function.
* specification and function.
*/
public void registerTools(List<Tools.Tool> tools) {
toolRegistry.addTools(tools);
}
public List<Tools.Tool> getRegisteredTools() {
return toolRegistry.getRegisteredTools();
}
/**
* Deregisters all tools from the tool registry. This method removes all registered tools,
* effectively clearing the registry.
@@ -979,7 +987,7 @@ public class Ollama {
* and recursively registers annotated tools from all the providers specified in the annotation.
*
* @throws OllamaException if the caller's class is not annotated with {@link
* OllamaToolService} or if reflection-based instantiation or invocation fails
* OllamaToolService} or if reflection-based instantiation or invocation fails
*/
public void registerAnnotatedTools() throws OllamaException {
try {
@@ -1127,7 +1135,7 @@ public class Ollama {
* This method synchronously calls the Ollama API. If a stream handler is provided,
* the request will be streamed; otherwise, a regular synchronous request will be made.
*
* @param ollamaRequestModel the request model containing necessary parameters for the Ollama API request
* @param ollamaRequestModel the request model containing necessary parameters for the Ollama API request
* @param thinkingStreamHandler the stream handler for "thinking" tokens, or null if not used
* @param responseStreamHandler the stream handler to process streaming responses, or null for non-streaming requests
* @return the result of the Ollama API request

View File

@@ -11,6 +11,9 @@ package io.github.ollama4j.models.chat;
import io.github.ollama4j.models.request.OllamaCommonRequest;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OllamaRequestBody;
import io.github.ollama4j.utils.Options;
import java.io.File;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import lombok.Getter;
@@ -20,8 +23,8 @@ import lombok.Setter;
* Defines a Request to use against the ollama /api/chat endpoint.
*
* @see <a href=
* "https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion">Generate
* Chat Completion</a>
* "https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion">Generate
* Chat Completion</a>
*/
@Getter
@Setter
@@ -36,11 +39,15 @@ public class OllamaChatRequest extends OllamaCommonRequest implements OllamaRequ
/**
* Controls whether tools are automatically executed.
*
* <p>If set to {@code true} (the default), tools will be automatically used/applied by the
* library. If set to {@code false}, tool calls will be returned to the client for manual
* <p>
* If set to {@code true} (the default), tools will be automatically
* used/applied by the
* library. If set to {@code false}, tool calls will be returned to the client
* for manual
* handling.
*
* <p>Disabling this should be an explicit operation.
* <p>
* Disabling this should be an explicit operation.
*/
private boolean useTools = true;
@@ -57,7 +64,116 @@ public class OllamaChatRequest extends OllamaCommonRequest implements OllamaRequ
if (!(o instanceof OllamaChatRequest)) {
return false;
}
return this.toString().equals(o.toString());
}
// --- Builder-like fluent API methods ---
public static OllamaChatRequest builder() {
OllamaChatRequest req = new OllamaChatRequest();
req.setMessages(new ArrayList<>());
return req;
}
public OllamaChatRequest withModel(String model) {
this.setModel(model);
return this;
}
public OllamaChatRequest withMessage(OllamaChatMessageRole role, String content) {
return withMessage(role, content, Collections.emptyList());
}
public OllamaChatRequest withMessage(
OllamaChatMessageRole role, String content, List<OllamaChatToolCalls> toolCalls) {
if (this.messages == null || this.messages == Collections.EMPTY_LIST) {
this.messages = new ArrayList<>();
}
this.messages.add(new OllamaChatMessage(role, content, null, toolCalls, null));
return this;
}
public OllamaChatRequest withMessage(
OllamaChatMessageRole role,
String content,
List<OllamaChatToolCalls> toolCalls,
List<File> images) {
if (this.messages == null || this.messages == Collections.EMPTY_LIST) {
this.messages = new ArrayList<>();
}
List<byte[]> imagesAsBytes = new ArrayList<>();
if (images != null) {
for (File image : images) {
try {
imagesAsBytes.add(java.nio.file.Files.readAllBytes(image.toPath()));
} catch (java.io.IOException e) {
throw new RuntimeException(
"Failed to read image file: " + image.getAbsolutePath(), e);
}
}
}
this.messages.add(new OllamaChatMessage(role, content, null, toolCalls, imagesAsBytes));
return this;
}
public OllamaChatRequest withMessages(List<OllamaChatMessage> messages) {
this.setMessages(messages);
return this;
}
public OllamaChatRequest withOptions(Options options) {
if (options != null) {
this.setOptions(options.getOptionsMap());
}
return this;
}
public OllamaChatRequest withGetJsonResponse() {
this.setFormat("json");
return this;
}
public OllamaChatRequest withTemplate(String template) {
this.setTemplate(template);
return this;
}
public OllamaChatRequest withStreaming() {
this.setStream(true);
return this;
}
public OllamaChatRequest withKeepAlive(String keepAlive) {
this.setKeepAlive(keepAlive);
return this;
}
public OllamaChatRequest withThinking(boolean think) {
this.setThink(think);
return this;
}
public OllamaChatRequest withUseTools(boolean useTools) {
this.setUseTools(useTools);
return this;
}
public OllamaChatRequest withTools(List<Tools.Tool> tools) {
this.setTools(tools);
return this;
}
public OllamaChatRequest build() {
return this;
}
public void reset() {
// Only clear the messages, keep model and think as is
if (this.messages == null || this.messages == Collections.EMPTY_LIST) {
this.messages = new ArrayList<>();
} else {
this.messages.clear();
}
}
}
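With the standalone OllamaChatRequestBuilder deleted in this change (next file), chat requests are now assembled through the fluent methods added above. A minimal usage sketch, assuming a placeholder model name and following the pattern used in the updated integration tests:

```java
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;

public class ChatRequestSketch {
    public static void main(String[] args) {
        // "my-model" is a placeholder; any locally available Ollama model name works here.
        OllamaChatRequest request = OllamaChatRequest.builder()
                .withModel("my-model")
                .withThinking(true)
                .withMessage(OllamaChatMessageRole.USER, "Hello, how are you?")
                .build();

        // reset() clears only the accumulated messages; model and think settings are kept.
        request.reset();
    }
}
```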

View File

@@ -1,176 +0,0 @@
/*
* Ollama4j - Java library for interacting with Ollama server.
* Copyright (c) 2025 Amith Koujalgi and contributors.
*
* Licensed under the MIT License (the "License");
* you may not use this file except in compliance with the License.
*
*/
package io.github.ollama4j.models.chat;
import io.github.ollama4j.utils.Options;
import io.github.ollama4j.utils.Utils;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
import lombok.Setter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/** Helper class for creating {@link OllamaChatRequest} objects using the builder-pattern. */
public class OllamaChatRequestBuilder {
private static final Logger LOG = LoggerFactory.getLogger(OllamaChatRequestBuilder.class);
private int imageURLConnectTimeoutSeconds = 10;
private int imageURLReadTimeoutSeconds = 10;
private OllamaChatRequest request;
@Setter private boolean useTools = true;
private OllamaChatRequestBuilder() {
request = new OllamaChatRequest();
request.setMessages(new ArrayList<>());
}
public static OllamaChatRequestBuilder builder() {
return new OllamaChatRequestBuilder();
}
public OllamaChatRequestBuilder withImageURLConnectTimeoutSeconds(
int imageURLConnectTimeoutSeconds) {
this.imageURLConnectTimeoutSeconds = imageURLConnectTimeoutSeconds;
return this;
}
public OllamaChatRequestBuilder withImageURLReadTimeoutSeconds(int imageURLReadTimeoutSeconds) {
this.imageURLReadTimeoutSeconds = imageURLReadTimeoutSeconds;
return this;
}
public OllamaChatRequestBuilder withModel(String model) {
request.setModel(model);
return this;
}
public void reset() {
request = new OllamaChatRequest(request.getModel(), request.isThink(), new ArrayList<>());
}
public OllamaChatRequestBuilder withMessage(OllamaChatMessageRole role, String content) {
return withMessage(role, content, Collections.emptyList());
}
public OllamaChatRequestBuilder withMessage(
OllamaChatMessageRole role, String content, List<OllamaChatToolCalls> toolCalls) {
List<OllamaChatMessage> messages = this.request.getMessages();
messages.add(new OllamaChatMessage(role, content, null, toolCalls, null));
return this;
}
public OllamaChatRequestBuilder withMessage(
OllamaChatMessageRole role,
String content,
List<OllamaChatToolCalls> toolCalls,
List<File> images) {
List<OllamaChatMessage> messages = this.request.getMessages();
List<byte[]> binaryImages =
images.stream()
.map(
file -> {
try {
return Files.readAllBytes(file.toPath());
} catch (IOException e) {
LOG.warn(
"File '{}' could not be accessed, will not add to"
+ " message!",
file.toPath(),
e);
return new byte[0];
}
})
.collect(Collectors.toList());
messages.add(new OllamaChatMessage(role, content, null, toolCalls, binaryImages));
return this;
}
public OllamaChatRequestBuilder withMessage(
OllamaChatMessageRole role,
String content,
List<OllamaChatToolCalls> toolCalls,
String... imageUrls)
throws IOException, InterruptedException {
List<OllamaChatMessage> messages = this.request.getMessages();
List<byte[]> binaryImages = null;
if (imageUrls.length > 0) {
binaryImages = new ArrayList<>();
for (String imageUrl : imageUrls) {
try {
binaryImages.add(
Utils.loadImageBytesFromUrl(
imageUrl,
imageURLConnectTimeoutSeconds,
imageURLReadTimeoutSeconds));
} catch (InterruptedException e) {
LOG.error("Failed to load image from URL: '{}'. Cause: {}", imageUrl, e);
Thread.currentThread().interrupt();
throw new InterruptedException(
"Interrupted while loading image from URL: " + imageUrl);
} catch (IOException e) {
LOG.error(
"IOException occurred while loading image from URL '{}'. Cause: {}",
imageUrl,
e.getMessage(),
e);
throw new IOException(
"IOException while loading image from URL: " + imageUrl, e);
}
}
}
messages.add(new OllamaChatMessage(role, content, null, toolCalls, binaryImages));
return this;
}
public OllamaChatRequestBuilder withMessages(List<OllamaChatMessage> messages) {
request.setMessages(messages);
return this;
}
public OllamaChatRequestBuilder withOptions(Options options) {
this.request.setOptions(options.getOptionsMap());
return this;
}
public OllamaChatRequestBuilder withGetJsonResponse() {
this.request.setFormat("json");
return this;
}
public OllamaChatRequestBuilder withTemplate(String template) {
this.request.setTemplate(template);
return this;
}
public OllamaChatRequestBuilder withStreaming() {
this.request.setStream(true);
return this;
}
public OllamaChatRequestBuilder withKeepAlive(String keepAlive) {
this.request.setKeepAlive(keepAlive);
return this;
}
public OllamaChatRequestBuilder withThinking(boolean think) {
this.request.setThink(think);
return this;
}
public OllamaChatRequest build() {
request.setUseTools(useTools);
return request;
}
}

View File

@@ -11,7 +11,14 @@ package io.github.ollama4j.models.generate;
import io.github.ollama4j.models.request.OllamaCommonRequest;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OllamaRequestBody;
import io.github.ollama4j.utils.Options;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;
import java.util.Map;
import lombok.Getter;
import lombok.Setter;
@@ -41,6 +48,100 @@ public class OllamaGenerateRequest extends OllamaCommonRequest implements Ollama
this.images = images;
}
// --- Builder-style methods ---
public static OllamaGenerateRequest builder() {
return new OllamaGenerateRequest();
}
public OllamaGenerateRequest withPrompt(String prompt) {
this.setPrompt(prompt);
return this;
}
public OllamaGenerateRequest withTools(List<Tools.Tool> tools) {
this.setTools(tools);
return this;
}
public OllamaGenerateRequest withModel(String model) {
this.setModel(model);
return this;
}
public OllamaGenerateRequest withGetJsonResponse() {
this.setFormat("json");
return this;
}
public OllamaGenerateRequest withOptions(Options options) {
this.setOptions(options.getOptionsMap());
return this;
}
public OllamaGenerateRequest withTemplate(String template) {
this.setTemplate(template);
return this;
}
public OllamaGenerateRequest withStreaming(boolean streaming) {
this.setStream(streaming);
return this;
}
public OllamaGenerateRequest withKeepAlive(String keepAlive) {
this.setKeepAlive(keepAlive);
return this;
}
public OllamaGenerateRequest withRaw(boolean raw) {
this.setRaw(raw);
return this;
}
public OllamaGenerateRequest withThink(boolean think) {
this.setThink(think);
return this;
}
public OllamaGenerateRequest withUseTools(boolean useTools) {
this.setUseTools(useTools);
return this;
}
public OllamaGenerateRequest withFormat(Map<String, Object> format) {
this.setFormat(format);
return this;
}
public OllamaGenerateRequest withSystem(String system) {
this.setSystem(system);
return this;
}
public OllamaGenerateRequest withContext(String context) {
this.setContext(context);
return this;
}
public OllamaGenerateRequest withImagesBase64(List<String> images) {
this.setImages(images);
return this;
}
public OllamaGenerateRequest withImages(List<File> imageFiles) throws IOException {
List<String> images = new ArrayList<>();
for (File imageFile : imageFiles) {
images.add(Base64.getEncoder().encodeToString(Files.readAllBytes(imageFile.toPath())));
}
this.setImages(images);
return this;
}
public OllamaGenerateRequest build() {
return this;
}
@Override
public boolean equals(Object o) {
if (!(o instanceof OllamaGenerateRequest)) {

View File

@@ -1,121 +0,0 @@
/*
* Ollama4j - Java library for interacting with Ollama server.
* Copyright (c) 2025 Amith Koujalgi and contributors.
*
* Licensed under the MIT License (the "License");
* you may not use this file except in compliance with the License.
*
*/
package io.github.ollama4j.models.generate;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.Options;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;
/** Helper class for creating {@link OllamaGenerateRequest} objects using the builder-pattern. */
public class OllamaGenerateRequestBuilder {
private OllamaGenerateRequestBuilder() {
request = new OllamaGenerateRequest();
}
private OllamaGenerateRequest request;
public static OllamaGenerateRequestBuilder builder() {
return new OllamaGenerateRequestBuilder();
}
public OllamaGenerateRequest build() {
return request;
}
public OllamaGenerateRequestBuilder withPrompt(String prompt) {
request.setPrompt(prompt);
return this;
}
public OllamaGenerateRequestBuilder withTools(List<Tools.Tool> tools) {
request.setTools(tools);
return this;
}
public OllamaGenerateRequestBuilder withModel(String model) {
request.setModel(model);
return this;
}
public OllamaGenerateRequestBuilder withGetJsonResponse() {
this.request.setFormat("json");
return this;
}
public OllamaGenerateRequestBuilder withOptions(Options options) {
this.request.setOptions(options.getOptionsMap());
return this;
}
public OllamaGenerateRequestBuilder withTemplate(String template) {
this.request.setTemplate(template);
return this;
}
public OllamaGenerateRequestBuilder withStreaming(boolean streaming) {
this.request.setStream(streaming);
return this;
}
public OllamaGenerateRequestBuilder withKeepAlive(String keepAlive) {
this.request.setKeepAlive(keepAlive);
return this;
}
public OllamaGenerateRequestBuilder withRaw(boolean raw) {
this.request.setRaw(raw);
return this;
}
public OllamaGenerateRequestBuilder withThink(boolean think) {
this.request.setThink(think);
return this;
}
public OllamaGenerateRequestBuilder withUseTools(boolean useTools) {
this.request.setUseTools(useTools);
return this;
}
public OllamaGenerateRequestBuilder withFormat(java.util.Map<String, Object> format) {
this.request.setFormat(format);
return this;
}
public OllamaGenerateRequestBuilder withSystem(String system) {
this.request.setSystem(system);
return this;
}
public OllamaGenerateRequestBuilder withContext(String context) {
this.request.setContext(context);
return this;
}
public OllamaGenerateRequestBuilder withImagesBase64(java.util.List<String> images) {
this.request.setImages(images);
return this;
}
public OllamaGenerateRequestBuilder withImages(java.util.List<File> imageFiles)
throws IOException {
java.util.List<String> images = new ArrayList<>();
for (File imageFile : imageFiles) {
images.add(Base64.getEncoder().encodeToString(Files.readAllBytes(imageFile.toPath())));
}
this.request.setImages(images);
return this;
}
}

View File

@@ -96,7 +96,6 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
getRequestBuilderDefault(uri).POST(body.getBodyPublisher());
HttpRequest request = requestBuilder.build();
LOG.debug("Asking model: {}", body);
System.out.println("Asking model: " + Utils.toJSON(body));
HttpResponse<InputStream> response =
httpClient.send(request, HttpResponse.BodyHandlers.ofInputStream());

View File

@@ -18,7 +18,6 @@ import io.github.ollama4j.models.chat.*;
import io.github.ollama4j.models.embed.OllamaEmbedRequest;
import io.github.ollama4j.models.embed.OllamaEmbedResult;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateRequestBuilder;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
import io.github.ollama4j.models.response.Model;
import io.github.ollama4j.models.response.ModelDetail;
@@ -272,7 +271,7 @@ class OllamaIntegrationTest {
format.put("required", List.of("isNoon"));
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(TOOLS_MODEL)
.withPrompt(prompt)
.withFormat(format)
@@ -299,7 +298,7 @@ class OllamaIntegrationTest {
boolean raw = false;
boolean thinking = false;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt(
"What is the capital of France? And what's France's connection with"
@@ -327,7 +326,7 @@ class OllamaIntegrationTest {
api.pullModel(GENERAL_PURPOSE_MODEL);
boolean raw = false;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt(
"What is the capital of France? And what's France's connection with"
@@ -357,8 +356,7 @@ class OllamaIntegrationTest {
void shouldGenerateWithCustomOptions() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.SYSTEM,
@@ -390,8 +388,7 @@ class OllamaIntegrationTest {
String expectedResponse = "Bhai";
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.SYSTEM,
@@ -429,8 +426,7 @@ class OllamaIntegrationTest {
@Order(10)
void shouldChatWithHistory() throws Exception {
api.pullModel(THINKING_TOOL_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(THINKING_TOOL_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(THINKING_TOOL_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
@@ -481,8 +477,7 @@ class OllamaIntegrationTest {
void shouldChatWithExplicitTool() throws OllamaException {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerTool(EmployeeFinderToolSpec.getSpecification());
@@ -534,8 +529,7 @@ class OllamaIntegrationTest {
void shouldChatWithExplicitToolAndUseTools() throws OllamaException {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerTool(EmployeeFinderToolSpec.getSpecification());
@@ -579,8 +573,7 @@ class OllamaIntegrationTest {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerTool(EmployeeFinderToolSpec.getSpecification());
@@ -633,8 +626,7 @@ class OllamaIntegrationTest {
void shouldChatWithAnnotatedToolSingleParam() throws OllamaException {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerAnnotatedTools();
@@ -680,8 +672,7 @@ class OllamaIntegrationTest {
void shouldChatWithAnnotatedToolMultipleParams() throws OllamaException {
String theToolModel = TOOLS_MODEL;
api.pullModel(theToolModel);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(theToolModel);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(theToolModel);
api.registerAnnotatedTools(new AnnotatedTool());
@@ -712,8 +703,7 @@ class OllamaIntegrationTest {
void shouldChatWithStream() throws OllamaException {
api.deregisterTools();
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -739,8 +729,7 @@ class OllamaIntegrationTest {
@Order(15)
void shouldChatWithThinkingAndStream() throws OllamaException {
api.pullModel(THINKING_TOOL_MODEL_2);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(THINKING_TOOL_MODEL_2);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(THINKING_TOOL_MODEL_2);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -758,32 +747,6 @@ class OllamaIntegrationTest {
assertNotNull(chatResult.getResponseModel().getMessage().getResponse());
}
/**
* Tests chat API with an image input from a URL.
*
* <p>Scenario: Sends a user message with an image URL and verifies the assistant's response.
* Usage: chat, vision model, image from URL, no tools, no thinking, no streaming.
*/
@Test
@Order(10)
void shouldChatWithImageFromURL() throws OllamaException, IOException, InterruptedException {
api.pullModel(VISION_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(VISION_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
"What's in the picture?",
Collections.emptyList(),
"https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg")
.build();
api.registerAnnotatedTools(new OllamaIntegrationTest());
OllamaChatResult chatResult = api.chat(requestModel, null);
assertNotNull(chatResult);
}
/**
* Tests chat API with an image input from a file and multi-turn history.
*
@@ -795,8 +758,7 @@ class OllamaIntegrationTest {
@Order(10)
void shouldChatWithImageFromFileAndHistory() throws OllamaException {
api.pullModel(VISION_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(VISION_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(VISION_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -832,7 +794,7 @@ class OllamaIntegrationTest {
api.pullModel(VISION_MODEL);
try {
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(VISION_MODEL)
.withPrompt("What is in this image?")
.withRaw(false)
@@ -865,7 +827,7 @@ class OllamaIntegrationTest {
void shouldGenerateWithImageFilesAndResponseStreamed() throws OllamaException, IOException {
api.pullModel(VISION_MODEL);
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(VISION_MODEL)
.withPrompt("What is in this image?")
.withRaw(false)
@@ -900,7 +862,7 @@ class OllamaIntegrationTest {
boolean think = true;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(THINKING_TOOL_MODEL)
.withPrompt("Who are you?")
.withRaw(raw)
@@ -929,7 +891,7 @@ class OllamaIntegrationTest {
api.pullModel(THINKING_TOOL_MODEL);
boolean raw = false;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(THINKING_TOOL_MODEL)
.withPrompt("Who are you?")
.withRaw(raw)
@@ -967,7 +929,7 @@ class OllamaIntegrationTest {
boolean raw = true;
boolean thinking = false;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("What is 2+2?")
.withRaw(raw)
@@ -995,7 +957,7 @@ class OllamaIntegrationTest {
api.pullModel(GENERAL_PURPOSE_MODEL);
boolean raw = true;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("What is the largest planet in our solar system?")
.withRaw(raw)
@@ -1028,7 +990,7 @@ class OllamaIntegrationTest {
// 'response' tokens
boolean raw = true;
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(THINKING_TOOL_MODEL)
.withPrompt(
"Count 1 to 5. Just give me the numbers and do not give any other"
@@ -1093,7 +1055,7 @@ class OllamaIntegrationTest {
format.put("required", List.of("cities"));
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(TOOLS_MODEL)
.withPrompt(prompt)
.withFormat(format)
@@ -1119,8 +1081,7 @@ class OllamaIntegrationTest {
@Order(26)
void shouldChatWithThinkingNoStream() throws OllamaException {
api.pullModel(THINKING_TOOL_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(THINKING_TOOL_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(THINKING_TOOL_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -1149,8 +1110,7 @@ class OllamaIntegrationTest {
void shouldChatWithCustomOptionsAndStreaming() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -1184,8 +1144,7 @@ class OllamaIntegrationTest {
api.registerTool(EmployeeFinderToolSpec.getSpecification());
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(THINKING_TOOL_MODEL_2);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(THINKING_TOOL_MODEL_2);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -1219,8 +1178,7 @@ class OllamaIntegrationTest {
File image1 = getImageFileFromClasspath("emoji-smile.jpeg");
File image2 = getImageFileFromClasspath("roses.jpg");
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(VISION_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(VISION_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(
OllamaChatMessageRole.USER,
@@ -1247,7 +1205,7 @@ class OllamaIntegrationTest {
void shouldHandleNonExistentModel() {
String nonExistentModel = "this-model-does-not-exist:latest";
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(nonExistentModel)
.withPrompt("Hello")
.withRaw(false)
@@ -1274,8 +1232,7 @@ class OllamaIntegrationTest {
api.pullModel(GENERAL_PURPOSE_MODEL);
List<OllamaChatToolCalls> tools = Collections.emptyList();
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(OllamaChatMessageRole.USER, " ", tools) // whitespace only
.build();
@@ -1298,7 +1255,7 @@ class OllamaIntegrationTest {
void shouldGenerateWithExtremeParameters() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("Generate a random word")
.withRaw(false)
@@ -1351,8 +1308,7 @@ class OllamaIntegrationTest {
void shouldChatWithKeepAlive() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest builder = OllamaChatRequest.builder().withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(OllamaChatMessageRole.USER, "Hello, how are you?")
.withKeepAlive("5m") // Keep model loaded for 5 minutes
@@ -1376,7 +1332,7 @@ class OllamaIntegrationTest {
void shouldGenerateWithAdvancedOptions() throws OllamaException {
api.pullModel(GENERAL_PURPOSE_MODEL);
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL)
.withPrompt("Write a detailed explanation of machine learning")
.withRaw(false)
@@ -1421,8 +1377,8 @@ class OllamaIntegrationTest {
new Thread(
() -> {
try {
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder()
OllamaChatRequest builder =
OllamaChatRequest.builder()
.withModel(GENERAL_PURPOSE_MODEL);
OllamaChatRequest requestModel =
builder.withMessage(

View File

@@ -13,7 +13,6 @@ import static org.junit.jupiter.api.Assertions.*;
import io.github.ollama4j.Ollama;
import io.github.ollama4j.exceptions.OllamaException;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateRequestBuilder;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.samples.AnnotatedTool;
@@ -205,7 +204,7 @@ public class WithAuth {
format.put("required", List.of("isNoon"));
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(model)
.withPrompt(prompt)
.withRaw(false)

View File

@@ -19,7 +19,6 @@ import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.embed.OllamaEmbedRequest;
import io.github.ollama4j.models.embed.OllamaEmbedResult;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateRequestBuilder;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
import io.github.ollama4j.models.request.CustomModelRequest;
import io.github.ollama4j.models.response.ModelDetail;
@@ -158,7 +157,7 @@ class TestMockedAPIs {
OllamaGenerateStreamObserver observer = new OllamaGenerateStreamObserver(null, null);
try {
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(model)
.withPrompt(prompt)
.withRaw(false)
@@ -180,7 +179,7 @@ class TestMockedAPIs {
String prompt = "some prompt text";
try {
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(model)
.withPrompt(prompt)
.withRaw(false)
@@ -206,7 +205,7 @@ class TestMockedAPIs {
String prompt = "some prompt text";
try {
OllamaGenerateRequest request =
OllamaGenerateRequestBuilder.builder()
OllamaGenerateRequest.builder()
.withModel(model)
.withPrompt(prompt)
.withRaw(false)

View File

@@ -12,15 +12,14 @@ import static org.junit.jupiter.api.Assertions.*;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import org.junit.jupiter.api.Test;
class TestOllamaChatRequestBuilder {
@Test
void testResetClearsMessagesButKeepsModelAndThink() {
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.builder()
OllamaChatRequest builder =
OllamaChatRequest.builder()
.withModel("my-model")
.withThinking(true)
.withMessage(OllamaChatMessageRole.USER, "first");

View File

@@ -13,7 +13,6 @@ import static org.junit.jupiter.api.Assertions.assertThrowsExactly;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.utils.OptionsBuilder;
import java.io.File;
import java.util.Collections;
@@ -24,11 +23,11 @@ import org.junit.jupiter.api.Test;
public class TestChatRequestSerialization extends AbstractSerializationTest<OllamaChatRequest> {
private OllamaChatRequestBuilder builder;
private OllamaChatRequest builder;
@BeforeEach
public void init() {
builder = OllamaChatRequestBuilder.builder().withModel("DummyModel");
builder = OllamaChatRequest.builder().withModel("DummyModel");
}
@Test

View File

@@ -11,7 +11,6 @@ package io.github.ollama4j.unittests.jackson;
import static org.junit.jupiter.api.Assertions.assertEquals;
import io.github.ollama4j.models.generate.OllamaGenerateRequest;
import io.github.ollama4j.models.generate.OllamaGenerateRequestBuilder;
import io.github.ollama4j.utils.OptionsBuilder;
import org.json.JSONObject;
import org.junit.jupiter.api.BeforeEach;
@@ -19,16 +18,17 @@ import org.junit.jupiter.api.Test;
class TestGenerateRequestSerialization extends AbstractSerializationTest<OllamaGenerateRequest> {
private OllamaGenerateRequestBuilder builder;
private OllamaGenerateRequest builder;
@BeforeEach
public void init() {
builder = OllamaGenerateRequestBuilder.builder().withModel("Dummy Model");
builder = OllamaGenerateRequest.builder().withModel("Dummy Model");
}
@Test
public void testRequestOnlyMandatoryFields() {
OllamaGenerateRequest req = builder.withPrompt("Some prompt").build();
OllamaGenerateRequest req =
builder.withPrompt("Some prompt").withModel("Dummy Model").build();
String jsonRequest = serialize(req);
assertEqualsAfterUnmarshalling(deserialize(jsonRequest, OllamaGenerateRequest.class), req);
@@ -38,7 +38,10 @@ class TestGenerateRequestSerialization extends AbstractSerializationTest<OllamaG
public void testRequestWithOptions() {
OptionsBuilder b = new OptionsBuilder();
OllamaGenerateRequest req =
builder.withPrompt("Some prompt").withOptions(b.setMirostat(1).build()).build();
builder.withPrompt("Some prompt")
.withOptions(b.setMirostat(1).build())
.withModel("Dummy Model")
.build();
String jsonRequest = serialize(req);
OllamaGenerateRequest deserializeRequest =
@@ -49,7 +52,11 @@ class TestGenerateRequestSerialization extends AbstractSerializationTest<OllamaG
@Test
public void testWithJsonFormat() {
OllamaGenerateRequest req = builder.withPrompt("Some prompt").withGetJsonResponse().build();
OllamaGenerateRequest req =
builder.withPrompt("Some prompt")
.withGetJsonResponse()
.withModel("Dummy Model")
.build();
String jsonRequest = serialize(req);
System.out.printf(jsonRequest);