Compare commits


170 Commits

Author SHA1 Message Date
Amith Koujalgi
48d0a494ee Merge pull request #119 from ollama4j/113-tests-fix
[Fix]: NPE when tool not found
2025-04-06 23:34:53 +05:30
amithkoujalgi
b2e1330ec0 Updated 'build-docs' to 'docs' and 'start-docs' to 'docs-dev' for clarity.
Updated a workflow to run tests and build docs on pull requests.
2025-04-06 23:22:29 +05:30
amithkoujalgi
a2d95a052a - Updated Makefile to add a new remote integration test command.
- Updated docs – Enhanced DBQueryFunction to validate input arguments and throw a RuntimeException if required arguments are missing.
- Updated docs – Refactored tool specifications for fuel price, weather, and employee details to use a unified prompt function structure.  (Addresses #116).
- Improved javadoc for `chatStreaming()` API. (Addresses #115).
- Introduced `ToolInvocationException` to handle errors during tool invocation in OllamaAPI. (Addresses #117).
- Updated integration tests to include `ToolInvocationException` in method signatures for better error handling.
2025-04-06 23:05:26 +05:30
Amith Koujalgi
d86217dd0b Merge pull request #114 from ollama4j/113-tests-fix
Tests fix
2025-03-25 22:30:56 +05:30
amithkoujalgi
70b4a7961a Used smaller-sized images for the test cases 2025-03-25 21:56:45 +05:30
amithkoujalgi
0248f21654 Used smaller-sized images for the test cases 2025-03-25 21:34:13 +05:30
Amith Koujalgi
252ea50717 Update OllamaAPIIntegrationTest.java 2025-03-25 20:40:58 +05:30
Amith Koujalgi
1155a9be9f Merge pull request #113 from bavardage/support-handing-images-as-byte-arrays
add api methods to support passing images as byte[]
2025-03-25 20:36:12 +05:30
Benjamin Duffield
bc87d0c7ec add api methods to support passing images as byte[] 2025-03-25 14:00:18 +00:00
Amith Koujalgi
a895e3a0ea Merge pull request #112 from ollama4j/structured-output
Add structured response model
2025-03-25 18:59:56 +05:30
amithkoujalgi
68acedfd0a Merge branch 'structured-output' of https://github.com/ollama4j/ollama4j into structured-output 2025-03-25 00:22:22 +05:30
amithkoujalgi
be08f11027 Update OllamaAPIIntegrationTest.java 2025-03-25 00:20:25 +05:30
amithkoujalgi
b9b18271a1 Support for structured output
Added support for structured output
2025-03-24 23:45:01 +05:30
amithkoujalgi
12aa38cab0 updated doc 2025-03-24 23:43:20 +05:30
amithkoujalgi
b05f1d9b12 Refactor imports in OllamaAPIIntegrationTest for improved clarity and organization 2025-03-24 23:33:35 +05:30
Amith Koujalgi
bc2a931586 Enhance OllamaAPI and OllamaResult for improved model pulling and structured responses
- Added a retry mechanism in OllamaAPI for model pulling, allowing configurable retries.
- Introduced new methods in OllamaResult for structured response handling, including parsing JSON responses into a Map or specific class types.
- Updated integration tests to validate the new functionality and ensure robust testing of model interactions.
- Improved code formatting and consistency across the OllamaAPI and integration test classes.
2025-03-24 21:40:20 +05:30
Amith Koujalgi
1bda78e35b revert OllamaResult.java
Signed-off-by: Amith Koujalgi <koujalgi.amith@gmail.com>
2025-03-24 18:24:09 +05:30
Amith Koujalgi
e7e71f6421 Enhance integration tests and Makefile for external Ollama host support
- Updated the Makefile to include a new target for local integration tests with external Ollama host configuration.
- Modified the OllamaAPIIntegrationTest to dynamically set the Ollama host based on environment variables, allowing for both external and Testcontainers usage.
- Refactored model pulling logic in tests to use constants for model names, improving readability and maintainability.
2025-03-24 17:59:24 +05:30
Amith Koujalgi
57f874921c Add Windows installation note for Chocolatey in README
- Added a note in the README.md to guide Windows users on installing Chocolatey Package Manager and using it to install `make`.
- Suggested running the installation command with administrator privileges for better success.
2025-03-24 15:48:01 +05:30
Amith Koujalgi
2d7902167b Enhance OllamaAPI and documentation for structured responses
- Updated OllamaAPI to return an instance of OllamaResult instead of OllamaStructuredResult for structured responses.
- Removed the obsolete OllamaStructuredResult class.
- Added new methods in OllamaResult for retrieving structured responses as a Map or mapped to a specific class type.
- Updated integration tests to validate the new structured response functionality.
- Improved Makefile with a new full-build target for building the project.
2025-03-24 15:30:00 +05:30
amithkoujalgi
407b7eb280 Refactor OllamaAPI documentation and add structured response model
- Improved formatting and readability of comments in OllamaAPI.java.
- Introduced OllamaStructuredResult class to handle structured responses from the Ollama API.
- Updated integration tests to include a new test for structured output from the API.
- Cleaned up imports and ensured consistent code style across the OllamaAPIIntegrationTest class.
2025-03-24 00:25:20 +05:30
amithkoujalgi
e62a7511db Merge remote-tracking branch 'origin/main'
2025-03-23 22:21:42 +05:30
Amith Koujalgi
c904a69b09 Merge pull request #110 from ollama4j/integration-tests-updates
Updated docs
2025-03-19 08:50:22 +05:30
Amith Koujalgi
11bf20c405 Updated docs 2025-03-19 08:49:58 +05:30
Amith Koujalgi
c3273ea8ca Merge pull request #109 from ollama4j/integration-tests-updates
Updated docs
2025-03-19 08:32:00 +05:30
Amith Koujalgi
f6a29842b5 Updated docs 2025-03-19 08:31:10 +05:30
Amith Koujalgi
3781ea7a51 Merge pull request #108 from ollama4j/integration-tests-updates
Add blog post about tooling with Couchbase
2025-03-19 08:20:39 +05:30
Amith Koujalgi
6f1da25f7e Updated GH action 2025-03-19 08:20:09 +05:30
Amith Koujalgi
e74ef7115c Add blog post about tooling with couchbase 2025-03-19 08:08:39 +05:30
Amith Koujalgi
c9db51a71e Merge pull request #107 from ollama4j/integration-tests-updates
test
2025-03-18 23:29:21 +05:30
Amith Koujalgi
681a692ca9 Updated integration tests 2025-03-18 23:18:42 +05:30
Amith Koujalgi
9a6065fdb3 Updated integration tests 2025-03-18 23:07:19 +05:30
Amith Koujalgi
e245d9633f Updated integration tests 2025-03-18 22:54:56 +05:30
Amith Koujalgi
590364dd53 test 2025-03-18 22:33:29 +05:30
Amith Koujalgi
bb4e7477bd Merge pull request #106 from ollama4j/integration-tests-updates
test
2025-03-18 22:32:38 +05:30
Amith Koujalgi
c33c1c8315 test 2025-03-18 22:23:06 +05:30
Amith Koujalgi
05eecdccaa Merge pull request #105 from ollama4j/integration-tests-refactor
Integration tests refactor
2025-03-18 22:17:33 +05:30
Amith Koujalgi
26bb2f9bab Updated GH workflow 2025-03-18 22:15:58 +05:30
Amith Koujalgi
bbafc95577 Updated GH workflow 2025-03-18 22:13:39 +05:30
Amith Koujalgi
bee09aa626 Updated integration tests 2025-03-18 22:03:04 +05:30
Amith Koujalgi
8aa6e3b066 Updated integration tests 2025-03-18 21:41:20 +05:30
Amith Koujalgi
d40912c638 Merge remote-tracking branch 'origin/main' into integration-tests-refactor 2025-03-18 20:54:04 +05:30
Amith Koujalgi
ba0444194f Merge pull request #98 from csware/bearertoken
Support bearer token
2025-03-18 20:30:08 +05:30
Amith Koujalgi
ac3f505aa6 Switch image model to "moondream" in integration test 2025-03-11 13:12:55 +05:30
Amith Koujalgi
7e5ca53fda Merge remote-tracking branch 'origin/integration-tests-refactor' into integration-tests-refactor
# Conflicts:
#	Makefile
#	README.md
2025-03-11 12:28:39 +05:30
Amith Koujalgi
2b0238b9e8 Ensure Docker availability in dev setup and integration tests
Updated `README.md` to include Docker as a prerequisite for running integration tests using Testcontainers. Modified the `Makefile` to check for Docker installation during the dev environment setup.
2025-03-11 12:26:35 +05:30
amithkoujalgi
469a0fe491 Refactor
- Remove TestRealAPIs and enhance OllamaAPIIntegrationTest
- Add dev setup instruction
2025-03-11 12:26:08 +05:30
Amith Koujalgi
983a3617f0 Add dev setup instructions and update pre-commit config 2025-03-11 12:15:19 +05:30
Amith Koujalgi
b638b981c9 Remove unnecessary blank lines from pom.xml
Cleaned up redundant blank lines at the end of the pom.xml file to ensure consistent formatting. This helps improve code readability and adheres to standard practices.
2025-03-11 12:05:11 +05:30
amithkoujalgi
fe5078891f Remove TestRealAPIs and enhance OllamaAPIIntegrationTest 2025-03-11 11:41:51 +05:30
Amith Koujalgi
44b4de9ed9 Merge pull request #102 from ollama4j/update-pre-commit-hook
update-pre-commit-hook
2025-03-11 10:38:05 +05:30
amithkoujalgi
854c0b4acf test GH action 2025-03-11 10:35:06 +05:30
Amith Koujalgi
18c5d06a6c Merge pull request #101 from ollama4j/update-pre-commit-hook
update-pre-commit-hook
2025-03-11 10:34:10 +05:30
amithkoujalgi
22b403d0b0 Remove unnecessary write permission for packages in workflow 2025-03-11 10:33:14 +05:30
amithkoujalgi
ee0493eb57 Rename and adjust workflows for PR builds and testing.
Renamed the PR-related workflow for clarity and replaced `build-on-pr-create.yml` with `build-and-test-on-pr-open.yml` for better naming consistency. Also commented out the push trigger in `run-tests.yml` to refine its activation criteria.
2025-03-11 10:19:34 +05:30
Amith Koujalgi
f966b4b74e Merge pull request #100 from ollama4j/update-pre-commit-hook
update-pre-commit-hook
2025-03-11 10:18:06 +05:30
amithkoujalgi
1dadbacd2c Enable no-commit-to-branch pre-commit hook. 2025-03-11 10:12:03 +05:30
amithkoujalgi
714c16c216 Merge remote-tracking branch 'origin/main' 2025-03-11 10:10:46 +05:30
amithkoujalgi
cf2c510b23 Add integration test step to CI workflow
Previously, only unit tests were run during the PR workflow. This update introduces a separate step to run integration tests, ensuring broader test coverage. It enhances build verification by validating both unit and integration aspects.
2025-03-11 10:10:14 +05:30
amithkoujalgi
a0bcc47b2e Add pre-commit configuration file
test
2025-03-11 10:08:57 +05:30
amithkoujalgi
57ecbc2572 Add pre-commit configuration file
test
2025-03-11 10:00:11 +05:30
amithkoujalgi
99beb3e6d0 Add pre-commit configuration file
test
2025-03-11 09:31:12 +05:30
amithkoujalgi
7756eed9a0 Add pre-commit configuration file
Introduce a pre-commit-config.yaml to automate code quality checks and enforce best practices. Includes hooks for file validation, formatting, and commit message standardization, as well as Java-specific quality tools. This ensures consistent coding standards and reduces manual errors.
2025-03-11 09:29:57 +05:30
amithkoujalgi
b795117f0a Add integration test step to CI workflow
Previously, only unit tests were run during the PR workflow. This update introduces a separate step to run integration tests, ensuring broader test coverage. It enhances build verification by validating both unit and integration aspects.
2025-03-11 00:20:01 +05:30
amithkoujalgi
0d091d1826 Update integration test 2025-03-11 00:06:30 +05:30
amithkoujalgi
9fd77a6743 Add branch trigger for tests and update README with badge
Added a trigger to run tests on pushes to the main branch in the GitHub Actions workflow. Also updated the README to include a badge linking to the test workflow for better visibility.
2025-03-11 00:02:13 +05:30
amithkoujalgi
57a962148b Update workflow name and job for clarity in testing
Renamed the workflow to specify both unit and integration tests. Adjusted the job name to better reflect its purpose and ensured clear descriptions for inputs. These changes enhance readability and intent in the CI configuration.
2025-03-10 23:58:03 +05:30
amithkoujalgi
3c64f2099f Add GPG plugin with test-skipping configuration
Integrated the Maven GPG plugin to sign artifacts during the "verify" phase, with the ability to skip it during tests using a new `skipGpgPluginDuringTests` property. Enhanced the build profiles to manage GPG signing selectively, ensuring smoother test and build workflows.
2025-03-10 23:53:02 +05:30
amithkoujalgi
a9c7f4e5e0 Rename TestAPIsTest to OllamaAPIIntegrationTest 2025-03-10 23:44:18 +05:30
amithkoujalgi
e7f58d4e0d Add integration tests and enhance test configurations
Introduced integration tests for various API functionalities, ensuring comprehensive coverage. Updated test dependencies in `pom.xml` and added handling for unknown JSON properties in the `Model` class. Also included configuration to support running unit and integration tests in the CI workflow.
2025-03-10 23:40:44 +05:30
Sven Strickroth
138497b30f Introduce BearerAuth class
Signed-off-by: Sven Strickroth <email@cs-ware.de>
2025-03-10 14:55:38 +01:00
Sven Strickroth
3a792090e2 Support bearer token
May be used as follows:
```
ollamaAPI.setBasicAuth(new BasicAuth() {
	@Override
	public String getBasicAuthHeaderValue() { return "Bearer [sometext]"; }
});
```

Signed-off-by: Sven Strickroth <email@cs-ware.de>
2025-03-10 14:39:54 +01:00
amithkoujalgi
7ef859bba5 clean up
2025-03-09 20:29:34 +05:30
amithkoujalgi
3c30593e1e clean up
2025-03-08 17:44:00 +05:30
amithkoujalgi
98b794ca2b Add a test GitHub Actions workflow to publish to a GitHub repository, Maven repository, and GitHub Pages 2025-03-08 17:24:29 +05:30
amithkoujalgi
bc90a15a68 Add "Examples" link to navbar and footer
Added a link to the "Examples" repository in both the navbar and the "Usage" section of the footer. This improves accessibility to code examples, providing users with an easier way to explore practical use cases.
2025-03-08 16:45:43 +05:30
amithkoujalgi
bb4689e94b Add examples section and GitHub links to docs and README
Introduced an examples section in the README to highlight the `ollama4j-examples` repository. Added an iframe and link in the chat API documentation for better reference. Adjusted formatting for improved readability and consistency.
2025-03-08 16:34:00 +05:30
amithkoujalgi
6739c93edc Refactor workflows and update dependencies.
Renamed workflow for clearer purpose and updated project dependencies across multiple packages to newer versions. Improved the `pom.xml` Maven config by adding a build phase and output directory for Javadoc generation. Upgraded several NPM packages, removing deprecated versions and adding license metadata for better dependency management.
2025-03-08 16:05:17 +05:30
amithkoujalgi
c8c30d703b Refactor code to enhance robustness and clarity
Refactored `OllamaChatMessageRole` to simplify custom role creation, guarding against nulls in `OllamaToolsResult`, and made `OllamaChatResult` properties immutable. Improved error handling in `OllamaAPI`, added verbose logs, and ensured safer JSON parsing for tool responses. Introduced `@JsonIgnoreProperties` for better deserialization support.
2025-03-08 15:46:43 +05:30
Amith Koujalgi
419b0369c9 Merge pull request #93 from hboutemy/reproducible-builds
enable Reproducible Builds
2025-02-28 21:57:31 +05:30
Hervé Boutemy
d9a94b95e1 enable Reproducible Builds 2025-02-23 18:02:15 +01:00
Amith Koujalgi
db1db948c8 Merge pull request #94 from hboutemy/release
use release flag instead of old source/target
2025-02-23 19:48:26 +05:30
Hervé Boutemy
41ad780224 use release flag instead of old source/target 2025-02-21 17:14:54 +01:00
Amith Koujalgi
cb58c6a9b0 Updated README.md
2025-02-17 23:47:48 +05:30
Amith Koujalgi
d1115e0b35 Updated README.md 2025-02-17 23:47:11 +05:30
Amith Koujalgi
740bd3750b Merge pull request #92 from ollama4j/90
Addresses issue where creation of model was failing
2025-02-17 22:37:20 +05:30
Amith Koujalgi
c9aa6c9e08 Merge branch 'main' into 90 2025-02-17 22:35:38 +05:30
Amith Koujalgi
71bba6ee0d Added GH action to run tests 2025-02-17 22:33:02 +05:30
Amith Koujalgi
27b2201ff9 Added GH action to run tests 2025-02-17 22:31:56 +05:30
Amith Koujalgi
23d23c4ad7 Added new createModel API to make it conform to Ollama's new API - https://github.com/ollama/ollama/blob/main/docs/api.md#create-a-model 2025-02-17 22:25:25 +05:30
amithkoujalgi
e409ff1cf9 Update OllamaAPI.java
2025-02-03 08:56:49 +05:30
amithkoujalgi
9a12cebb68 Update README.md 2025-02-01 23:13:35 +05:30
amithkoujalgi
24f5bc4fec Delete close-issue.yml 2025-02-01 09:10:46 +05:30
Amith Koujalgi
d7c313417b Update label-issue-stale.yml 2025-02-01 09:03:57 +05:30
Amith Koujalgi
b67b4c7eb5 Update publish-docs.yml
2025-02-01 00:10:37 +05:30
Amith Koujalgi
ab70201844 Merge pull request #89 from kwongiho/main
Add support for deepseek-r1 model
2025-02-01 00:00:26 +05:30
kwongiho
ac8a40a017 Add support for deepseek-r1 model 2025-01-30 21:32:29 +09:00
Amith Koujalgi
1ac65f821b Update label-issue-stale.yml 2025-01-29 00:34:17 +05:30
Amith Koujalgi
d603c4b94b Update label-issue-stale.yml 2025-01-29 00:17:34 +05:30
Amith Koujalgi
a418cbc1dc Create label-issue-stale.yml 2025-01-29 00:17:11 +05:30
Amith Koujalgi
785dd12730 Update close-issue.yml 2025-01-28 23:52:22 +05:30
Amith Koujalgi
dda807d818 Merge pull request #88 from seeseemelk/feature/token-streamer
Add ability to stream tokens in chat
2025-01-26 17:37:22 +05:30
Amith Koujalgi
a06a4025fa Merge pull request #87 from seeseemelk/feature/annotated-objects
Add support for registering object instances
2025-01-26 17:35:58 +05:30
761fbc3398 Add support for streaming tokens 2025-01-24 15:05:33 +01:00
a96dc11679 Fix random test failure 2025-01-24 15:05:32 +01:00
b2b3febdaa Add support for registering object instances instead of only through the @OllamaToolService annotation 2025-01-24 13:38:47 +01:00
amithkoujalgi
f27bea11d5 Merge branch 'main' of https://github.com/ollama4j/ollama4j 2025-01-14 10:44:13 +05:30
amithkoujalgi
9503451d5a Create close-issue.yml 2025-01-14 10:42:58 +05:30
Amith Koujalgi
04bae4ca6a Update README.md 2025-01-14 10:06:30 +05:30
Amith Koujalgi
3e33b8df62 Update README.md 2025-01-13 20:08:42 +05:30
Amith Koujalgi
a494053263 Merge pull request #85 from AgentSchmecker/feature/annotationBasedTools
Feature/annotation based tools
2025-01-04 23:52:51 +05:30
Markus Klenke
260c57ca84 Removes system.err lines 2024-12-27 23:10:08 +01:00
Markus Klenke
db008de0ca Adds documentation for annotation based Tool registration 2024-12-27 23:07:35 +01:00
Markus Klenke
1b38466f44 Adds BigDecimal type for ToolProperty typeCast 2024-12-27 23:05:08 +01:00
Markus Klenke
26ec00dab8 Adds Javadoc for new classes and annotations 2024-12-27 22:33:44 +01:00
Markus Klenke
5e6971cc4a Adds first approach to annotation based tool callings using basic java reflection 2024-12-27 22:20:34 +01:00
Amith Koujalgi
8b3417ecda Merge pull request #82 from AgentSchmecker/feature/toolextension_for_chat_model
Enable chat API to use Tools
2024-12-17 12:03:56 +05:30
Markus Klenke
35f5f34196 Adds doc for tool-based chat API calls 2024-12-09 23:30:05 +01:00
Markus Klenke
d8c3edd55f Parametrizes the max chat tool call retries for a single chat request 2024-12-09 23:29:43 +01:00
Markus Klenke
7ffbc5d3f2 Adds implicit tool calling for streamed chat requests (requires Ollama v0.4.6) 2024-12-09 23:07:25 +01:00
Markus Klenke
c4b7830614 Fixes merge conflicts 2024-12-07 01:18:12 +01:00
Markus Klenke
69f6fd81cf Enables in chat tool calling 2024-12-07 01:17:08 +01:00
Markus Klenke
b6a293add7 Makes changes to OllamaChatResult backwards compatible 2024-12-07 01:17:08 +01:00
Markus Klenke
25694a8bc9 extends ollamaChatResult to have full access to OllamaChatResult 2024-12-07 01:17:01 +01:00
Markus Klenke
12bb10392e Extends ChatModels to use Tools and ToolCalls 2024-12-07 01:16:25 +01:00
Markus Klenke
e9c33ab0b2 Extends chat API to automatically load registered Tools 2024-12-07 01:16:25 +01:00
Markus Klenke
903a8176cd Extends ToolSpec to have PromptDef for ChatRequests 2024-12-07 01:16:25 +01:00
Amith Koujalgi
4a91918e84 Merge pull request #81 from AgentSchmecker/bugfix/79
Fixes for #78 and #79
2024-12-05 16:42:30 +05:30
Markus Klenke
ff3344616c Fixes NPE in #78 2024-12-04 22:57:48 +01:00
Markus Klenke
726fea5b74 Fixes #79 2024-12-04 22:28:00 +01:00
Markus Klenke
a09f1362e9 Adds Builder for EmbedRequests and deprecates old Embedding Models 2024-12-02 22:48:33 +01:00
amithkoujalgi
4ef0821932 updated README.md
Update README.md
2024-11-09 01:01:38 +05:30
amithkoujalgi
2d3cf228cb added findModelTagFromLibrary API 2024-11-08 12:37:58 +05:30
amithkoujalgi
5b3713c69e added getLibraryModelDetails API and pullModel API with LibraryModelTag 2024-11-08 11:23:47 +05:30
Amith Koujalgi
e9486cbb8e Merge pull request #76 from ollama4j/model-listing
updated `listModelsFromLibrary` API
2024-11-08 10:05:19 +05:30
amithkoujalgi
057f0babeb updated listModelsFromLibrary API
updated `listModelsFromLibrary` API
2024-11-08 10:02:27 +05:30
Amith Koujalgi
da146640ca Update README.md 2024-11-08 00:08:13 +05:30
Amith Koujalgi
82be761b86 Merge pull request #75 from ollama4j/model-listing
`listModelsFromLibrary` API
2024-11-07 23:54:56 +05:30
amithkoujalgi
9c3fc49df1 added listModelsFromLibrary API 2024-11-07 23:53:11 +05:30
Amith Koujalgi
5f19eb17ac Update OllamaAPI.java 2024-11-07 21:53:41 +05:30
Amith Koujalgi
ecb04d6d82 Cleanup 2024-10-31 21:22:17 +05:30
Amith Koujalgi
3fc7e9423c Updated docs 2024-10-31 17:57:02 +05:30
Amith Koujalgi
405a08b330 Updated docs 2024-10-31 16:25:05 +05:30
Amith Koujalgi
921f745435 Custom roles support
Adds support for custom roles using `OllamaChatMessageRole`
2024-10-31 16:15:21 +05:30
Amith Koujalgi
bedfec6bf9 Update generate-embeddings.md 2024-10-30 11:07:40 +05:30
Amith Koujalgi
afa09e87a5 Update OllamaAPI.java 2024-10-30 11:02:37 +05:30
Amith Koujalgi
baf2320ea6 Updated javadoc 2024-10-30 11:01:23 +05:30
Amith Koujalgi
948a7444fb Update README.md 2024-10-30 00:54:33 +05:30
Amith Koujalgi
ec0eb8b469 Update README.md 2024-10-30 00:44:08 +05:30
Amith Koujalgi
8f33de7e59 Update README.md 2024-10-30 00:43:09 +05:30
Amith Koujalgi
8c59e6511b Update README.md 2024-10-30 00:41:55 +05:30
Amith Koujalgi
b93fc7623a Updated javadoc 2024-10-30 00:28:53 +05:30
Amith Koujalgi
bd1a57c7e0 Added support for new embed API /api/embed 2024-10-30 00:03:49 +05:30
Amith Koujalgi
7fabead249 Merge pull request #73 from daguava/tool-role
Add 'tool' role
2024-10-29 22:05:43 +05:30
Mitchell Lutzke
268a973d5e Add tool role 2024-10-27 17:06:25 -07:00
Amith Koujalgi
d949a3cb69 Merge pull request #72 from daguava/minp-custom-options
Add minp option and ability to set custom options
2024-10-27 20:05:24 +05:30
Mitchell Lutzke
e2443ed68a Add throws to the docs 2024-10-26 22:08:03 -07:00
Mitchell Lutzke
37193b1f5b slight cleanup 2024-10-26 21:36:43 -07:00
Mitchell Lutzke
e33071ae38 Add minp option and ability to set custom options 2024-10-26 21:22:46 -07:00
Amith Koujalgi
fffc8dc526 Update README.md 2024-10-16 00:55:01 +05:30
Amith Koujalgi
def950cc9c Update README.md 2024-10-15 14:27:07 +05:30
Amith Koujalgi
f4db7ca326 Update README.md 2024-10-15 11:51:20 +05:30
Amith Koujalgi
18760250ea Update README.md 2024-10-11 21:21:20 +05:30
Amith Koujalgi
233597efd1 Update README.md 2024-10-01 23:38:49 +05:30
Amith Koujalgi
cec9f29eb7 Update README.md 2024-09-15 09:43:37 +05:30
Amith Koujalgi
20cb92a418 Update README.md 2024-09-15 08:48:31 +05:30
Amith Koujalgi
b0dc38954b Update README.md 2024-09-15 08:47:22 +05:30
Amith Koujalgi
1479d0a494 Update README.md 2024-09-08 16:28:46 +05:30
Amith Koujalgi
b328daee43 Update README.md 2024-09-08 16:28:23 +05:30
Amith Koujalgi
b90c8bc622 Docs build fixes
Signed-off-by: Amith Koujalgi <koujalgi.amith@gmail.com>
2024-09-05 01:41:24 +05:30
73 changed files with 5435 additions and 1409 deletions

.github/workflows/build-on-pr-create.yml

@@ -1,34 +0,0 @@
# This workflow will build a package using Maven and then publish it to GitHub packages when a release is created
# For more information see: https://github.com/actions/setup-java/blob/main/docs/advanced-usage.md#apache-maven-with-a-settings-path
name: Build on PR Create
on:
  pull_request:
    types: [ opened, reopened ]
    branches: [ "main" ]
jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v3
      - name: Set up JDK 11
        uses: actions/setup-java@v3
        with:
          java-version: '11'
          distribution: 'adopt-hotspot'
          server-id: github # Value of the distributionManagement/repository/id field of the pom.xml
          settings-path: ${{ github.workspace }} # location for the settings.xml file
      - name: Build with Maven
        run: mvn --file pom.xml -U clean package
      - name: Run Tests
        run: mvn --file pom.xml -U clean test -Punit-tests

.github/workflows/build-and-test-on-pr-open.yml

@@ -0,0 +1,46 @@
name: Run Tests
on:
  pull_request:
    # types: [opened, reopened, synchronize, edited]
    branches: [ "main" ]
    paths:
      - 'src/**' # Run if changes occur in the 'src' folder
      - 'pom.xml' # Run if changes occur in the 'pom.xml' file
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true
jobs:
  run-tests:
    runs-on: ubuntu-latest
    permissions:
      contents: read
    steps:
      - uses: actions/checkout@v3
      - name: Set up JDK 11
        uses: actions/setup-java@v3
        with:
          java-version: '11'
          distribution: 'adopt-hotspot'
          server-id: github # Value of the distributionManagement/repository/id field of the pom.xml
          settings-path: ${{ github.workspace }} # location for the settings.xml file
      - name: Build with Maven
        run: mvn --file pom.xml -U clean package
      - name: Run unit tests
        run: mvn --file pom.xml -U clean test -Punit-tests
      - name: Run integration tests
        run: mvn --file pom.xml -U clean verify -Pintegration-tests
      - name: Use Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '20.x'
      - run: cd docs && npm ci
      - run: cd docs && npm run build

.github/workflows/label-issue-stale.yml

@@ -0,0 +1,24 @@
name: Mark stale issues
on:
  workflow_dispatch: # for manual run
  schedule:
    - cron: '0 0 * * *' # Runs every day at midnight
permissions:
  contents: write # only for delete-branch option
  issues: write
jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - name: Mark stale issues
        uses: actions/stale@v8
        with:
          repo-token: ${{ github.token }}
          days-before-stale: 15
          stale-issue-message: 'This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.'
          days-before-close: 7
          stale-issue-label: 'stale'
          exempt-issue-labels: 'pinned,security'

.github/workflows/publish-docs.yml

@@ -1,5 +1,5 @@
 # Simple workflow for deploying static content to GitHub Pages
-name: Deploy Docs to GH Pages
+name: Publish Docs to GH Pages
 on:
   release:
@@ -63,12 +63,12 @@ jobs:
           working-directory: "."
       - name: Setup Pages
-        uses: actions/configure-pages@v3
+        uses: actions/configure-pages@v5
       - name: Upload artifact
-        uses: actions/upload-pages-artifact@v2
+        uses: actions/upload-pages-artifact@v3
         with:
           # Upload entire repository
           path: './docs/build/.'
       - name: Deploy to GitHub Pages
         id: deployment
-        uses: actions/deploy-pages@v2
+        uses: actions/deploy-pages@v4

.github/workflows/run-tests.yml

@@ -0,0 +1,35 @@
name: Run Unit and Integration Tests
on:
  # push:
  #   branches:
  #     - main
  workflow_dispatch:
    inputs:
      branch:
        description: 'Branch name to run the tests on'
        required: true
        default: 'main'
jobs:
  run-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          ref: ${{ github.event.inputs.branch }}
      - name: Set up JDK 17
        uses: actions/setup-java@v3
        with:
          java-version: '17'
          distribution: 'temurin'
          server-id: github
          settings-path: ${{ github.workspace }}
      - name: Run unit tests
        run: mvn clean test -Punit-tests
      - name: Run integration tests
        run: mvn clean verify -Pintegration-tests

.pre-commit-config.yaml

@@ -0,0 +1,38 @@
repos:
  # pre-commit hooks
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: "v5.0.0"
    hooks:
      - id: no-commit-to-branch
        args: ['--branch', 'main']
      - id: check-merge-conflict
      - id: check-added-large-files
      - id: check-yaml
      - id: check-xml
      - id: check-json
      - id: pretty-format-json
        args: ['--no-sort-keys', '--autofix', '--indent=4']
      - id: end-of-file-fixer
        exclude: \.json$
        files: \.java$|\.xml$
      - id: trailing-whitespace
      - id: mixed-line-ending
  # for commit message formatting
  - repo: https://github.com/commitizen-tools/commitizen
    rev: v4.4.1
    hooks:
      - id: commitizen
        stages: [commit-msg]
  # # for java code quality
  # - repo: https://github.com/gherynos/pre-commit-java
  #   rev: v0.6.10
  #   hooks:
  #     - id: pmd
  #       exclude: /test/
  #     - id: cpd
  #       exclude: /test/
  #     - id: checkstyle
  #       exclude: /test/

Makefile

@@ -1,24 +1,38 @@
 dev:
 	@echo "Setting up dev environment..."
 	@command -v pre-commit >/dev/null 2>&1 || { echo "Error: pre-commit is not installed. Please install it first."; exit 1; }
+	@command -v docker >/dev/null 2>&1 || { echo "Error: docker is not installed. Please install it first."; exit 1; }
 	pre-commit install
 	pre-commit autoupdate
 	pre-commit install --install-hooks
 build:
 	mvn -B clean install -Dgpg.skip=true
+full-build:
+	mvn -B clean install
 unit-tests:
 	mvn clean test -Punit-tests
 integration-tests:
-	mvn clean verify -Pintegration-tests
+	export USE_EXTERNAL_OLLAMA_HOST=false && mvn clean verify -Pintegration-tests
+integration-tests-remote:
+	export USE_EXTERNAL_OLLAMA_HOST=true && export OLLAMA_HOST=http://192.168.29.223:11434 && mvn clean verify -Pintegration-tests -Dgpg.skip=true
 doxygen:
 	doxygen Doxyfile
 list-releases:
-	curl 'https://central.sonatype.com/api/internal/browse/component/versions?sortField=normalizedVersion&sortDirection=asc&page=0&size=12&filter=namespace%3Aio.github.amithkoujalgi%2Cname%3Aollama4j' \
+	curl 'https://central.sonatype.com/api/internal/browse/component/versions?sortField=normalizedVersion&sortDirection=desc&page=0&size=20&filter=namespace%3Aio.github.ollama4j%2Cname%3Aollama4j' \
 	--compressed \
-	--silent | jq '.components[].version'
+	--silent | jq -r '.components[].version'
-build-docs:
+docs:
 	npm i --prefix docs && npm run build --prefix docs
-start-docs:
+docs-dev:
 	npm i --prefix docs && npm run start --prefix docs
 start-cpu:

README.md

@@ -4,13 +4,11 @@
<img src='https://raw.githubusercontent.com/ollama4j/ollama4j/65a9d526150da8fcd98e2af6a164f055572bf722/ollama4j.jpeg' width='100' alt="ollama4j-icon">
</p>
A Java library (wrapper/binding) for [Ollama](https://ollama.ai/) server.
<div align="center">
A Java library (wrapper/binding) for Ollama server.
Find more details on the [website](https://ollama4j.github.io/ollama4j/).
<div align="center">
![GitHub stars](https://img.shields.io/github/stars/ollama4j/ollama4j)
![GitHub forks](https://img.shields.io/github/forks/ollama4j/ollama4j)
![GitHub watchers](https://img.shields.io/github/watchers/ollama4j/ollama4j)
@@ -33,8 +31,11 @@ Find more details on the [website](https://ollama4j.github.io/ollama4j/).
![GitHub last commit](https://img.shields.io/github/last-commit/ollama4j/ollama4j?color=green)
[![codecov](https://codecov.io/gh/ollama4j/ollama4j/graph/badge.svg?token=U0TE7BGP8L)](https://codecov.io/gh/ollama4j/ollama4j)
[![Run Unit and Integration Tests](https://github.com/ollama4j/ollama4j/actions/workflows/run-tests.yml/badge.svg)](https://github.com/ollama4j/ollama4j/actions/workflows/run-tests.yml)
![Build Status](https://github.com/ollama4j/ollama4j/actions/workflows/maven-publish.yml/badge.svg)
</div>
[//]: # (![Hits]&#40;https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Follama4j%2Follama4j&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false&#41;)
@@ -47,6 +48,7 @@ Find more details on the [website](https://ollama4j.github.io/ollama4j/).
- [Requirements](#requirements)
- [Installation](#installation)
- [API Spec](https://ollama4j.github.io/ollama4j/category/apis---model-management)
- [Examples](#examples)
- [Javadoc](https://ollama4j.github.io/ollama4j/apidocs/)
- [Development](#development)
- [Contributions](#get-involved)
@@ -75,61 +77,6 @@ Find more details on the [website](https://ollama4j.github.io/ollama4j/).
<img src="https://img.shields.io/badge/v0.3.0-green.svg?style=for-the-badge&labelColor=gray&label=Ollama&color=blue" alt=""/>
</a>
<table>
<tr>
<td>
<a href="https://ollama.ai/" target="_blank">Local Installation</a>
</td>
<td>
<a href="https://hub.docker.com/r/ollama/ollama" target="_blank">Docker Installation</a>
</td>
</tr>
<tr>
<td>
<a href="https://ollama.com/download/Ollama-darwin.zip" target="_blank">Download for macOS</a>
<a href="https://ollama.com/download/OllamaSetup.exe" target="_blank">Download for Windows</a>
Install on Linux
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
</td>
<td>
CPU only
```shell
docker run -d -p 11434:11434 \
-v ollama:/root/.ollama \
--name ollama \
ollama/ollama
```
NVIDIA GPU
```shell
docker run -d -p 11434:11434 \
--gpus=all \
-v ollama:/root/.ollama \
--name ollama \
ollama/ollama
```
</td>
</tr>
</table>
## Installation
> [!NOTE]
@@ -155,7 +102,7 @@ In your Maven project, add this dependency:
<dependency>
<groupId>io.github.ollama4j</groupId>
<artifactId>ollama4j</artifactId>
<version>1.0.79</version>
<version>1.0.93</version>
</dependency>
```
@@ -211,7 +158,7 @@ In your Maven project, add this dependency:
<dependency>
<groupId>io.github.ollama4j</groupId>
<artifactId>ollama4j</artifactId>
<version>1.0.79</version>
<version>1.0.93</version>
</dependency>
```
@@ -221,7 +168,7 @@ In your Maven project, add this dependency:
```groovy
dependencies {
implementation 'io.github.ollama4j:ollama4j:1.0.79'
implementation 'io.github.ollama4j:ollama4j:1.0.93'
}
```
@@ -244,51 +191,86 @@ dependencies {
> [!TIP]
> Find the full API specifications on the [website](https://ollama4j.github.io/ollama4j/).
#### Development
### Development
Build:
Make sure you have `pre-commit` installed.
With `brew`:
```shell
brew install pre-commit
```
With `pip`:
```shell
pip install pre-commit
```
#### Setup dev environment
> **Note**
> If you're on Windows, install [Chocolatey Package Manager for Windows](https://chocolatey.org/install) and then install `make` by running `choco install make`. Just a little tip - run the command with administrator privileges if installation fails.
```shell
make dev
```
#### Build
```shell
make build
```
Run unit tests:
#### Run unit tests
```shell
make unit-tests
```
Run integration tests:
#### Run integration tests
Make sure you have Docker running as this uses [testcontainers](https://testcontainers.com/) to run the integration
tests on an Ollama Docker container.
```shell
make integration-tests
```
#### Releases
### Releases
Newer artifacts are published via GitHub Actions CI workflow when a new release is created from `main` branch.
#### Who's using Ollama4j?
## Examples
- `Datafaker`: a library to generate fake data
- https://github.com/datafaker-net/datafaker-experimental/tree/main/ollama-api
- `Vaadin Web UI`: UI-Tester for Interactions with Ollama via ollama4j
- https://github.com/TEAMPB/ollama4j-vaadin-ui
- `ollama-translator`: Minecraft 1.20.6 spigot plugin allows to easily break language barriers by using ollama on the
server to translate all messages into a specific target language.
- https://github.com/liebki/ollama-translator
- https://www.reddit.com/r/fabricmc/comments/1e65x5s/comment/ldr2vcf/
- `Ollama4j Web UI`: A web UI for Ollama written in Java using Spring Boot and Vaadin framework and
Ollama4j.
- https://github.com/ollama4j/ollama4j-web-ui
- `JnsCLI`: A command-line tool for Jenkins that manages jobs, builds, and configurations directly from the terminal while offering AI-powered error analysis for quick troubleshooting.
- https://github.com/mirum8/jnscli
The `ollama4j-examples` repository contains examples for using the Ollama4j library. You can explore
it [here](https://github.com/ollama4j/ollama4j-examples).
#### Traction
## ⭐ Give us a Star!
If you like or are using this project to build your own, please give us a star. It's a free way to show your support.
## Who's using Ollama4j?
| # | Project Name | Description | Link |
|----|-------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 1 | Datafaker | A library to generate fake data | [GitHub](https://github.com/datafaker-net/datafaker-experimental/tree/main/ollama-api) |
| 2 | Vaadin Web UI | UI-Tester for interactions with Ollama via ollama4j | [GitHub](https://github.com/TEAMPB/ollama4j-vaadin-ui) |
| 3 | ollama-translator | A Minecraft 1.20.6 Spigot plugin that translates all messages into a specific target language via Ollama | [GitHub](https://github.com/liebki/ollama-translator) |
| 4 | AI Player | A Minecraft mod that adds an intelligent "second player" to the game | [Website](https://modrinth.com/mod/ai-player), [GitHub](https://github.com/shasankp000/AI-Player), <br/> [Reddit Thread](https://www.reddit.com/r/fabricmc/comments/1e65x5s/comment/ldr2vcf/) |
| 5 | Ollama4j Web UI | A web UI for Ollama written in Java using Spring Boot, Vaadin, and Ollama4j | [GitHub](https://github.com/ollama4j/ollama4j-web-ui) |
| 6 | JnsCLI | A command-line tool for Jenkins that manages jobs, builds, and configurations, with AI-powered error analysis | [GitHub](https://github.com/mirum8/jnscli) |
| 7 | Katie Backend | An open-source AI-based question-answering platform for accessing private domain knowledge | [GitHub](https://github.com/wyona/katie-backend) |
| 8 | TeleLlama3 Bot | A question-answering Telegram bot | [Repo](https://git.hiast.edu.sy/mohamadbashar.disoki/telellama3-bot) |
| 9 | moqui-wechat | A moqui-wechat component | [GitHub](https://github.com/heguangyong/moqui-wechat) |
| 10 | B4X | A set of simple and powerful RAD tool for Desktop and Server development | [Website](https://www.b4x.com/android/forum/threads/ollama4j-library-pnd_ollama4j-your-local-offline-llm-like-chatgpt.165003/) |
| 11 | Research Article | Article: `Large language model based mutations in genetic improvement` - published on National Library of Medicine (National Center for Biotechnology Information) | [Website](https://pmc.ncbi.nlm.nih.gov/articles/PMC11750896/) |
## Traction
[![Star History Chart](https://api.star-history.com/svg?repos=ollama4j/ollama4j&type=Date)](https://star-history.com/#ollama4j/ollama4j&Date)
### Get Involved
## Get Involved
<div align="center">
@@ -316,6 +298,22 @@ Contributions are most welcome! Whether it's reporting a bug, proposing an enhan
with code - any sort
of contribution is much appreciated.
## 🏷️ License and Citation
The code is available under [MIT License](./LICENSE).
If you find this project helpful in your research, please cite this work at
```
@misc{ollama4j2024,
author = {Amith Koujalgi},
title = {Ollama4j: A Java Library (Wrapper/Binding) for Ollama Server},
year = {2024},
month = {January},
url = {https://github.com/ollama4j/ollama4j}
}
```
### References
- [Ollama REST APIs](https://github.com/jmorganca/ollama/blob/main/docs/api.md)
@@ -333,7 +331,7 @@ project.
</a>
</p>
### Appreciate my work?
### Appreciate the work?
<p align="center">
<a href="https://www.buymeacoffee.com/amithkoujalgi" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>


@@ -0,0 +1,709 @@
---
slug: talk-to-your-data-on-couchbase-via-ollama4j
title: "Talk to Your Data Using Natural Language: A Guide to Interacting with Couchbase via Ollama4j"
authors: [ amith ]
tags: [ Java, AI, LLM, GenAI, GenerativeAI, Generative AI Tools, Ollama, Ollama4J, OpenSource, Developers,
]
---
Sometime back, I created a small wrapper called Ollama4j to interact with the Ollama server over the REST API in Java as
a side project and made the [repository](https://github.com/ollama4j/ollama4j) public on GitHub. Over time, the project
gained traction, with many fellow Java
developers contributing, and it now boasts over _300 stars_! 😍
We've consistently introduced new features, and when we added the tool-calling capability, the library became incredibly
powerful, opening up so many possibilities. With this addition, we could automate numerous tasks using natural language!
I wanted to share how to make the most of this functionality.
In this article, we'll explore how to use Ollama4j, a Java SDK for interacting with Ollama-hosted models, to leverage
tool-calling models like Mistral for querying a Couchbase database. The goal is to create a system where you can query
your database using natural, conversational language — just like interacting with a virtual assistant. We'll walk you
through the code, explain the key components, and show you how to set up your environment to ensure everything runs
smoothly.
### Overview of the Technologies Involved
Before diving into the implementation, let's understand the core technologies we're using:
- **Ollama4j**: A Java SDK that interacts with hosted AI models through a convenient API. Ollama allows you to interact
with
pre-trained models (like Mistral) and access additional tools that can be applied to real-world tasks.
- **Mistral**: A powerful, language-based model that can be used for a variety of tasks, including answering questions,
text
generation, and data retrieval from external sources. While I've used Mistral in this instance, you can easily replace
it with [any other model](https://ollama.com/search?c=tools) that supports tool-calling capabilities.
- **Couchbase**: A NoSQL database that provides a flexible and scalable data model. In this example, we'll query a
Couchbase
database to retrieve airline information.
The magic happens when we combine these technologies to allow the model to query the database in a more intuitive and
human-like way, acting as an interface between the user's natural language and Couchbase's structured data.
> Oh, by the way, you can either set up a [Couchbase server](https://www.couchbase.com/downloads/?family=couchbase-server)
> on your own or, if you prefer a more effortless approach like I do, give
> [Couchbase Capella](https://www.couchbase.com/products/capella/) a spin. It's a fully managed
> Database-as-a-Service (DBaaS) with a free tier 🎉 that's so easy to set up, you'll be querying your data in no time.
> It's perfect for developers who want to dive in without any hassle — it's like having your own cloud database, minus
> the headache!
In the following section, we will walk you through the simple steps to create your free Couchbase Capella database
cluster. If you'd prefer to set up your own Couchbase server elsewhere, feel free to skip this section and go directly
to the [Code Environment Setup](#setting-up-the-environment-for-code) section.
### Sign up for a free database cluster on Couchbase Capella
Head over to https://cloud.couchbase.com/sign-in and sign up for an account.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*vsJC0ugfoh9vpYNapt4-5A.png'} />
Once you're in, you will be able to create a new database cluster. Click on the _**Operational**_ tab and click on the
**_Create Cluster_** button.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*ZNicgmYNkclgaBIxwRN7Ug.png'} />
Select the default project named **_My First Project_** and click on the **_Continue_** button.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*vfc2cF7IgkjLtNXvls8giQ.png'} />
You'll now see the available cluster options. Go ahead and select the **_Free_** option! 😍
Next, choose your preferred cloud provider (you can select any provider or stick with the default AWS provider).
Pick a region (or leave it set to the default).
Finally, click on the Create Cluster button to proceed.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*rdWpeSrUaBKC6Y5q8Kd6EA.png'} />
Give it a couple of minutes, and let the magic happen as your cluster gets deployed.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*no3uHx8cIzVBn7qccYEZ3A.png'} />
Once your cluster is deployed, you'll see the status of your cluster as **_Healthy_**.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*Jyu9uiSDSE0o-EQRb53CJA.png'} />
Click on the listed cluster to open its details. Here, you can view the version of the deployed Couchbase server, the
enabled services, as well as the cloud provider and region.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*Sv-7wQuAoD0l0bjbI5I7Aw.png'} />
Click on the **_Explore Data_** button. Notice that a default bucket called **_travel-sample_** with some sample data has
been created
for you.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*z85GsgMBvdR2mrvKUrIjJg.png'} />
Browse through the collection to explore the pre-created buckets, scopes and collections available to you.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*Qr84bs1dvn6m9ZjkNxXvUg.png'} />
Open up a sample document from the **_travel-sample_** (bucket) > **_inventory_** (scope) > **_airline_** (collection)
to see the contents
of the document.
The document shown in the image below is about an airline named **_Astraeus_**, whose call sign (a unique name or code
used to
identify an airline or aircraft in communication) is **_FLYSTAR_**.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*AmvixYfdNNKC6nXNNXbe4Q.png'} />
Navigate to the **_Connect_** tab, and you will see a **_Public Connection String_** that allows you to access the
Capella cluster
endpoint from your client application, which looks like the following URL:
```
couchbases://cb.uniqueclusteridentifier.cloud.couchbase.com
```
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*jwnVdj5ZOQMHoggj9JZeJQ.png'} />
To access this cluster endpoint, you need to allow the IP addresses that are permitted to connect. Click on the
**_Settings_**
tab, which will take you to the **_Cluster Settings_** view. Then, click on **_Allowed IP Addresses_** in the left pane
under
**_Networking_**, where you can add allowed IP addresses. Finally, click on the **_Add Allowed IP_** button.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*tS83AJaNzlBa4Q3aadxohw.png'} />
You can either click on the **_Add Current IP Address_** button to limit access to your cluster to your IP address
alone, or
if you'd like to allow access from anywhere, click on the **_Allow Access from Anywhere_** button.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*XBgqQoXQQJyYg51Ztugw6w.png'} />
Confirm that you want to allow the IP addresses.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*WjfYQQaiT2WqwNnWvUCyww.png'} />
The IP addresses have now been added to the allow list, and the networking is set up.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*5BHIp2rqUf7E_GNX8TENoA.png'} />
Now that you've allowed IP addresses, it's time to create credentials for accessing the cluster using a username and
password. Click on the **_Cluster Access_** tab in the left pane, then click on the **_Create Cluster Access_** button.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*Q5l_EE3gGtxiANdkKilVTQ.png'} />
Enter a username of your choice in the **_Cluster Access Name_** text field, and then enter a password of your choice in
the
**_Password_** text field.
Next, select the bucket, scope, and the read/write permissions you want these credentials to have access to. In this
example, I've granted access to all buckets and scopes with both read and write permissions.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*j2DRB1oDWE78SKpcsIb2SA.png'} />
Alright, your cluster access is now set up.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*8TY-5DPDfQlwz0-2IYR8Sg.png'} />
One last step: you just need to select the **_Cluster Access Credentials_** that you want to allow to connect to your
Capella
cluster. Head over to the **_Connect_** tab, then click on the **_SDKs_** tab in the left pane. Under **_Choose the
Cluster Access Credentials you want to use to connect to your Capella cluster_**, select the cluster credentials you
just created.
<img src={'https://miro.medium.com/v2/resize:fit:1400/format:webp/1*sIlH51v2HllTzBDV8K-9Aw.png'} />
Awesome! Your cluster access is all set up, and you're ready to connect to your Capella cluster using a Couchbase
client. That's it — you're all set and good to go!
### Setting Up the Environment For Code
Before you begin, ensure you have the following components set up.
**Java**: Make sure you have Java 11+ installed on your system. Set it up
from [here](https://www.oracle.com/in/java/technologies/downloads/). Verify it by running the following
command in your terminal.
```shell
java --version
```
**Maven**: Make sure you have the Maven build system set up. Set it up from [here](https://maven.apache.org/download.cgi).
Verify it by running the following command
in your terminal.
```shell
mvn --version
```
**Ollama Server**: Make sure you have installed the latest version of [Ollama server](https://ollama.com/) and it is up
and running. Verify it by
running the following command in your terminal.
```shell
ollama --version
```
**Model**: You'll need a [tool-calling model](https://ollama.com/search?c=tools) (such as Mistral) downloaded and ready to
serve from your Ollama server.
To download/pull the model into your Ollama server, run the following command in your terminal.
```shell
ollama pull mistral
```
You can list the models available on your model server by running the following command in your terminal.
```shell
ollama list
```
Once you have these, you can start setting up the application.
Set up `pom.xml` for your Maven project.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>io.github.ollama4j.couchbase</groupId>
<artifactId>ollama4j-couchbase</artifactId>
<version>0.0.1</version>
<name>Ollama4j Couchbase</name>
<description>Talk to your data in Couchbase over Ollama4j</description>
<packaging>jar</packaging>
<properties>
<maven.compiler.release>11</maven.compiler.release>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<lombok.version>1.18.30</lombok.version>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<version>3.3.1</version>
<executions>
<execution>
<id>attach-sources</id>
<goals>
<goal>jar-no-fork</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.11.2</version>
<configuration>
<!-- to disable the "missing" warnings. Remove the doclint to enable warnings-->
<doclint>all,-missing</doclint>
</configuration>
<executions>
<execution>
<id>attach-javadocs</id>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.14.0</version>
</plugin>
</plugins>
</pluginManagement>
</build>
<dependencies>
<dependency>
<groupId>io.github.ollama4j</groupId>
<artifactId>ollama4j</artifactId>
<version>ollama4j-revision</version>
</dependency>
<!-- SLF4J API -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>2.0.0</version>
</dependency>
<!-- Logback Classic (SLF4J binding) -->
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.4.12</version>
</dependency>
<dependency>
<groupId>com.couchbase.client</groupId>
<artifactId>java-client</artifactId>
<version>3.7.8</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.30</version>
<scope>provided</scope>
</dependency>
</dependencies>
</project>
```
### Code Walkthrough
Here's the main part of the implementation in the Java code.
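One note before the listing: it references an `AirlineDetail` POJO whose definition isn't included in the excerpt below. Going by the Lombok annotations it imports and the `new AirlineDetail(callsign, name, country)` constructor calls, a minimal sketch of that class would be:

```java
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

// Carries the airline fields read from the travel-sample documents;
// field order matches the constructor calls in the tool functions below.
@Data
@AllArgsConstructor
@NoArgsConstructor
class AirlineDetail {
    private String callsign;
    private String name;
    private String country;
}
```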
```java
package io.github.ollama4j.examples;
import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.Cluster;
import com.couchbase.client.java.ClusterOptions;
import com.couchbase.client.java.Scope;
import com.couchbase.client.java.json.JsonObject;
import com.couchbase.client.java.query.QueryResult;
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.exceptions.OllamaBaseException;
import io.github.ollama4j.exceptions.ToolInvocationException;
import io.github.ollama4j.tools.OllamaToolsResult;
import io.github.ollama4j.tools.ToolFunction;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OptionsBuilder;
import io.github.ollama4j.utils.Utilities;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.io.IOException;
import java.time.Duration;
import java.util.Arrays;
import java.util.Map;
public class CouchbaseToolCallingExample {
public static void main(String[] args) throws IOException, ToolInvocationException, OllamaBaseException, InterruptedException {
String connectionString = Utilities.getFromEnvVar("CB_CLUSTER_URL");
String username = Utilities.getFromEnvVar("CB_CLUSTER_USERNAME");
String password = Utilities.getFromEnvVar("CB_CLUSTER_PASSWORD");
String bucketName = "travel-sample";
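// Connect to the Capella cluster; the "wan-development" profile applies timeouts suited to clusters reached over the internet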
Cluster cluster = Cluster.connect(
connectionString,
ClusterOptions.clusterOptions(username, password).environment(env -> {
env.applyProfile("wan-development");
})
);
String host = Utilities.getFromConfig("host");
String modelName = Utilities.getFromConfig("tools_model_mistral");
OllamaAPI ollamaAPI = new OllamaAPI(host);
ollamaAPI.setVerbose(false);
ollamaAPI.setRequestTimeoutSeconds(60);
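// Describe the two Couchbase-backed tools and register them so the model can invoke them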
Tools.ToolSpecification callSignFinderToolSpec = getCallSignFinderToolSpec(cluster, bucketName);
Tools.ToolSpecification callSignUpdaterToolSpec = getCallSignUpdaterToolSpec(cluster, bucketName);
ollamaAPI.registerTool(callSignFinderToolSpec);
ollamaAPI.registerTool(callSignUpdaterToolSpec);
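// Prompt 1: the model extracts the airline name from natural language and calls the lookup tool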
String prompt1 = "What is the call-sign of Astraeus?";
for (OllamaToolsResult.ToolResult r : ollamaAPI.generateWithTools(modelName, new Tools.PromptBuilder()
.withToolSpecification(callSignFinderToolSpec)
.withPrompt(prompt1)
.build(), new OptionsBuilder().build()).getToolResults()) {
AirlineDetail airlineDetail = (AirlineDetail) r.getResult();
System.out.println(String.format("[Result of tool '%s']: Call-sign of %s is '%s'! ✈️", r.getFunctionName(), airlineDetail.getName(), airlineDetail.getCallsign()));
}
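// Prompt 2: the model routes this request through the call-sign updater tool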
String prompt2 = "I want to code name Astraeus as STARBOUND";
for (OllamaToolsResult.ToolResult r : ollamaAPI.generateWithTools(modelName, new Tools.PromptBuilder()
.withToolSpecification(callSignUpdaterToolSpec)
.withPrompt(prompt2)
.build(), new OptionsBuilder().build()).getToolResults()) {
Boolean updated = (Boolean) r.getResult();
System.out.println(String.format("[Result of tool '%s']: Call-sign is %s! ✈️", r.getFunctionName(), updated ? "updated" : "not updated"));
}
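// Prompt 3: query again to confirm the call-sign was actually updated in Couchbase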
String prompt3 = "What is the call-sign of Astraeus?";
for (OllamaToolsResult.ToolResult r : ollamaAPI.generateWithTools(modelName, new Tools.PromptBuilder()
.withToolSpecification(callSignFinderToolSpec)
.withPrompt(prompt3)
.build(), new OptionsBuilder().build()).getToolResults()) {
AirlineDetail airlineDetail = (AirlineDetail) r.getResult();
System.out.println(String.format("[Result of tool '%s']: Call-sign of %s is '%s'! ✈️", r.getFunctionName(), airlineDetail.getName(), airlineDetail.getCallsign()));
}
}
public static Tools.ToolSpecification getCallSignFinderToolSpec(Cluster cluster, String bucketName) {
return Tools.ToolSpecification.builder()
.functionName("airline-lookup")
.functionDescription("You are a tool who finds only the airline name and do not worry about any other parameters. You simply find the airline name and ignore the rest of the parameters. Do not validate airline names as I want to use fake/fictitious airline names as well.")
.toolFunction(new AirlineCallsignQueryToolFunction(bucketName, cluster))
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-airline-name")
.description("Get the airline name")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"airlineName", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The name of the airline. e.g. Emirates")
.required(true)
.build()
)
)
.required(java.util.List.of("airlineName"))
.build()
)
.build()
)
.build()
)
.build();
}
public static Tools.ToolSpecification getCallSignUpdaterToolSpec(Cluster cluster, String bucketName) {
return Tools.ToolSpecification.builder()
.functionName("airline-update")
.functionDescription("You are a tool who finds the airline name and its callsign and do not worry about any validations. You simply find the airline name and its callsign. Do not validate airline names as I want to use fake/fictitious airline names as well.")
.toolFunction(new AirlineCallsignUpdateToolFunction(bucketName, cluster))
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-airline-name-and-callsign")
.description("Get the airline name and callsign")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"airlineName", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The name of the airline. e.g. Emirates")
.required(true)
.build(),
"airlineCallsign", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The callsign of the airline. e.g. Maverick")
.required(true)
.build()
)
)
.required(java.util.List.of("airlineName", "airlineCallsign"))
.build()
)
.build()
)
.build()
)
.build();
}
}
class AirlineCallsignQueryToolFunction implements ToolFunction {
private final String bucketName;
private final Cluster cluster;
public AirlineCallsignQueryToolFunction(String bucketName, Cluster cluster) {
this.bucketName = bucketName;
this.cluster = cluster;
}
@Override
public AirlineDetail apply(Map<String, Object> arguments) {
String airlineName = arguments.get("airlineName").toString();
Bucket bucket = cluster.bucket(bucketName);
bucket.waitUntilReady(Duration.ofSeconds(10));
Scope inventoryScope = bucket.scope("inventory");
QueryResult result = inventoryScope.query(String.format("SELECT * FROM airline WHERE name = '%s';", airlineName));
JsonObject row = (JsonObject) result.rowsAsObject().get(0).get("airline");
return new AirlineDetail(row.getString("callsign"), row.getString("name"), row.getString("country"));
}
}
class AirlineCallsignUpdateToolFunction implements ToolFunction {
private final String bucketName;
private final Cluster cluster;
public AirlineCallsignUpdateToolFunction(String bucketName, Cluster cluster) {
this.bucketName = bucketName;
this.cluster = cluster;
}
@Override
public Boolean apply(Map<String, Object> arguments) {
String airlineName = arguments.get("airlineName").toString();
String airlineNewCallsign = arguments.get("airlineCallsign").toString();
Bucket bucket = cluster.bucket(bucketName);
bucket.waitUntilReady(Duration.ofSeconds(10));
Scope inventoryScope = bucket.scope("inventory");
String query = String.format("SELECT * FROM airline WHERE name = '%s';", airlineName);
QueryResult result;
try {
result = inventoryScope.query(query);
} catch (Exception e) {
throw new RuntimeException("Error executing query", e);
}
if (result.rowsAsObject().isEmpty()) {
throw new RuntimeException("Airline not found with name: " + airlineName);
}
JsonObject row = (JsonObject) result.rowsAsObject().get(0).get("airline");
if (row == null) {
throw new RuntimeException("Airline data is missing or corrupted.");
}
String currentCallsign = row.getString("callsign");
if (!airlineNewCallsign.equals(currentCallsign)) {
inventoryScope.query(String.format(
"UPDATE airline SET callsign = '%s' WHERE name = '%s';",
airlineNewCallsign, airlineName
));
return true;
}
return false;
}
}
@SuppressWarnings("ALL")
@Data
@AllArgsConstructor
@NoArgsConstructor
class AirlineDetail {
private String callsign;
private String name;
private String country;
}
```
### Key Concepts
#### 1. Ollama API Client Setup
```java
OllamaAPI ollamaAPI = new OllamaAPI(host);
ollamaAPI.setRequestTimeoutSeconds(60);
```
Here, we initialize the Ollama API client and point it at the host where the Ollama server is running and serving the
model. We also set the request timeout to 60 seconds, so that requests still complete even when the model takes longer
to respond.
#### 2. Tool Specification
The `ToolSpecification` class defines how the model will interact with the Couchbase database. We define a function that
queries the database for airline details based on the airline name.
```java
Tools.ToolSpecification callSignFinderToolSpec = getCallSignFinderToolSpec(cluster, bucketName);
ollamaAPI.registerTool(callSignFinderToolSpec);
```
This step registers the custom tool with Ollama, allowing the tool-calling model to invoke database queries.
#### 3. Query Execution
The tool will execute a Couchbase N1QL query to retrieve the airline details:
```java
QueryResult result = inventoryScope.query(String.format("SELECT * FROM airline WHERE name = '%s';", airlineName));
```
The result is processed and returned as an `AirlineDetail` object.
#### 4. Set up your prompt (question)
```java
String prompt = "What is the call-sign of Astraeus?";
```
#### 5. Generating Results with Tools
```java
for (OllamaToolsResult.ToolResult r : ollamaAPI.generateWithTools(modelName, new Tools.PromptBuilder()
.withToolSpecification(callSignFinderToolSpec)
.withPrompt(prompt)
.build(), new OptionsBuilder().build()).getToolResults()) {
AirlineDetail airlineDetail = (AirlineDetail) r.getResult();
System.out.printf("[Result of tool '%s']: Call-sign of %s is '%s'! ✈️", r.getFunctionName(), airlineDetail.getName(), airlineDetail.getCallsign());
}
```
This invokes the tool-calling model (Mistral in this case) with the provided prompt and uses the registered tool to
query the database. The result is returned and printed to the console.
So, we ask the model the following question:
> **What is the call-sign of Astraeus?**
And here's what the model responds:
> **Call-sign of Astraeus is FLYSTAR! ✈️**
Isn't that amazing? Now, let's enhance it further by adding a function that allows us to update an airline's call sign
using natural language.
Let's define another `ToolSpecification` that describes how the model interacts with the Couchbase database for
updates. We define a function that looks up the airline by name and then updates its callsign.
```java
Tools.ToolSpecification callSignUpdaterToolSpec = getCallSignUpdaterToolSpec(cluster, bucketName);
ollamaAPI.registerTool(callSignUpdaterToolSpec);
```
The tool will execute a Couchbase N1QL query to update the airline's callsign.
```java
inventoryScope.query(String.format(
"UPDATE airline SET callsign = '%s' WHERE name = '%s';",
airlineNewCallsign, airlineName
));
```
Set up the prompt to instruct the model to update the airline's callsign.
```java
String prompt = "I want to code name Astraeus as STARBOUND";
```
And then we invoke the model with the new prompt.
```java
for (OllamaToolsResult.ToolResult r : ollamaAPI.generateWithTools(modelName, new Tools.PromptBuilder()
.withToolSpecification(callSignUpdaterToolSpec)
.withPrompt(prompt)
.build(), new OptionsBuilder().build()).getToolResults()) {
Boolean updated = (Boolean) r.getResult();
System.out.println(String.format("[Result of tool '%s']: Call-sign is %s! ✈️", r.getFunctionName(), updated ? "updated" : "not updated"));
}
```
This invokes the tool-calling model (Mistral in this case) with the new prompt and uses the registered tool to update
the database.
So, we give the model the following instruction.
> **I want to code name Astraeus as STARBOUND.**
And here's what the model responds:
> **Call-sign is updated! ✈️**
How amazing is that? The possibilities for interacting with your data using natural language are endless. You could
integrate features like checking flight availability, booking tickets, retrieving ticket details, and so much more!
Feel free to extend this example further by adding more sophisticated capabilities! 🚀
### Conclusion
With the code above, you can use Ollama's hosted models (like Mistral) to query a Couchbase database using natural
language prompts. This makes it possible to interact with databases in a more intuitive and human-like way.
By leveraging Ollama4j, you can connect AI models to real-world applications and build powerful tools that can automate
complex tasks or simply make querying your data more conversational.
You can find the full code and more examples like this in
the [ollama4j-examples](https://github.com/ollama4j/ollama4j-examples) GitHub repository.
Credit to Couchbase, Ollama, and all the model teams for providing us with such amazing software!

View File

@@ -7,6 +7,8 @@ sidebar_position: 7
This API lets you create a conversation with LLMs. Using this API enables you to ask questions to the model, including
information from the history of already-asked questions and their respective answers.
## Create a new conversation and use chat history to augment follow up questions
```java
@@ -33,7 +35,7 @@ public class Main {
// start conversation with model
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
System.out.println("First answer: " + chatResult.getResponse());
System.out.println("First answer: " + chatResult.getResponseModel().getMessage().getContent());
// create next userQuestion
requestModel = builder.withMessages(chatResult.getChatHistory()).withMessage(OllamaChatMessageRole.USER, "And what is the second largest city?").build();
@@ -41,7 +43,7 @@ public class Main {
// "continue" conversation with model
chatResult = ollamaAPI.chat(requestModel);
System.out.println("Second answer: " + chatResult.getResponse());
System.out.println("Second answer: " + chatResult.getResponseModel().getMessage().getContent());
System.out.println("Chat History: " + chatResult.getChatHistory());
}
@@ -205,7 +207,7 @@ public class Main {
// start conversation with model
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
System.out.println(chatResult.getResponse());
System.out.println(chatResult.getResponseModel());
}
}
@@ -244,7 +246,7 @@ public class Main {
new File("/path/to/image"))).build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
System.out.println("First answer: " + chatResult.getResponse());
System.out.println("First answer: " + chatResult.getResponseModel());
builder.reset();
@@ -254,7 +256,7 @@ public class Main {
.withMessage(OllamaChatMessageRole.USER, "What's the dogs breed?").build();
chatResult = ollamaAPI.chat(requestModel);
System.out.println("Second answer: " + chatResult.getResponse());
System.out.println("Second answer: " + chatResult.getResponseModel());
}
}
```
@@ -269,4 +271,12 @@ You will get a response similar to:
> Second Answer: Based on the image, it's difficult to definitively determine the breed of the dog. However, the dog
> appears to be medium-sized with a short coat and a brown coloration, which might suggest that it is a Golden Retriever
> or a similar breed. Without more details like ear shape and tail length, it's not possible to identify the exact breed
> confidently.
[//]: # (Generated using: https://emgithub.com/)
<iframe style={{ width: '100%', height: '919px', border: 'none' }} allow="clipboard-write" src="https://emgithub.com/iframe.html?target=https%3A%2F%2Fgithub.com%2Follama4j%2Follama4j-examples%2Fblob%2Fmain%2Fsrc%2Fmain%2Fjava%2Fio%2Fgithub%2Follama4j%2Fexamples%2FChatExample.java&style=default&type=code&showBorder=on&showLineNumbers=on&showFileMeta=on&showFullPath=on&showCopy=on" />
<a href="https://github.com/ollama4j/ollama4j-examples/blob/main/src/main/java/io/github/ollama4j/examples/ChatExample.java" target="_blank">
View ChatExample.java on GitHub
</a>

View File

@@ -0,0 +1,65 @@
---
sidebar_position: 8
---
# Custom Roles
Lets you manage custom roles (in addition to the base roles) for chat interactions with the models.
_Particularly helpful when you need to use the additional roles that newer models support beyond the base roles._
_Base roles are `SYSTEM`, `USER`, `ASSISTANT`, `TOOL`._
### Usage
#### Add new role
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
OllamaChatMessageRole customRole = ollamaAPI.addCustomRole("custom-role");
}
}
```
#### List roles
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import java.util.List;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
List<OllamaChatMessageRole> roles = ollamaAPI.listRoles();
}
}
```
#### Get role
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
OllamaChatMessageRole role = ollamaAPI.getRole("custom-role");
}
}
```
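#### Use a custom role in a chat
Once added, a custom role can be used like the base roles when composing chat messages. The following is a minimal
sketch, assuming the `OllamaChatRequestBuilder` chat API shown elsewhere in these docs; the model name and message
contents are placeholders:
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;

public class Main {

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);

        // register (or look up) the custom role
        OllamaChatMessageRole customRole = ollamaAPI.addCustomRole("custom-role");

        // use the custom role alongside the base USER role when building the request
        OllamaChatRequest requestModel = OllamaChatRequestBuilder.getInstance("llama3.2:1b")
                .withMessage(customRole, "Some context issued under the custom role")
                .withMessage(OllamaChatMessageRole.USER, "What is the capital of France?")
                .build();

        OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
        System.out.println(chatResult.getResponseModel().getMessage().getContent());
    }
}
```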

View File

@@ -8,12 +8,85 @@ Generate embeddings from a model.
Parameters:
- `model`: name of model to generate embeddings from
- `input`: text/s to generate embeddings for
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.types.OllamaModelType;
import io.github.ollama4j.models.embeddings.OllamaEmbedRequestModel;
import io.github.ollama4j.models.embeddings.OllamaEmbedResponseModel;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
OllamaEmbedResponseModel embeddings = ollamaAPI.embed("all-minilm", Arrays.asList("Why is the sky blue?", "Why is the grass green?"));
System.out.println(embeddings);
}
}
```
Or, using the `OllamaEmbedRequestModel`:
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.types.OllamaModelType;
import io.github.ollama4j.models.embeddings.OllamaEmbedRequestModel;
import io.github.ollama4j.models.embeddings.OllamaEmbedResponseModel;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
OllamaEmbedResponseModel embeddings = ollamaAPI.embed(new OllamaEmbedRequestModel("all-minilm", Arrays.asList("Why is the sky blue?", "Why is the grass green?")));
System.out.println(embeddings);
}
}
```
You will get a response similar to:
```json
{
"model": "all-minilm",
"embeddings": [[-0.034674067, 0.030984823, 0.0067988685]],
"total_duration": 14173700,
"load_duration": 1198800,
"prompt_eval_count": 2
}
```
:::note
This is a deprecated API
:::
Parameters:
- `model`: name of model to generate embeddings from
- `prompt`: text to generate embeddings for
```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.types.OllamaModelType;
import java.util.List;
public class Main {
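    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Sketch of the deprecated single-prompt call; the generateEmbeddings(model, prompt)
        // signature returning a List<Double> is assumed from earlier ollama4j releases.
        List<Double> embeddings = ollamaAPI.generateEmbeddings(OllamaModelType.LLAMA2, "Why is the sky blue?");
        System.out.println(embeddings);
    }
}
```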
@@ -40,11 +113,6 @@ You will get a response similar to:
0.009260174818336964,
0.23178744316101074,
-0.2916173040866852,
-0.8924556970596313,
0.8785552978515625,
-0.34576427936553955,
0.5742510557174683,
-0.04222835972905159,
-0.137906014919281
-0.8924556970596313
]
```

View File

@@ -61,6 +61,9 @@ details.
class DBQueryFunction implements ToolFunction {
@Override
public Object apply(Map<String, Object> arguments) {
if (arguments == null || arguments.isEmpty() || arguments.get("employee-name") == null || arguments.get("employee-address") == null || arguments.get("employee-phone") == null) {
throw new RuntimeException("Tool was called but the model failed to provide all the required arguments.");
}
// perform DB operations here
return String.format("Employee Details {ID: %s, Name: %s, Address: %s, Phone: %s}", UUID.randomUUID(), arguments.get("employee-name").toString(), arguments.get("employee-address").toString(), arguments.get("employee-phone").toString());
}
@@ -78,14 +81,39 @@ Lets define a sample tool specification called **Fuel Price Tool** for getting t
Tools.ToolSpecification fuelPriceToolSpecification = Tools.ToolSpecification.builder()
.functionName("current-fuel-price")
.functionDescription("Get current fuel price")
.toolFunction(SampleTools::getCurrentFuelPrice)
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-location-fuel-info")
.description("Get location and fuel type details")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"location", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The city, e.g. New Delhi, India")
.required(true)
.build(),
"fuelType", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The fuel type.")
.enumValues(Arrays.asList("petrol", "diesel"))
.required(true)
.build()
)
)
.required(java.util.List.of("location", "fuelType"))
.build()
)
.build()
)
.build()
)
.build();
```
Let's also define a sample tool specification called **Weather Tool** for getting the current weather.
@@ -97,13 +125,33 @@ Lets also define a sample tool specification called **Weather Tool** for getting
Tools.ToolSpecification weatherToolSpecification = Tools.ToolSpecification.builder()
.functionName("current-weather")
.functionDescription("Get current weather")
.toolFunction(SampleTools::getCurrentWeather)
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-location-weather-info")
.description("Get location details")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"city", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The city, e.g. New Delhi, India")
.required(true)
.build()
)
)
.required(java.util.List.of("city"))
.build()
)
.build()
)
.build()
)
.build();
```
Let's also define a sample tool specification called **DBQueryFunction** for getting the employee details from the database.
@@ -115,14 +163,43 @@ Lets also define a sample tool specification called **DBQueryFunction** for gett
Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
.functionName("get-employee-details")
.functionDescription("Get employee details from the database")
.toolFunction(new DBQueryFunction())
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-employee-details")
.description("Get employee details from the database")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"employee-name", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The name of the employee, e.g. John Doe")
.required(true)
.build(),
"employee-address", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India")
.required(true)
.build(),
"employee-phone", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The phone number of the employee. Always return a random value. e.g. 9911002233")
.required(true)
.build()
)
)
.required(java.util.List.of("employee-name", "employee-address", "employee-phone"))
.build()
)
.build()
)
.build()
)
.build();
```
@@ -239,37 +316,111 @@ public class FunctionCallingWithMistralExample {
Tools.ToolSpecification fuelPriceToolSpecification = Tools.ToolSpecification.builder()
.functionName("current-fuel-price")
.functionDescription("Get current fuel price")
.toolFunction(SampleTools::getCurrentFuelPrice)
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-location-fuel-info")
.description("Get location and fuel type details")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"location", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The city, e.g. New Delhi, India")
.required(true)
.build(),
"fuelType", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The fuel type.")
.enumValues(Arrays.asList("petrol", "diesel"))
.required(true)
.build()
)
)
.required(java.util.List.of("location", "fuelType"))
.build()
)
.build()
)
.build()
)
.build();
Tools.ToolSpecification weatherToolSpecification = Tools.ToolSpecification.builder()
.functionName("current-weather")
.functionDescription("Get current weather")
.toolFunction(SampleTools::getCurrentWeather)
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-location-weather-info")
.description("Get location details")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"city", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The city, e.g. New Delhi, India")
.required(true)
.build()
)
)
.required(java.util.List.of("city"))
.build()
)
.build()
)
.build()
)
.build();
Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
.functionName("get-employee-details")
.functionDescription("Get employee details from the database")
.toolFunction(new DBQueryFunction())
.toolPrompt(
Tools.PromptFuncDefinition.builder()
.type("prompt")
.function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-employee-details")
.description("Get employee details from the database")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
Map.of(
"employee-name", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The name of the employee, e.g. John Doe")
.required(true)
.build(),
"employee-address", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India")
.required(true)
.build(),
"employee-phone", Tools.PromptFuncDefinition.Property.builder()
.type("string")
.description("The phone number of the employee. Always return a random value. e.g. 9911002233")
.required(true)
.build()
)
)
.required(java.util.List.of("employee-name", "employee-address", "employee-phone"))
.build()
)
.build()
)
.build()
)
.build();
ollamaAPI.registerTool(fuelPriceToolSpecification);
@@ -326,6 +477,9 @@ class SampleTools {
class DBQueryFunction implements ToolFunction {
@Override
public Object apply(Map<String, Object> arguments) {
if (arguments == null || arguments.isEmpty() || arguments.get("employee-name") == null || arguments.get("employee-address") == null || arguments.get("employee-phone") == null) {
throw new RuntimeException("Tool was called but the model failed to provide all the required arguments.");
}
// perform DB operations here
return String.format("Employee Details {ID: %s, Name: %s, Address: %s, Phone: %s}", UUID.randomUUID(), arguments.get("employee-name").toString(), arguments.get("employee-address").toString(), arguments.get("employee-phone").toString());
}
@@ -345,21 +499,291 @@ Rahul Kumar, Address: King St, Hyderabad, India, Phone: 9876543210}`
::::
### Using tools in Chat-API
Instead of using the specific `ollamaAPI.generateWithTools` method to call Ollama's generate API with tools, it is
also possible to register tools for the `ollamaAPI.chat` methods. In this case, the tool calling/callback is done
implicitly during the USER -> ASSISTANT exchanges.
When the assistant wants to call a given tool, the tool is executed and the conversation is sent back to the endpoint
once more, augmented with the tool-call result.
#### Sample:
The following sample, taken from an integration test, defines a tool like the tool specs above, registers it on the
`OllamaAPI`, and then simply calls the chat API. All intermediate tool calling is wrapped inside the API call.
```java
public static void main(String[] args) {
OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
ollamaAPI.setVerbose(true);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("llama3.2:1b");
final Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
.functionName("get-employee-details")
.functionDescription("Get employee details from the database")
.toolPrompt(
Tools.PromptFuncDefinition.builder().type("function").function(
Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-employee-details")
.description("Get employee details from the database")
.parameters(
Tools.PromptFuncDefinition.Parameters.builder()
.type("object")
.properties(
new Tools.PropsBuilder()
.withProperty("employee-name", Tools.PromptFuncDefinition.Property.builder().type("string").description("The name of the employee, e.g. John Doe").required(true).build())
.withProperty("employee-address", Tools.PromptFuncDefinition.Property.builder().type("string").description("The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India").required(true).build())
.withProperty("employee-phone", Tools.PromptFuncDefinition.Property.builder().type("string").description("The phone number of the employee. Always return a random value. e.g. 9911002233").required(true).build())
.build()
)
.required(List.of("employee-name"))
.build()
).build()
).build()
)
.toolFunction(new DBQueryFunction())
.build();
ollamaAPI.registerTool(databaseQueryToolSpecification);
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER,
"Give me the ID of the employee named 'Rahul Kumar'?")
.build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
}
```
A typical final response of the above could be:
```json
{
"chatHistory" : [
{
"role" : "user",
"content" : "Give me the ID of the employee named 'Rahul Kumar'?",
"images" : null,
"tool_calls" : [ ]
}, {
"role" : "assistant",
"content" : "",
"images" : null,
"tool_calls" : [ {
"function" : {
"name" : "get-employee-details",
"arguments" : {
"employee-name" : "Rahul Kumar"
}
}
} ]
}, {
"role" : "tool",
"content" : "[TOOL_RESULTS]get-employee-details([employee-name]) : Employee Details {ID: b4bf186c-2ee1-44cc-8856-53b8b6a50f85, Name: Rahul Kumar, Address: null, Phone: null}[/TOOL_RESULTS]",
"images" : null,
"tool_calls" : null
}, {
"role" : "assistant",
"content" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
"images" : null,
"tool_calls" : null
} ],
"responseModel" : {
"model" : "llama3.2:1b",
"message" : {
"role" : "assistant",
"content" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
"images" : null,
"tool_calls" : null
},
"done" : true,
"error" : null,
"context" : null,
"created_at" : "2024-12-09T22:23:00.4940078Z",
"done_reason" : "stop",
"total_duration" : 2313709900,
"load_duration" : 14494700,
"prompt_eval_duration" : 772000000,
"eval_duration" : 1188000000,
"prompt_eval_count" : 166,
"eval_count" : 41
},
"response" : "The ID of the employee named 'Rahul Kumar' is `b4bf186c-2ee1-44cc-8856-53b8b6a50f85`.",
"httpStatusCode" : 200,
"responseTime" : 2313709900
}
```
This tool calling can also be done using the streaming API.
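Below is a minimal sketch of the streaming variant, assuming a `chat(...)` overload that accepts a stream handler
receiving partial response tokens (the exact handler type is an assumption here; tool calls are still resolved
implicitly inside the API call):
```java
// Sketch: tool calling combined with response streaming (handler overload assumed)
OllamaChatRequest requestModel = builder
        .withMessage(OllamaChatMessageRole.USER,
                "Give me the ID of the employee named 'Rahul Kumar'?")
        .build();
// each partial assistant message is printed as it arrives
OllamaChatResult chatResult = ollamaAPI.chat(requestModel,
        (partialResponse) -> System.out.print(partialResponse));
```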
### Using Annotation-based Tool Registration
Instead of explicitly registering each tool, ollama4j supports declarative tool specification and registration via Java
annotations and reflection.
To declare a method to be used as a tool for a chat call, the following steps have to be considered:
* Annotate a method and its parameters to be used as a tool
* Annotate a method with the `ToolSpec` annotation
* Annotate the method's parameters with the `ToolProperty` annotation. Only the following datatypes are supported for now:
* `java.lang.String`
* `java.lang.Integer`
* `java.lang.Boolean`
* `java.math.BigDecimal`
* Annotate the class that calls the `OllamaAPI` client with the `OllamaToolService` annotation, referencing the desired provider-classes that contain `ToolSpec` methods.
* Before issuing the `OllamaAPI` chat request, call the `OllamaAPI.registerAnnotatedTools()` method to add the tools to the chat.
#### Example
Let's say we have an ollama4j service class that should ask an LLM a specific tool-based question.
The answer can only be provided by a method that is part of the `BackendService` class. To provide that method as a tool to the LLM, the following annotations can be used:
```java
public class BackendService {

    public BackendService() {
    }

    @ToolSpec(desc = "Computes the most important constant all around the globe!")
    public String computeImportantConstant(@ToolProperty(name = "noOfDigits", desc = "Number of digits that shall be returned") Integer noOfDigits) {
        // returns a random constant with the requested number of digits
        return BigDecimal.valueOf((long) (Math.random() * 1000000L), noOfDigits).toString();
    }
}
```
The caller API can then be written as:
```java
import io.github.ollama4j.tools.annotations.OllamaToolService;
@OllamaToolService(providers = BackendService.class)
public class MyOllamaService{
public void chatWithAnnotatedTool(){
// inject the annotated methods into the Ollama tools registry
ollamaAPI.registerAnnotatedTools();
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER,
"Compute the most important constant in the world using 5 digits")
.build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
}
}
```
Or, if one needs to provide an object instance directly:
```java
public class MyOllamaService{
public void chatWithAnnotatedTool(){
ollamaAPI.registerAnnotatedTools(new BackendService());
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER,
"Compute the most important constant in the world using 5 digits")
.build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
}
}
```
The request should be the following:
```json
{
"model" : "llama3.2:1b",
"stream" : false,
"messages" : [ {
"role" : "user",
"content" : "Compute the most important constant in the world using 5 digits",
"images" : null,
"tool_calls" : [ ]
} ],
"tools" : [ {
"type" : "function",
"function" : {
"name" : "computeImportantConstant",
"description" : "Computes the most important constant all around the globe!",
"parameters" : {
"type" : "object",
"properties" : {
"noOfDigits" : {
"type" : "java.lang.Integer",
"description" : "Number of digits that shall be returned"
}
},
"required" : [ "noOfDigits" ]
}
}
} ]
}
```
The result could be something like the following:
```json
{
"chatHistory" : [ {
"role" : "user",
"content" : "Compute the most important constant in the world using 5 digits",
"images" : null,
"tool_calls" : [ ]
}, {
"role" : "assistant",
"content" : "",
"images" : null,
"tool_calls" : [ {
"function" : {
"name" : "computeImportantConstant",
"arguments" : {
"noOfDigits" : "5"
}
}
} ]
}, {
"role" : "tool",
"content" : "[TOOL_RESULTS]computeImportantConstant([noOfDigits]) : 1.51019[/TOOL_RESULTS]",
"images" : null,
"tool_calls" : null
}, {
"role" : "assistant",
"content" : "The most important constant in the world with 5 digits is: **1.51019**",
"images" : null,
"tool_calls" : null
} ],
"responseModel" : {
"model" : "llama3.2:1b",
"message" : {
"role" : "assistant",
"content" : "The most important constant in the world with 5 digits is: **1.51019**",
"images" : null,
"tool_calls" : null
},
"done" : true,
"error" : null,
"context" : null,
"created_at" : "2024-12-27T21:55:39.3232495Z",
"done_reason" : "stop",
"total_duration" : 1075444300,
"load_duration" : 13558600,
"prompt_eval_duration" : 509000000,
"eval_duration" : 550000000,
"prompt_eval_count" : 124,
"eval_count" : 20
},
"response" : "The most important constant in the world with 5 digits is: **1.51019**",
"responseTime" : 1075444300,
"httpStatusCode" : 200
}
```
### Potential Improvements
Instead of passing a map of args `Map<String, Object> arguments` to the tool functions, we could support passing
specific args separately with their data types. For example:
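A sketch of what such a typed-parameter tool method could look like (this is a proposed improvement, not an existing
API; the annotation usage mirrors the `@ToolSpec` example above):
```java
@ToolSpec(name = "current-fuel-price", desc = "Get current fuel price")
public String getCurrentFuelPrice(String location, String fuelType) {
    // typed parameters instead of a Map<String, Object> of raw arguments
    return "Current price of " + fuelType + " in " + location + " is Rs.103/L";
}
```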
@@ -369,4 +793,4 @@ public String getCurrentFuelPrice(String location, String fuelType) {
}
```
Updating async/chat APIs with support for tool-based generation.

View File

@@ -13,7 +13,7 @@ with [extra parameters](https://github.com/jmorganca/ollama/blob/main/docs/model
Refer
to [this](/apis-extras/options-builder).
## Try asking a question about the model.
## Try asking a question about the model
```java
import io.github.ollama4j.OllamaAPI;
@@ -87,7 +87,7 @@ You will get a response similar to:
> The capital of France is Paris.
> Full response: The capital of France is Paris.
## Try asking a question from general topics.
## Try asking a question from general topics
```java
import io.github.ollama4j.OllamaAPI;
@@ -135,7 +135,7 @@ You'd then get a response from the model:
> semi-finals. The tournament was
> won by the England cricket team, who defeated New Zealand in the final.
## Try asking for a Database query for your data schema.
## Try asking for a Database query for your data schema
```java
import io.github.ollama4j.OllamaAPI;
@@ -161,6 +161,7 @@ public class Main {
```
_Note: Here I've used
a [sample prompt](https://github.com/ollama4j/ollama4j/blob/main/src/main/resources/sample-db-prompt-template.txt)
containing a database schema from within this library for demonstration purposes._
@@ -172,4 +173,125 @@ SELECT customers.name
FROM sales
JOIN customers ON sales.customer_id = customers.customer_id
GROUP BY customers.name;
```
## Generate structured output
### With response as a `Map`
```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.utils.Utilities;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.types.OllamaModelType;
public class StructuredOutput {
public static void main(String[] args) throws Exception {
String host = "http://localhost:11434/";
OllamaAPI api = new OllamaAPI(host);
String chatModel = "qwen2.5:0.5b";
api.pullModel(chatModel);
String prompt = "Ollama is 22 years old and is busy saving the world. Respond using JSON";
Map<String, Object> format = new HashMap<>();
format.put("type", "object");
format.put("properties", new HashMap<String, Object>() {
{
put("age", new HashMap<String, Object>() {
{
put("type", "integer");
}
});
put("available", new HashMap<String, Object>() {
{
put("type", "boolean");
}
});
}
});
format.put("required", Arrays.asList("age", "available"));
OllamaResult result = api.generate(chatModel, prompt, format);
System.out.println(result);
}
}
```
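To work with the parsed JSON rather than the raw result, the structured response can also be read back as a map. A
small sketch; the `getStructuredResponse()` accessor name is an assumption based on this release's structured-response
additions to `OllamaResult`:
```java
// Assumption: OllamaResult exposes the parsed JSON as a Map in this release
Map<String, Object> structured = result.getStructuredResponse();
System.out.println(structured.get("age"));       // e.g. 22
System.out.println(structured.get("available")); // e.g. true
```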
### With response mapped to specified class type
```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.utils.Utilities;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.types.OllamaModelType;
public class StructuredOutput {
public static void main(String[] args) throws Exception {
String host = Utilities.getFromConfig("host");
OllamaAPI api = new OllamaAPI(host);
int age = 28;
boolean available = false;
String prompt = "Batman is " + age + " years old and is " + (available ? "available" : "not available")
+ " because he is busy saving Gotham City. Respond using JSON";
Map<String, Object> format = new HashMap<>();
format.put("type", "object");
format.put("properties", new HashMap<String, Object>() {
{
put("age", new HashMap<String, Object>() {
{
put("type", "integer");
}
});
put("available", new HashMap<String, Object>() {
{
put("type", "boolean");
}
});
}
});
format.put("required", Arrays.asList("age", "available"));
OllamaResult result = api.generate("qwen2.5:0.5b", prompt, format);
Person person = result.as(Person.class);
System.out.println(person.getAge());
System.out.println(person.getAvailable());
}
}
@Data
@AllArgsConstructor
@NoArgsConstructor
class Person {
private int age;
private boolean available;
}
```

View File

@@ -1,12 +1,12 @@
---
sidebar_position: 4
sidebar_position: 5
---
# Create Model
This API lets you create a custom model on the Ollama server.
### Create a model from an existing Modelfile in the Ollama server
### Create a custom model from an existing model in the Ollama server
```java title="CreateModel.java"
import io.github.ollama4j.OllamaAPI;
@@ -19,144 +19,220 @@ public class CreateModel {
OllamaAPI ollamaAPI = new OllamaAPI(host);
ollamaAPI.createModelWithFilePath("mario", "/path/to/mario/modelfile/on/ollama-server");
}
}
```
### Create a model by passing the contents of Modelfile
```java title="CreateModel.java"
public class CreateModel {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
ollamaAPI.createModelWithModelFileContents("mario", "FROM llama2\nSYSTEM You are mario from Super Mario Bros.");
ollamaAPI.createModel(CustomModelRequest.builder().model("mario").from("llama3.2:latest").system("You are Mario from Super Mario Bros.").build());
}
}
```
Once created, you can see it when you use the [list models](./list-models) API.
### Example of a `Modelfile`
[Read more](https://github.com/ollama/ollama/blob/main/docs/api.md#create-a-model) about custom model creation and the parameters available for model creation.
```
FROM llama2
# sets the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1
# sets the context window size to 4096, this controls how many tokens the LLM can use as context to generate the next token
PARAMETER num_ctx 4096
# sets a custom system message to specify the behavior of the chat assistant
SYSTEM You are Mario from super mario bros, acting as an assistant.
```
### Format of the `Modelfile`
```modelfile
# comment
INSTRUCTION arguments
```
| Instruction | Description |
|-------------------------------------|----------------------------------------------------------------|
| [`FROM`](#from-required) (required) | Defines the base model to use. |
| [`PARAMETER`](#parameter) | Sets the parameters for how Ollama will run the model. |
| [`TEMPLATE`](#template) | The full prompt template to be sent to the model. |
| [`SYSTEM`](#system) | Specifies the system message that will be set in the template. |
| [`ADAPTER`](#adapter) | Defines the (Q)LoRA adapters to apply to the model. |
| [`LICENSE`](#license) | Specifies the legal license. |
#### PARAMETER
The `PARAMETER` instruction defines a parameter that can be set when the model is run.
| Parameter | Description | Value Type | Example Usage |
|----------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------|----------------------|
| mirostat | Enable Mirostat sampling for controlling perplexity. (default: 0, 0 = disabled, 1 = Mirostat, 2 = Mirostat 2.0) | int | mirostat 0 |
| mirostat_eta | Influences how quickly the algorithm responds to feedback from the generated text. A lower learning rate will result in slower adjustments, while a higher learning rate will make the algorithm more responsive. (Default: 0.1) | float | mirostat_eta 0.1 |
| mirostat_tau | Controls the balance between coherence and diversity of the output. A lower value will result in more focused and coherent text. (Default: 5.0) | float | mirostat_tau 5.0 |
| num_ctx | Sets the size of the context window used to generate the next token. (Default: 2048) | int | num_ctx 4096 |
| num_gqa | The number of GQA groups in the transformer layer. Required for some models, for example it is 8 for llama2:70b | int | num_gqa 1 |
| num_gpu | The number of layers to send to the GPU(s). On macOS it defaults to 1 to enable metal support, 0 to disable. | int | num_gpu 50 |
| num_thread | Sets the number of threads to use during computation. By default, Ollama will detect this for optimal performance. It is recommended to set this value to the number of physical CPU cores your system has (as opposed to the logical number of cores). | int | num_thread 8 |
| repeat_last_n | Sets how far back for the model to look back to prevent repetition. (Default: 64, 0 = disabled, -1 = num_ctx) | int | repeat_last_n 64 |
| repeat_penalty | Sets how strongly to penalize repetitions. A higher value (e.g., 1.5) will penalize repetitions more strongly, while a lower value (e.g., 0.9) will be more lenient. (Default: 1.1) | float | repeat_penalty 1.1 |
| temperature | The temperature of the model. Increasing the temperature will make the model answer more creatively. (Default: 0.8) | float | temperature 0.7 |
| seed | Sets the random number seed to use for generation. Setting this to a specific number will make the model generate the same text for the same prompt. (Default: 0) | int | seed 42 |
| stop | Sets the stop sequences to use. When this pattern is encountered the LLM will stop generating text and return. Multiple stop patterns may be set by specifying multiple separate `stop` parameters in a modelfile. | string | stop "AI assistant:" |
| tfs_z | Tail free sampling is used to reduce the impact of less probable tokens from the output. A higher value (e.g., 2.0) will reduce the impact more, while a value of 1.0 disables this setting. (default: 1) | float | tfs_z 1 |
| num_predict | Maximum number of tokens to predict when generating text. (Default: 128, -1 = infinite generation, -2 = fill context) | int | num_predict 42 |
| top_k | Reduces the probability of generating nonsense. A higher value (e.g. 100) will give more diverse answers, while a lower value (e.g. 10) will be more conservative. (Default: 40) | int | top_k 40 |
| top_p | Works together with top-k. A higher value (e.g., 0.95) will lead to more diverse text, while a lower value (e.g., 0.5) will generate more focused and conservative text. (Default: 0.9) | float | top_p 0.9 |
#### TEMPLATE
`TEMPLATE` specifies the full prompt template to be passed into the model. It may optionally include a system message and a
user's prompt. This is used to create a full custom prompt, and syntax may be model-specific. You can usually find the
template for a given model in the readme for that model.
#### Template Variables
| Variable | Description |
|-----------------|---------------------------------------------------------------------------------------------------------------|
| `{{ .System }}` | The system message used to specify custom behavior; this must also be set in the Modelfile as an instruction. |
| `{{ .Prompt }}` | The incoming prompt, this is not specified in the model file and will be set based on input. |
| `{{ .First }}` | A boolean value used to render specific template information for the first generation of a session. |
```modelfile
TEMPLATE """
{{- if .First }}
### System:
{{ .System }}
{{- end }}
### User:
{{ .Prompt }}
### Response:
"""
SYSTEM """<system message>"""
```
#### SYSTEM
The `SYSTEM` instruction specifies the system message to be used in the template, if applicable.
```modelfile
SYSTEM """<system message>"""
```
#### ADAPTER
The `ADAPTER` instruction specifies the LoRA adapter to apply to the base model. The value of this instruction should be
an absolute path or a path relative to the Modelfile, and the file must be in GGML format. The adapter should be
tuned from the base model, otherwise the behaviour is undefined.
```modelfile
ADAPTER ./ollama-lora.bin
```
#### LICENSE
The `LICENSE` instruction allows you to specify the legal license under which the model used with this Modelfile is
shared or distributed.
```modelfile
LICENSE """
<license text>
"""
```
## Notes
- The **`Modelfile` is not case sensitive**. In the examples, uppercase instructions are used to make them easier to
  distinguish from arguments.
- Instructions can be in any order. In the examples, the `FROM` instruction is first to keep it easily readable.
Read more about Modelfile: https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md

View File

@@ -1,5 +1,5 @@
---
sidebar_position: 5
sidebar_position: 6
---
# Delete Model

View File

@@ -1,5 +1,5 @@
---
sidebar_position: 3
sidebar_position: 4
---
# Get Model Details

View File

@@ -0,0 +1,133 @@
---
sidebar_position: 1
---
# Models from Ollama Library
This API retrieves a list of models directly from the Ollama library.
### List Models from Ollama Library
This API fetches available models from the Ollama library page, including details such as the model's name, pull count,
popular tags, tag count, and the last update time.
```java title="ListLibraryModels.java"
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.LibraryModel;
import java.util.List;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
List<LibraryModel> libraryModels = ollamaAPI.listModelsFromLibrary();
System.out.println(libraryModels);
}
}
```
The following is a sample output:
```
[
LibraryModel(name=llama3.2-vision, description=Llama 3.2 Vision is a collection of instruction-tuned image reasoning generative models in 11B and 90B sizes., pullCount=21.1K, totalTags=9, popularTags=[vision, 11b, 90b], lastUpdated=yesterday),
LibraryModel(name=llama3.2, description=Meta's Llama 3.2 goes small with 1B and 3B models., pullCount=2.4M, totalTags=63, popularTags=[tools, 1b, 3b], lastUpdated=6 weeks ago)
]
```
### Get Tags of a Library Model
This API fetches the tags associated with a specific model from the Ollama library.
```java title="GetLibraryModelTags.java"
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.LibraryModel;
import io.github.ollama4j.models.response.LibraryModelDetail;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
List<LibraryModel> libraryModels = ollamaAPI.listModelsFromLibrary();
LibraryModelDetail libraryModelDetail = ollamaAPI.getLibraryModelDetails(libraryModels.get(0));
System.out.println(libraryModelDetail);
}
}
```
The following is a sample output:
```
LibraryModelDetail(
model=LibraryModel(name=llama3.2-vision, description=Llama 3.2 Vision is a collection of instruction-tuned image reasoning generative models in 11B and 90B sizes., pullCount=21.1K, totalTags=9, popularTags=[vision, 11b, 90b], lastUpdated=yesterday),
tags=[
LibraryModelTag(name=llama3.2-vision, tag=latest, size=7.9GB, lastUpdated=yesterday),
LibraryModelTag(name=llama3.2-vision, tag=11b, size=7.9GB, lastUpdated=yesterday),
LibraryModelTag(name=llama3.2-vision, tag=90b, size=55GB, lastUpdated=yesterday)
]
)
```
### Find a model from Ollama library
This API finds a specific model in the Ollama library using the model `name` and `tag`.
```java title="FindLibraryModel.java"
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.LibraryModelTag;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
LibraryModelTag libraryModelTag = ollamaAPI.findModelTagFromLibrary("qwen2.5", "7b");
System.out.println(libraryModelTag);
}
}
```
The following is a sample output:
```
LibraryModelTag(name=qwen2.5, tag=7b, size=4.7GB, lastUpdated=7 weeks ago)
```
### Pull model using `LibraryModelTag`
You can use a `LibraryModelTag` to pull models into the Ollama server.
```java title="PullLibraryModelTags.java"
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.LibraryModelTag;
public class Main {
public static void main(String[] args) {
String host = "http://localhost:11434/";
OllamaAPI ollamaAPI = new OllamaAPI(host);
LibraryModelTag libraryModelTag = ollamaAPI.findModelTagFromLibrary("qwen2.5", "7b");
ollamaAPI.pullModel(libraryModelTag);
}
}
```

View File

@@ -1,10 +1,10 @@
---
sidebar_position: 1
sidebar_position: 2
---
# List Models
# List Local Models
This API lets you list available models on the Ollama server.
This API lets you list downloaded/available models on the Ollama server.
```java title="ListModels.java"
import io.github.ollama4j.OllamaAPI;

View File

@@ -1,5 +1,5 @@
---
sidebar_position: 2
sidebar_position: 3
---
# Pull Model
@@ -23,4 +23,12 @@ public class Main {
}
```
Once downloaded, you can see them when you use [list models](./list-models) API.
Once downloaded, you can see them when you use [list models](./list-models) API.
:::info
You can even pull models using the Ollama model library APIs, which look up models directly on the Ollama model library page. Refer
to [this section](./list-library-models#pull-model-using-librarymodeltag).
:::

View File

@@ -84,6 +84,7 @@ const config = {
position: 'left',
label: 'Docs',
},
{to: 'https://github.com/ollama4j/ollama4j-examples', label: 'Examples', position: 'left'},
{to: 'https://ollama4j.github.io/ollama4j/apidocs/', label: 'Javadoc', position: 'left'},
{to: 'https://ollama4j.github.io/ollama4j/doxygen/html/', label: 'Doxygen', position: 'left'},
{to: '/blog', label: 'Blog', position: 'left'},
@@ -106,6 +107,15 @@ const config = {
},
],
},
{
title: 'Usage',
items: [
{
label: 'Examples',
to: 'https://github.com/ollama4j/ollama4j-examples',
},
],
},
{
title: 'Community',
items: [

docs/package-lock.json (generated, 847 lines changed)

File diff suppressed because it is too large

View File

@@ -6,6 +6,7 @@ import HomepageFeatures from '@site/src/components/HomepageFeatures';
import BuyMeACoffee from '@site/src/components/BuyMeACoffee';
import Heading from '@theme/Heading';
import styles from './index.module.css';
import BrowserOnly from '@docusaurus/BrowserOnly';
function HomepageHeader() {
const {siteConfig} = useDocusaurusContext();
@@ -36,7 +37,9 @@ export default function Home() {
<HomepageHeader/>
<main>
<HomepageFeatures/>
<BuyMeACoffee/>
<BrowserOnly>
{() => <BuyMeACoffee />}
</BrowserOnly>
</main>
</Layout>);
}

pom.xml (101 lines changed)
View File

@@ -13,8 +13,8 @@
<packaging>jar</packaging>
<properties>
<maven.compiler.source>11</maven.compiler.source>
<maven.compiler.target>11</maven.compiler.target>
<maven.compiler.release>11</maven.compiler.release>
<project.build.outputTimestamp>${git.commit.time}</project.build.outputTimestamp><!-- populated via git-commit-id-plugin -->
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven-surefire-plugin.version>3.0.0-M5</maven-surefire-plugin.version>
<maven-failsafe-plugin.version>3.0.0-M5</maven-failsafe-plugin.version>
@@ -49,7 +49,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<version>3.3.0</version>
<version>3.3.1</version>
<executions>
<execution>
<id>attach-sources</id>
@@ -62,13 +62,21 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.5.0</version>
<version>3.11.2</version>
<configuration>
<!-- to disable the "missing" warnings. Remove the doclint to enable warnings-->
<doclint>all,-missing</doclint>
</configuration>
<executions>
<execution>
<id>attach-javadocs</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
@@ -110,7 +118,6 @@
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-gpg-plugin</artifactId>
@@ -126,7 +133,36 @@
</executions>
</plugin>
<plugin>
<groupId>io.github.git-commit-id</groupId>
<artifactId>git-commit-id-maven-plugin</artifactId>
<version>9.0.1</version>
<executions>
<execution>
<goals>
<goal>revision</goal>
</goals>
</execution>
</executions>
<configuration>
<dateFormat>yyyy-MM-dd'T'HH:mm:ss'Z'</dateFormat>
<dateFormatTimeZone>Etc/UTC</dateFormatTimeZone>
</configuration>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.14.0</version>
</plugin>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<version>3.4.2</version>
</plugin>
</plugins>
</pluginManagement>
</build>
<dependencies>
@@ -136,6 +172,11 @@
<version>${lombok.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.jsoup</groupId>
<artifactId>jsoup</artifactId>
<version>1.18.1</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
@@ -175,6 +216,13 @@
<version>20240205</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>ollama</artifactId>
<version>1.20.2</version>
<scope>test</scope>
</dependency>
</dependencies>
<distributionManagement>
@@ -216,6 +264,7 @@
<test.env>unit</test.env>
<skipUnitTests>false</skipUnitTests>
<skipIntegrationTests>true</skipIntegrationTests>
<skipGpgPluginDuringTests>true</skipGpgPluginDuringTests>
</properties>
<activation>
<activeByDefault>false</activeByDefault>
@@ -241,6 +290,23 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-gpg-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<id>sign-artifacts</id>
<phase>verify</phase>
<goals>
<goal>sign</goal>
</goals>
<configuration>
<skip>${skipGpgPluginDuringTests}</skip>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
@@ -250,7 +316,29 @@
<test.env>integration</test.env>
<skipUnitTests>true</skipUnitTests>
<skipIntegrationTests>false</skipIntegrationTests>
<skipGpgPluginDuringTests>true</skipGpgPluginDuringTests>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-gpg-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<id>sign-artifacts</id>
<phase>verify</phase>
<goals>
<goal>sign</goal>
</goals>
<configuration>
<skip>${skipGpgPluginDuringTests}</skip>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>ci-cd</id>
@@ -294,7 +382,6 @@
<autoReleaseAfterClose>true</autoReleaseAfterClose>
</configuration>
</plugin>
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
@@ -319,4 +406,4 @@
</profile>
</profiles>
</project>
</project>

File diff suppressed because it is too large

View File

@@ -0,0 +1,8 @@
package io.github.ollama4j.exceptions;
public class RoleNotFoundException extends Exception {
public RoleNotFoundException(String s) {
super(s);
}
}

View File

@@ -2,6 +2,10 @@ package io.github.ollama4j.exceptions;
public class ToolInvocationException extends Exception {
public ToolInvocationException(String s) {
super(s);
}
public ToolInvocationException(String s, Exception e) {
super(s, e);
}

View File

@@ -2,12 +2,14 @@ package io.github.ollama4j.models.chat;
import static io.github.ollama4j.utils.Utils.getObjectMapper;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import io.github.ollama4j.utils.FileToBase64Serializer;
import java.util.List;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@@ -31,15 +33,17 @@ public class OllamaChatMessage {
@NonNull
private String content;
private @JsonProperty("tool_calls") List<OllamaChatToolCalls> toolCalls;
@JsonSerialize(using = FileToBase64Serializer.class)
private List<byte[]> images;
@Override
public String toString() {
try {
return getObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(this);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
@Override
public String toString() {
try {
return getObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(this);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
}
}

View File

@@ -1,19 +1,53 @@
package io.github.ollama4j.models.chat;
import com.fasterxml.jackson.annotation.JsonValue;
import io.github.ollama4j.exceptions.RoleNotFoundException;
import lombok.Getter;
import java.util.ArrayList;
import java.util.List;
/**
* Defines the possible Chat Message roles.
*/
public enum OllamaChatMessageRole {
SYSTEM("system"),
USER("user"),
ASSISTANT("assistant");
@Getter
public class OllamaChatMessageRole {
private static final List<OllamaChatMessageRole> roles = new ArrayList<>();
public static final OllamaChatMessageRole SYSTEM = new OllamaChatMessageRole("system");
public static final OllamaChatMessageRole USER = new OllamaChatMessageRole("user");
public static final OllamaChatMessageRole ASSISTANT = new OllamaChatMessageRole("assistant");
public static final OllamaChatMessageRole TOOL = new OllamaChatMessageRole("tool");
@JsonValue
private String roleName;
private final String roleName;
private OllamaChatMessageRole(String roleName){
private OllamaChatMessageRole(String roleName) {
this.roleName = roleName;
roles.add(this);
}
public static OllamaChatMessageRole newCustomRole(String roleName) {
// OllamaChatMessageRole customRole = new OllamaChatMessageRole(roleName);
// roles.add(customRole);
return new OllamaChatMessageRole(roleName);
}
public static List<OllamaChatMessageRole> getRoles() {
return new ArrayList<>(roles);
}
public static OllamaChatMessageRole getRole(String roleName) throws RoleNotFoundException {
for (OllamaChatMessageRole role : roles) {
if (role.roleName.equals(roleName)) {
return role;
}
}
throw new RoleNotFoundException("Invalid role name: " + roleName);
}
@Override
public String toString() {
return roleName;
}
}
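
Since `OllamaChatMessageRole` is now a class rather than an enum, roles are no longer a closed set. A minimal usage sketch under that assumption (the custom role name is purely illustrative):

```java
// Built-in roles remain available as constants:
OllamaChatMessageRole user = OllamaChatMessageRole.USER;

// Look up a registered role by name; throws RoleNotFoundException for unknown names:
OllamaChatMessageRole assistant = OllamaChatMessageRole.getRole("assistant");

// Define a custom role, e.g. for models whose templates use non-standard roles:
OllamaChatMessageRole critic = OllamaChatMessageRole.newCustomRole("critic");
```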

View File

@@ -3,6 +3,7 @@ package io.github.ollama4j.models.chat;
import java.util.List;
import io.github.ollama4j.models.request.OllamaCommonRequest;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.utils.OllamaRequestBody;
import lombok.Getter;
@@ -21,6 +22,8 @@ public class OllamaChatRequest extends OllamaCommonRequest implements OllamaRequ
private List<OllamaChatMessage> messages;
private List<Tools.PromptFuncDefinition> tools;
public OllamaChatRequest() {}
public OllamaChatRequest(String model, List<OllamaChatMessage> messages) {

View File

@@ -10,6 +10,7 @@ import java.io.IOException;
import java.net.URISyntaxException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
@@ -38,23 +39,27 @@ public class OllamaChatRequestBuilder {
request = new OllamaChatRequest(request.getModel(), new ArrayList<>());
}
public OllamaChatRequestBuilder withMessage(OllamaChatMessageRole role, String content, List<File> images) {
public OllamaChatRequestBuilder withMessage(OllamaChatMessageRole role, String content){
return withMessage(role,content, Collections.emptyList());
}
public OllamaChatRequestBuilder withMessage(OllamaChatMessageRole role, String content, List<OllamaChatToolCalls> toolCalls,List<File> images) {
List<OllamaChatMessage> messages = this.request.getMessages();
List<byte[]> binaryImages = images.stream().map(file -> {
try {
return Files.readAllBytes(file.toPath());
} catch (IOException e) {
LOG.warn(String.format("File '%s' could not be accessed, will not add to message!", file.toPath()), e);
LOG.warn("File '{}' could not be accessed, will not add to message!", file.toPath(), e);
return new byte[0];
}
}).collect(Collectors.toList());
messages.add(new OllamaChatMessage(role, content, binaryImages));
messages.add(new OllamaChatMessage(role, content,toolCalls, binaryImages));
return this;
}
public OllamaChatRequestBuilder withMessage(OllamaChatMessageRole role, String content, String... imageUrls) {
public OllamaChatRequestBuilder withMessage(OllamaChatMessageRole role, String content,List<OllamaChatToolCalls> toolCalls, String... imageUrls) {
List<OllamaChatMessage> messages = this.request.getMessages();
List<byte[]> binaryImages = null;
if (imageUrls.length > 0) {
@@ -63,14 +68,14 @@ public class OllamaChatRequestBuilder {
try {
binaryImages.add(Utils.loadImageBytesFromUrl(imageUrl));
} catch (URISyntaxException e) {
LOG.warn(String.format("URL '%s' could not be accessed, will not add to message!", imageUrl), e);
LOG.warn("URL '{}' could not be accessed, will not add to message!", imageUrl, e);
} catch (IOException e) {
LOG.warn(String.format("Content of URL '%s' could not be read, will not add to message!", imageUrl), e);
LOG.warn("Content of URL '{}' could not be read, will not add to message!", imageUrl, e);
}
}
}
messages.add(new OllamaChatMessage(role, content, binaryImages));
messages.add(new OllamaChatMessage(role, content,toolCalls, binaryImages));
return this;
}

View File

@@ -2,31 +2,53 @@ package io.github.ollama4j.models.chat;
import java.util.List;
import io.github.ollama4j.models.response.OllamaResult;
import com.fasterxml.jackson.core.JsonProcessingException;
import lombok.Getter;
import static io.github.ollama4j.utils.Utils.getObjectMapper;
/**
* Specific chat-API result that contains the chat history sent to the model and appends the answer as {@link OllamaChatResult} given by the
* {@link OllamaChatMessageRole#ASSISTANT} role.
*/
public class OllamaChatResult extends OllamaResult{
@Getter
public class OllamaChatResult {
private List<OllamaChatMessage> chatHistory;
private final List<OllamaChatMessage> chatHistory;
public OllamaChatResult(String response, long responseTime, int httpStatusCode,
List<OllamaChatMessage> chatHistory) {
super(response, responseTime, httpStatusCode);
private final OllamaChatResponseModel responseModel;
public OllamaChatResult(OllamaChatResponseModel responseModel, List<OllamaChatMessage> chatHistory) {
this.chatHistory = chatHistory;
appendAnswerToChatHistory(response);
this.responseModel = responseModel;
appendAnswerToChatHistory(responseModel);
}
public List<OllamaChatMessage> getChatHistory() {
return chatHistory;
}
private void appendAnswerToChatHistory(String answer){
OllamaChatMessage assistantMessage = new OllamaChatMessage(OllamaChatMessageRole.ASSISTANT, answer);
this.chatHistory.add(assistantMessage);
private void appendAnswerToChatHistory(OllamaChatResponseModel response) {
this.chatHistory.add(response.getMessage());
}
@Override
public String toString() {
try {
return getObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(this);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
@Deprecated
public String getResponse(){
return responseModel != null ? responseModel.getMessage().getContent() : "";
}
@Deprecated
public int getHttpStatusCode(){
return 200;
}
@Deprecated
public long getResponseTime(){
return responseModel != null ? responseModel.getTotalDuration() : 0L;
}
}
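
A short sketch of how the reworked result might be consumed, assuming `chatResult` was returned by a chat call; the deprecated accessors keep old callers compiling:

```java
// Full conversation, including the assistant's appended answer:
List<OllamaChatMessage> history = chatResult.getChatHistory();

// Preferred: read the answer via the underlying response model:
String answer = chatResult.getResponseModel().getMessage().getContent();

// Deprecated but still available for backward compatibility:
String legacyAnswer = chatResult.getResponse();
```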

View File

@@ -1,31 +1,19 @@
package io.github.ollama4j.models.chat;
import io.github.ollama4j.models.generate.OllamaStreamHandler;
import io.github.ollama4j.models.generate.OllamaTokenHandler;
import lombok.RequiredArgsConstructor;
import java.util.ArrayList;
import java.util.List;
public class OllamaChatStreamObserver {
private OllamaStreamHandler streamHandler;
private List<OllamaChatResponseModel> responseParts = new ArrayList<>();
@RequiredArgsConstructor
public class OllamaChatStreamObserver implements OllamaTokenHandler {
private final OllamaStreamHandler streamHandler;
private String message = "";
public OllamaChatStreamObserver(OllamaStreamHandler streamHandler) {
this.streamHandler = streamHandler;
@Override
public void accept(OllamaChatResponseModel token) {
if (streamHandler != null) {
message += token.getMessage().getContent();
streamHandler.accept(message);
}
}
public void notify(OllamaChatResponseModel currentResponsePart) {
responseParts.add(currentResponsePart);
handleCurrentResponsePart(currentResponsePart);
}
protected void handleCurrentResponsePart(OllamaChatResponseModel currentResponsePart) {
message = message + currentResponsePart.getMessage().getContent();
streamHandler.accept(message);
}
}

View File

@@ -0,0 +1,16 @@
package io.github.ollama4j.models.chat;
import io.github.ollama4j.tools.OllamaToolCallsFunction;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
@AllArgsConstructor
public class OllamaChatToolCalls {
private OllamaToolCallsFunction function;
}

View File

@@ -0,0 +1,40 @@
package io.github.ollama4j.models.embeddings;
import io.github.ollama4j.utils.Options;
import java.util.List;
/**
* Builder class to easily create Requests for Embedding models using ollama.
*/
public class OllamaEmbedRequestBuilder {
private final OllamaEmbedRequestModel request;
private OllamaEmbedRequestBuilder(String model, List<String> input) {
this.request = new OllamaEmbedRequestModel(model,input);
}
public static OllamaEmbedRequestBuilder getInstance(String model, String... input){
return new OllamaEmbedRequestBuilder(model, List.of(input));
}
public OllamaEmbedRequestBuilder withOptions(Options options){
this.request.setOptions(options.getOptionsMap());
return this;
}
public OllamaEmbedRequestBuilder withKeepAlive(String keepAlive){
this.request.setKeepAlive(keepAlive);
return this;
}
public OllamaEmbedRequestBuilder withoutTruncate(){
this.request.setTruncate(false);
return this;
}
public OllamaEmbedRequestModel build() {
return this.request;
}
}
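
A minimal sketch of building an embed request with the new builder; the model name and inputs are illustrative:

```java
OllamaEmbedRequestModel request = OllamaEmbedRequestBuilder
        .getInstance("nomic-embed-text", "Why is the sky blue?", "Why is grass green?")
        .withKeepAlive("5m")   // keep the model loaded for five minutes after the call
        .withoutTruncate()     // disable truncation of over-long inputs
        .build();
```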

View File

@@ -0,0 +1,41 @@
package io.github.ollama4j.models.embeddings;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.core.JsonProcessingException;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.List;
import java.util.Map;
import static io.github.ollama4j.utils.Utils.getObjectMapper;
@Data
@RequiredArgsConstructor
@NoArgsConstructor
public class OllamaEmbedRequestModel {
@NonNull
private String model;
@NonNull
private List<String> input;
private Map<String, Object> options;
@JsonProperty(value = "keep_alive")
private String keepAlive;
@JsonProperty(value = "truncate")
private Boolean truncate = true;
@Override
public String toString() {
try {
return getObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(this);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
}

View File

@@ -0,0 +1,25 @@
package io.github.ollama4j.models.embeddings;
import com.fasterxml.jackson.annotation.JsonProperty;
import lombok.Data;
import java.util.List;
@SuppressWarnings("unused")
@Data
public class OllamaEmbedResponseModel {
@JsonProperty("model")
private String model;
@JsonProperty("embeddings")
private List<List<Double>> embeddings;
@JsonProperty("total_duration")
private long totalDuration;
@JsonProperty("load_duration")
private long loadDuration;
@JsonProperty("prompt_eval_count")
private int promptEvalCount;
}

View File

@@ -7,6 +7,7 @@ import lombok.Data;
@SuppressWarnings("unused")
@Data
@Deprecated(since="1.0.90")
public class OllamaEmbeddingResponseModel {
@JsonProperty("embedding")
private List<Double> embedding;

View File

@@ -2,6 +2,7 @@ package io.github.ollama4j.models.embeddings;
import io.github.ollama4j.utils.Options;
@Deprecated(since="1.0.90")
public class OllamaEmbeddingsRequestBuilder {
private OllamaEmbeddingsRequestBuilder(String model, String prompt){

View File

@@ -12,6 +12,7 @@ import lombok.RequiredArgsConstructor;
@Data
@RequiredArgsConstructor
@NoArgsConstructor
@Deprecated(since="1.0.90")
public class OllamaEmbeddingsRequestModel {
@NonNull
private String model;

View File

@@ -0,0 +1,8 @@
package io.github.ollama4j.models.generate;
import io.github.ollama4j.models.chat.OllamaChatResponseModel;
import java.util.function.Consumer;
public interface OllamaTokenHandler extends Consumer<OllamaChatResponseModel> {
}
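
Because `OllamaTokenHandler` is just a `Consumer<OllamaChatResponseModel>`, a lambda suffices. A sketch that prints each streamed token; the null check mirrors the empty-message guard used by the chat endpoint caller:

```java
OllamaTokenHandler handler = token -> {
    // Under heavy load Ollama may stream parts without a message, so guard against null:
    if (token.getMessage() != null) {
        System.out.print(token.getMessage().getContent());
    }
};
```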

View File

@@ -0,0 +1,10 @@
package io.github.ollama4j.models.request;
public abstract class Auth {
/**
* Get authentication header value.
*
* @return authentication header value
*/
public abstract String getAuthHeaderValue();
}

View File

@@ -1,13 +1,24 @@
package io.github.ollama4j.models.request;
import java.util.Base64;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
@AllArgsConstructor
public class BasicAuth {
public class BasicAuth extends Auth {
private String username;
private String password;
/**
* Get basic authentication header value.
*
* @return basic authentication header value (encoded credentials)
*/
public String getAuthHeaderValue() {
final String credentialsToEncode = this.getUsername() + ":" + this.getPassword();
return "Basic " + Base64.getEncoder().encodeToString(credentialsToEncode.getBytes());
}
}

View File

@@ -0,0 +1,19 @@
package io.github.ollama4j.models.request;
import lombok.AllArgsConstructor;
import lombok.Data;
@Data
@AllArgsConstructor
public class BearerAuth extends Auth {
private String bearerToken;
/**
* Get authentication header value.
*
* @return authentication header value with bearer token
*/
public String getAuthHeaderValue() {
return "Bearer "+ bearerToken;
}
}
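
Both implementations produce a ready-to-use `Authorization` header value; a small sketch with placeholder credentials:

```java
Auth basic = new BasicAuth("admin", "s3cret");
basic.getAuthHeaderValue();  // "Basic YWRtaW46czNjcmV0"

Auth bearer = new BearerAuth("my-api-token");
bearer.getAuthHeaderValue(); // "Bearer my-api-token"
```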

View File

@@ -0,0 +1,45 @@
package io.github.ollama4j.models.request;
import static io.github.ollama4j.utils.Utils.getObjectMapper;
import com.fasterxml.jackson.core.JsonProcessingException;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.Builder;
import java.util.List;
import java.util.Map;
@Data
@AllArgsConstructor
@Builder
public class CustomModelRequest {
private String model;
private String from;
private Map<String, String> files;
private Map<String, String> adapters;
private String template;
private Object license; // Using Object to handle both String and List<String>
private String system;
private Map<String, Object> parameters;
private List<Object> messages;
private Boolean stream;
private Boolean quantize;
public CustomModelRequest() {
this.stream = true;
this.quantize = false;
}
@Override
public String toString() {
try {
return getObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(this);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
}
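
With Lombok's `@Builder`, a request can be assembled fluently; a sketch with illustrative values:

```java
CustomModelRequest request = CustomModelRequest.builder()
        .model("mario")                                  // name of the model to create
        .from("llama3.2")                                // base model to derive from
        .system("You are Mario from Super Mario Bros.")  // system prompt baked into the model
        .build();
```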

View File

@@ -1,17 +1,25 @@
package io.github.ollama4j.models.request;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import io.github.ollama4j.exceptions.OllamaBaseException;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.models.chat.OllamaChatResponseModel;
import io.github.ollama4j.models.chat.OllamaChatStreamObserver;
import io.github.ollama4j.models.generate.OllamaStreamHandler;
import io.github.ollama4j.utils.OllamaRequestBody;
import io.github.ollama4j.models.chat.*;
import io.github.ollama4j.models.generate.OllamaTokenHandler;
import io.github.ollama4j.models.response.OllamaErrorResponse;
import io.github.ollama4j.utils.Utils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.List;
/**
* Specialization class for requests
@@ -20,10 +28,10 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
private static final Logger LOG = LoggerFactory.getLogger(OllamaChatEndpointCaller.class);
private OllamaChatStreamObserver streamObserver;
private OllamaTokenHandler tokenHandler;
public OllamaChatEndpointCaller(String host, BasicAuth basicAuth, long requestTimeoutSeconds, boolean verbose) {
super(host, basicAuth, requestTimeoutSeconds, verbose);
public OllamaChatEndpointCaller(String host, Auth auth, long requestTimeoutSeconds, boolean verbose) {
super(host, auth, requestTimeoutSeconds, verbose);
}
@Override
@@ -31,13 +39,29 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
return "/api/chat";
}
/**
* Parses a streamed response line from the Ollama chat endpoint.
* {@link com.fasterxml.jackson.databind.ObjectMapper#readValue(String, TypeReference)} throws an
* {@link IllegalArgumentException} for a null line and a {@link com.fasterxml.jackson.core.JsonParseException}
* if the JSON object cannot be parsed into an {@link OllamaChatResponseModel}. Thus, the response model should
* never be null.
*
* @param line streamed line of the Ollama stream response
* @param responseBuffer StringBuilder to append the latest response message part to
* @return true if the Ollama response has reached the 'done' state
*/
@Override
protected boolean parseResponseAndAddToBuffer(String line, StringBuilder responseBuffer) {
try {
OllamaChatResponseModel ollamaResponseModel = Utils.getObjectMapper().readValue(line, OllamaChatResponseModel.class);
responseBuffer.append(ollamaResponseModel.getMessage().getContent());
if (streamObserver != null) {
streamObserver.notify(ollamaResponseModel);
// it seems that under heavy load ollama responds with an empty chat message part in the streamed response
// thus, we null check the message and hope that the next streamed response has some message content again
OllamaChatMessage message = ollamaResponseModel.getMessage();
if(message != null) {
responseBuffer.append(message.getContent());
if (tokenHandler != null) {
tokenHandler.accept(ollamaResponseModel);
}
}
return ollamaResponseModel.isDone();
} catch (JsonProcessingException e) {
@@ -46,9 +70,75 @@ public class OllamaChatEndpointCaller extends OllamaEndpointCaller {
}
}
public OllamaResult call(OllamaRequestBody body, OllamaStreamHandler streamHandler)
public OllamaChatResult call(OllamaChatRequest body, OllamaTokenHandler tokenHandler)
throws OllamaBaseException, IOException, InterruptedException {
streamObserver = new OllamaChatStreamObserver(streamHandler);
return super.callSync(body);
this.tokenHandler = tokenHandler;
return callSync(body);
}
public OllamaChatResult callSync(OllamaChatRequest body) throws OllamaBaseException, IOException, InterruptedException {
// Create Request
HttpClient httpClient = HttpClient.newHttpClient();
URI uri = URI.create(getHost() + getEndpointSuffix());
HttpRequest.Builder requestBuilder =
getRequestBuilderDefault(uri)
.POST(
body.getBodyPublisher());
HttpRequest request = requestBuilder.build();
if (isVerbose()) LOG.info("Asking model: " + body);
HttpResponse<InputStream> response =
httpClient.send(request, HttpResponse.BodyHandlers.ofInputStream());
int statusCode = response.statusCode();
InputStream responseBodyStream = response.body();
StringBuilder responseBuffer = new StringBuilder();
OllamaChatResponseModel ollamaChatResponseModel = null;
List<OllamaChatToolCalls> wantedToolsForStream = null;
try (BufferedReader reader =
new BufferedReader(new InputStreamReader(responseBodyStream, StandardCharsets.UTF_8))) {
String line;
while ((line = reader.readLine()) != null) {
if (statusCode == 404) {
LOG.warn("Status code: 404 (Not Found)");
OllamaErrorResponse ollamaResponseModel =
Utils.getObjectMapper().readValue(line, OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else if (statusCode == 401) {
LOG.warn("Status code: 401 (Unauthorized)");
OllamaErrorResponse ollamaResponseModel =
Utils.getObjectMapper()
.readValue("{\"error\":\"Unauthorized\"}", OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else if (statusCode == 400) {
LOG.warn("Status code: 400 (Bad Request)");
OllamaErrorResponse ollamaResponseModel = Utils.getObjectMapper().readValue(line,
OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else {
boolean finished = parseResponseAndAddToBuffer(line, responseBuffer);
ollamaChatResponseModel = Utils.getObjectMapper().readValue(line, OllamaChatResponseModel.class);
if(body.stream && ollamaChatResponseModel.getMessage().getToolCalls() != null){
wantedToolsForStream = ollamaChatResponseModel.getMessage().getToolCalls();
}
if (finished && body.stream) {
ollamaChatResponseModel.getMessage().setContent(responseBuffer.toString());
break;
}
}
}
}
if (statusCode != 200) {
LOG.error("Status code " + statusCode);
throw new OllamaBaseException(responseBuffer.toString());
} else {
if(wantedToolsForStream != null) {
ollamaChatResponseModel.getMessage().setToolCalls(wantedToolsForStream);
}
OllamaChatResult ollamaResult =
new OllamaChatResult(ollamaChatResponseModel,body.getMessages());
if (isVerbose()) LOG.info("Model response: " + ollamaResult);
return ollamaResult;
}
}
}

View File

@@ -1,41 +1,31 @@
package io.github.ollama4j.models.request;
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.exceptions.OllamaBaseException;
import io.github.ollama4j.models.response.OllamaErrorResponse;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OllamaRequestBody;
import io.github.ollama4j.utils.Utils;
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.Base64;
import io.github.ollama4j.OllamaAPI;
import lombok.Getter;
/**
* Abstract helper class to call the Ollama API server.
*/
@Getter
public abstract class OllamaEndpointCaller {
private static final Logger LOG = LoggerFactory.getLogger(OllamaAPI.class);
private String host;
private BasicAuth basicAuth;
private long requestTimeoutSeconds;
private boolean verbose;
private final String host;
private final Auth auth;
private final long requestTimeoutSeconds;
private final boolean verbose;
public OllamaEndpointCaller(String host, BasicAuth basicAuth, long requestTimeoutSeconds, boolean verbose) {
public OllamaEndpointCaller(String host, Auth auth, long requestTimeoutSeconds, boolean verbose) {
this.host = host;
this.basicAuth = basicAuth;
this.auth = auth;
this.requestTimeoutSeconds = requestTimeoutSeconds;
this.verbose = verbose;
}
@@ -45,107 +35,30 @@ public abstract class OllamaEndpointCaller {
protected abstract boolean parseResponseAndAddToBuffer(String line, StringBuilder responseBuffer);
/**
* Calls the API server on the given host and endpoint suffix synchronously, i.e., waiting for the response.
*
* @param body POST body payload
* @return result answer given by the assistant
* @throws OllamaBaseException if any response code other than 200 has been returned
* @throws IOException in case the response stream cannot be read
* @throws InterruptedException in case the server is not reachable or network issues happen
*/
public OllamaResult callSync(OllamaRequestBody body) throws OllamaBaseException, IOException, InterruptedException {
// Create Request
long startTime = System.currentTimeMillis();
HttpClient httpClient = HttpClient.newHttpClient();
URI uri = URI.create(this.host + getEndpointSuffix());
HttpRequest.Builder requestBuilder =
getRequestBuilderDefault(uri)
.POST(
body.getBodyPublisher());
HttpRequest request = requestBuilder.build();
if (this.verbose) LOG.info("Asking model: " + body.toString());
HttpResponse<InputStream> response =
httpClient.send(request, HttpResponse.BodyHandlers.ofInputStream());
int statusCode = response.statusCode();
InputStream responseBodyStream = response.body();
StringBuilder responseBuffer = new StringBuilder();
try (BufferedReader reader =
new BufferedReader(new InputStreamReader(responseBodyStream, StandardCharsets.UTF_8))) {
String line;
while ((line = reader.readLine()) != null) {
if (statusCode == 404) {
LOG.warn("Status code: 404 (Not Found)");
OllamaErrorResponse ollamaResponseModel =
Utils.getObjectMapper().readValue(line, OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else if (statusCode == 401) {
LOG.warn("Status code: 401 (Unauthorized)");
OllamaErrorResponse ollamaResponseModel =
Utils.getObjectMapper()
.readValue("{\"error\":\"Unauthorized\"}", OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else if (statusCode == 400) {
LOG.warn("Status code: 400 (Bad Request)");
OllamaErrorResponse ollamaResponseModel = Utils.getObjectMapper().readValue(line,
OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else {
boolean finished = parseResponseAndAddToBuffer(line, responseBuffer);
if (finished) {
break;
}
}
}
}
if (statusCode != 200) {
LOG.error("Status code " + statusCode);
throw new OllamaBaseException(responseBuffer.toString());
} else {
long endTime = System.currentTimeMillis();
OllamaResult ollamaResult =
new OllamaResult(responseBuffer.toString().trim(), endTime - startTime, statusCode);
if (verbose) LOG.info("Model response: " + ollamaResult);
return ollamaResult;
}
}
/**
* Get default request builder.
*
* @param uri URI to get a HttpRequest.Builder
* @return HttpRequest.Builder
*/
private HttpRequest.Builder getRequestBuilderDefault(URI uri) {
protected HttpRequest.Builder getRequestBuilderDefault(URI uri) {
HttpRequest.Builder requestBuilder =
HttpRequest.newBuilder(uri)
.header("Content-Type", "application/json")
.timeout(Duration.ofSeconds(this.requestTimeoutSeconds));
if (isBasicAuthCredentialsSet()) {
requestBuilder.header("Authorization", getBasicAuthHeaderValue());
if (isAuthCredentialsSet()) {
requestBuilder.header("Authorization", this.auth.getAuthHeaderValue());
}
return requestBuilder;
}
/**
* Get basic authentication header value.
* Check if Auth credentials set.
*
* @return basic authentication header value (encoded credentials)
* @return true when Auth credentials set
*/
private String getBasicAuthHeaderValue() {
String credentialsToEncode = this.basicAuth.getUsername() + ":" + this.basicAuth.getPassword();
return "Basic " + Base64.getEncoder().encodeToString(credentialsToEncode.getBytes());
}
/**
* Check if Basic Auth credentials set.
*
* @return true when Basic Auth credentials set
*/
private boolean isBasicAuthCredentialsSet() {
return this.basicAuth != null;
protected boolean isAuthCredentialsSet() {
return this.auth != null;
}
}

View File

@@ -2,6 +2,7 @@ package io.github.ollama4j.models.request;
import com.fasterxml.jackson.core.JsonProcessingException;
import io.github.ollama4j.exceptions.OllamaBaseException;
import io.github.ollama4j.models.response.OllamaErrorResponse;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.models.generate.OllamaGenerateResponseModel;
import io.github.ollama4j.models.generate.OllamaGenerateStreamObserver;
@@ -11,7 +12,15 @@ import io.github.ollama4j.utils.Utils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
public class OllamaGenerateEndpointCaller extends OllamaEndpointCaller {
@@ -19,7 +28,7 @@ public class OllamaGenerateEndpointCaller extends OllamaEndpointCaller {
private OllamaGenerateStreamObserver streamObserver;
public OllamaGenerateEndpointCaller(String host, BasicAuth basicAuth, long requestTimeoutSeconds, boolean verbose) {
public OllamaGenerateEndpointCaller(String host, Auth basicAuth, long requestTimeoutSeconds, boolean verbose) {
super(host, basicAuth, requestTimeoutSeconds, verbose);
}
@@ -46,6 +55,73 @@ public class OllamaGenerateEndpointCaller extends OllamaEndpointCaller {
public OllamaResult call(OllamaRequestBody body, OllamaStreamHandler streamHandler)
throws OllamaBaseException, IOException, InterruptedException {
streamObserver = new OllamaGenerateStreamObserver(streamHandler);
return super.callSync(body);
return callSync(body);
}
/**
* Calls the API server on the given host and endpoint suffix synchronously, i.e., waiting for the response.
*
* @param body POST body payload
* @return result answer given by the assistant
* @throws OllamaBaseException if any response code other than 200 has been returned
* @throws IOException in case the response stream cannot be read
* @throws InterruptedException in case the server is not reachable or network issues happen
*/
public OllamaResult callSync(OllamaRequestBody body) throws OllamaBaseException, IOException, InterruptedException {
// Create Request
long startTime = System.currentTimeMillis();
HttpClient httpClient = HttpClient.newHttpClient();
URI uri = URI.create(getHost() + getEndpointSuffix());
HttpRequest.Builder requestBuilder =
getRequestBuilderDefault(uri)
.POST(
body.getBodyPublisher());
HttpRequest request = requestBuilder.build();
if (isVerbose()) LOG.info("Asking model: " + body.toString());
HttpResponse<InputStream> response =
httpClient.send(request, HttpResponse.BodyHandlers.ofInputStream());
int statusCode = response.statusCode();
InputStream responseBodyStream = response.body();
StringBuilder responseBuffer = new StringBuilder();
try (BufferedReader reader =
new BufferedReader(new InputStreamReader(responseBodyStream, StandardCharsets.UTF_8))) {
String line;
while ((line = reader.readLine()) != null) {
if (statusCode == 404) {
LOG.warn("Status code: 404 (Not Found)");
OllamaErrorResponse ollamaResponseModel =
Utils.getObjectMapper().readValue(line, OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else if (statusCode == 401) {
LOG.warn("Status code: 401 (Unauthorized)");
OllamaErrorResponse ollamaResponseModel =
Utils.getObjectMapper()
.readValue("{\"error\":\"Unauthorized\"}", OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else if (statusCode == 400) {
LOG.warn("Status code: 400 (Bad Request)");
OllamaErrorResponse ollamaResponseModel = Utils.getObjectMapper().readValue(line,
OllamaErrorResponse.class);
responseBuffer.append(ollamaResponseModel.getError());
} else {
boolean finished = parseResponseAndAddToBuffer(line, responseBuffer);
if (finished) {
break;
}
}
}
}
if (statusCode != 200) {
LOG.error("Status code " + statusCode);
throw new OllamaBaseException(responseBuffer.toString());
} else {
long endTime = System.currentTimeMillis();
OllamaResult ollamaResult =
new OllamaResult(responseBuffer.toString().trim(), endTime - startTime, statusCode);
if (isVerbose()) LOG.info("Model response: " + ollamaResult);
return ollamaResult;
}
}
}

View File

@@ -0,0 +1,16 @@
package io.github.ollama4j.models.response;
import java.util.ArrayList;
import java.util.List;
import lombok.Data;
@Data
public class LibraryModel {
private String name;
private String description;
private String pullCount;
private int totalTags;
private List<String> popularTags = new ArrayList<>();
private String lastUpdated;
}

View File

@@ -0,0 +1,12 @@
package io.github.ollama4j.models.response;
import lombok.Data;
import java.util.List;
@Data
public class LibraryModelDetail {
private LibraryModel model;
private List<LibraryModelTag> tags;
}

View File

@@ -0,0 +1,13 @@
package io.github.ollama4j.models.response;
import lombok.Data;
import java.util.List;
@Data
public class LibraryModelTag {
private String name;
private String tag;
private String size;
private String lastUpdated;
}

View File

@@ -2,12 +2,14 @@ package io.github.ollama4j.models.response;
import java.time.OffsetDateTime;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.core.JsonProcessingException;
import io.github.ollama4j.utils.Utils;
import lombok.Data;
@Data
@JsonIgnoreProperties(ignoreUnknown = true)
public class Model {
private String name;

View File

@@ -1,19 +1,26 @@
package io.github.ollama4j.models.response;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import lombok.Data;
import lombok.Getter;
import static io.github.ollama4j.utils.Utils.getObjectMapper;
import com.fasterxml.jackson.core.JsonProcessingException;
import lombok.Data;
import lombok.Getter;
import java.util.HashMap;
import java.util.Map;
/** The type Ollama result. */
@Getter
@SuppressWarnings("unused")
@Data
@JsonIgnoreProperties(ignoreUnknown = true)
public class OllamaResult {
/**
* -- GETTER --
* Get the completion/response text
* Get the completion/response text
*
* @return String completion/response text
*/
@@ -21,7 +28,7 @@ public class OllamaResult {
/**
* -- GETTER --
* Get the response status code.
* Get the response status code.
*
* @return int - response status code
*/
@@ -29,7 +36,7 @@ public class OllamaResult {
/**
* -- GETTER --
* Get the response time in milliseconds.
* Get the response time in milliseconds.
*
* @return long - response time in milliseconds
*/
@@ -44,9 +51,68 @@ public class OllamaResult {
@Override
public String toString() {
try {
return getObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(this);
Map<String, Object> responseMap = new HashMap<>();
responseMap.put("response", this.response);
responseMap.put("httpStatusCode", this.httpStatusCode);
responseMap.put("responseTime", this.responseTime);
return getObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(responseMap);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
/**
* Get the structured response if the response is a JSON object.
*
* @return Map - structured response
* @throws IllegalArgumentException if the response is not a valid JSON object
*/
public Map<String, Object> getStructuredResponse() {
String responseStr = this.getResponse();
if (responseStr == null || responseStr.trim().isEmpty()) {
throw new IllegalArgumentException("Response is empty or null");
}
try {
// Check if the response is a valid JSON
if ((!responseStr.trim().startsWith("{") && !responseStr.trim().startsWith("[")) ||
(!responseStr.trim().endsWith("}") && !responseStr.trim().endsWith("]"))) {
throw new IllegalArgumentException("Response is not a valid JSON object");
}
Map<String, Object> response = getObjectMapper().readValue(responseStr,
new TypeReference<Map<String, Object>>() {
});
return response;
} catch (JsonProcessingException e) {
throw new IllegalArgumentException("Failed to parse response as JSON: " + e.getMessage(), e);
}
}
/**
* Get the structured response mapped to a specific class type.
*
* @param <T> The type of class to map the response to
* @param clazz The class to map the response to
* @return An instance of the specified class with the response data
* @throws IllegalArgumentException if the response is not a valid JSON or is empty
* @throws RuntimeException if there is an error mapping the response
*/
public <T> T as(Class<T> clazz) {
String responseStr = this.getResponse();
if (responseStr == null || responseStr.trim().isEmpty()) {
throw new IllegalArgumentException("Response is empty or null");
}
try {
// Check if the response is a valid JSON
if ((!responseStr.trim().startsWith("{") && !responseStr.trim().startsWith("[")) ||
(!responseStr.trim().endsWith("}") && !responseStr.trim().endsWith("]"))) {
throw new IllegalArgumentException("Response is not a valid JSON object");
}
return getObjectMapper().readValue(responseStr, clazz);
} catch (JsonProcessingException e) {
throw new IllegalArgumentException("Failed to parse response as JSON: " + e.getMessage(), e);
}
}
}
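
A sketch of consuming a structured response, assuming the model was asked to reply in JSON; `CountryInfo` is a hypothetical target type:

```java
// Hypothetical POJO matching the expected JSON shape:
public class CountryInfo {
    public String name;
    public String capital;
}

// Assuming result.getResponse() is e.g. {"name":"France","capital":"Paris"}:
Map<String, Object> asMap = result.getStructuredResponse();
CountryInfo info = result.as(CountryInfo.class);
System.out.println(info.capital); // Paris
```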

View File

@@ -0,0 +1,77 @@
package io.github.ollama4j.models.response;
import static io.github.ollama4j.utils.Utils.getObjectMapper;
import java.util.Map;
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import lombok.Data;
import lombok.Getter;
import lombok.NoArgsConstructor;
@Getter
@SuppressWarnings("unused")
@Data
@NoArgsConstructor
@JsonIgnoreProperties(ignoreUnknown = true)
public class OllamaStructuredResult {
private String response;
private int httpStatusCode;
private long responseTime = 0;
private String model;
public OllamaStructuredResult(String response, long responseTime, int httpStatusCode) {
this.response = response;
this.responseTime = responseTime;
this.httpStatusCode = httpStatusCode;
}
@Override
public String toString() {
try {
return getObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(this);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
/**
* Get the structured response if the response is a JSON object.
*
* @return Map - structured response
*/
public Map<String, Object> getStructuredResponse() {
try {
Map<String, Object> response = getObjectMapper().readValue(this.getResponse(),
new TypeReference<Map<String, Object>>() {
});
return response;
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
/**
* Get the structured response mapped to a specific class type.
*
* @param <T> The type of class to map the response to
* @param clazz The class to map the response to
* @return An instance of the specified class with the response data
* @throws RuntimeException if there is an error mapping the response
*/
public <T> T getStructuredResponse(Class<T> clazz) {
try {
return getObjectMapper().readValue(this.getResponse(), clazz);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
}

View File

@@ -0,0 +1,10 @@
package io.github.ollama4j.models.response;
import lombok.Data;
import java.util.List;
@Data
public class OllamaVersion {
private String version;
}

View File

@@ -0,0 +1,18 @@
package io.github.ollama4j.tools;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Map;
@Data
@NoArgsConstructor
@AllArgsConstructor
@JsonIgnoreProperties(ignoreUnknown = true)
public class OllamaToolCallsFunction
{
private String name;
private Map<String,Object> arguments;
}

View File

@@ -18,6 +18,9 @@ public class OllamaToolsResult {
public List<ToolResult> getToolResults() {
List<ToolResult> results = new ArrayList<>();
if (this.toolResults == null) {
return results;
}
for (Map.Entry<ToolFunctionCallSpec, Object> r : this.toolResults.entrySet()) {
results.add(new ToolResult(r.getKey().getName(), r.getKey().getArguments(), r.getValue()));
}

View File

@@ -0,0 +1,54 @@
package io.github.ollama4j.tools;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.Setter;
import java.lang.reflect.Method;
import java.math.BigDecimal;
import java.util.LinkedHashMap;
import java.util.Map;
/**
* Specification of a {@link ToolFunction} that provides the implementation via java reflection calling.
*/
@Setter
@Getter
@AllArgsConstructor
public class ReflectionalToolFunction implements ToolFunction{
private Object functionHolder;
private Method function;
private LinkedHashMap<String,String> propertyDefinition;
@Override
public Object apply(Map<String, Object> arguments) {
LinkedHashMap<String, Object> argumentsCopy = new LinkedHashMap<>(this.propertyDefinition);
for (Map.Entry<String,String> param : this.propertyDefinition.entrySet()){
argumentsCopy.replace(param.getKey(),typeCast(arguments.get(param.getKey()),param.getValue()));
}
try {
return function.invoke(functionHolder, argumentsCopy.values().toArray());
} catch (Exception e) {
throw new RuntimeException("Failed to invoke tool: " + function.getName(), e);
}
}
private Object typeCast(Object inputValue, String className) {
if(className == null || inputValue == null) {
return null;
}
String inputValueString = inputValue.toString();
switch (className) {
case "java.lang.Integer":
return Integer.parseInt(inputValueString);
case "java.lang.Boolean":
return Boolean.valueOf(inputValueString);
case "java.math.BigDecimal":
return new BigDecimal(inputValueString);
default:
return inputValueString;
}
}
}
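
A sketch of wiring a plain Java method up as a tool via reflection; `WeatherService` and its method are hypothetical, and exception handling around `getMethod` is omitted:

```java
public class WeatherService {
    public String getForecast(String city, Integer days) {
        return "Sunny in " + city + " for the next " + days + " day(s)";
    }
}

Method method = WeatherService.class.getMethod("getForecast", String.class, Integer.class);

// Maps parameter names to fully qualified types, in declaration order,
// so incoming arguments can be type-cast before the reflective call:
LinkedHashMap<String, String> properties = new LinkedHashMap<>();
properties.put("city", "java.lang.String");
properties.put("days", "java.lang.Integer");

ToolFunction tool = new ReflectionalToolFunction(new WeatherService(), method, properties);
Object output = tool.apply(Map.of("city", "Berlin", "days", 3));
```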

View File

@@ -1,16 +1,22 @@
package io.github.ollama4j.tools;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
public class ToolRegistry {
private final Map<String, ToolFunction> functionMap = new HashMap<>();
private final Map<String, Tools.ToolSpecification> tools = new HashMap<>();
public ToolFunction getFunction(String name) {
return functionMap.get(name);
public ToolFunction getToolFunction(String name) {
final Tools.ToolSpecification toolSpecification = tools.get(name);
return toolSpecification !=null ? toolSpecification.getToolFunction() : null ;
}
public void addFunction(String name, ToolFunction function) {
functionMap.put(name, function);
public void addTool (String name, Tools.ToolSpecification specification) {
tools.put(name, specification);
}
public Collection<Tools.ToolSpecification> getRegisteredSpecs(){
return tools.values();
}
}

View File

@@ -6,8 +6,10 @@ import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.core.JsonProcessingException;
import io.github.ollama4j.utils.Utils;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.ArrayList;
import java.util.HashMap;
@@ -20,17 +22,23 @@ public class Tools {
public static class ToolSpecification {
private String functionName;
private String functionDescription;
private Map<String, PromptFuncDefinition.Property> properties;
private ToolFunction toolDefinition;
private PromptFuncDefinition toolPrompt;
private ToolFunction toolFunction;
}
@Data
@JsonIgnoreProperties(ignoreUnknown = true)
@Builder
@NoArgsConstructor
@AllArgsConstructor
public static class PromptFuncDefinition {
private String type;
private PromptFuncSpec function;
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public static class PromptFuncSpec {
private String name;
private String description;
@@ -38,6 +46,9 @@ public class Tools {
}
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public static class Parameters {
private String type;
private Map<String, Property> properties;
@@ -46,6 +57,8 @@ public class Tools {
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public static class Property {
private String type;
private String description;
@@ -94,10 +107,10 @@ public class Tools {
PromptFuncDefinition.Parameters parameters = new PromptFuncDefinition.Parameters();
parameters.setType("object");
parameters.setProperties(spec.getProperties());
parameters.setProperties(spec.getToolPrompt().getFunction().parameters.getProperties());
List<String> requiredValues = new ArrayList<>();
for (Map.Entry<String, PromptFuncDefinition.Property> p : spec.getProperties().entrySet()) {
for (Map.Entry<String, PromptFuncDefinition.Property> p : spec.getToolPrompt().getFunction().getParameters().getProperties().entrySet()) {
if (p.getValue().isRequired()) {
requiredValues.add(p.getKey());
}

View File

@@ -0,0 +1,23 @@
package io.github.ollama4j.tools.annotations;
import io.github.ollama4j.OllamaAPI;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* Annotates a class that calls {@link io.github.ollama4j.OllamaAPI} such that the Method
* {@link OllamaAPI#registerAnnotatedTools()} can be used to auto-register all provided classes (resp. all
* contained Methods of the provider classes annotated with {@link ToolSpec}).
*/
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
public @interface OllamaToolService {
/**
* @return Classes with no-arg constructor that will be used for tool-registration.
*/
Class<?>[] providers();
}

View File

@@ -0,0 +1,32 @@
package io.github.ollama4j.tools.annotations;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* Annotates a Method Parameter in a {@link ToolSpec} annotated Method. A parameter annotated with this annotation will
* be part of the tool description that is sent to the llm for tool-calling.
*/
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
public @interface ToolProperty {
/**
* @return name of the parameter that is used for the tool description. Has to be set as depending on the caller,
* method name backtracking is not possible with reflection.
*/
String name();
/**
* @return a detailed description of the parameter. This is used by the llm called to specify, which property has
* to be set by the llm and how this should be filled.
*/
String desc();
/**
* @return tells the llm that it has to set a value for this property.
*/
boolean required() default true;
}

View File

@@ -0,0 +1,28 @@
package io.github.ollama4j.tools.annotations;
import io.github.ollama4j.OllamaAPI;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* Annotates methods of classes that should be registered as tools by {@link OllamaAPI#registerAnnotatedTools()}
* automatically.
*/
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface ToolSpec {
/**
* @return the tool name under which the method should be registered. Defaults to the method's name.
*/
String name() default "";
/**
* @return a detailed description of the method that the LLM can interpret to decide whether or not it should
* call the tool.
*/
String desc();
}
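
Taken together with {@link ToolProperty}, a tool method would look roughly like this (a sketch; the class, method body, and names are illustrative and not part of this change-set):

public class WeatherTools {
    @ToolSpec(name = "current-weather", desc = "Gets the current weather for a given city")
    public String currentWeather(
            @ToolProperty(name = "city", desc = "Name of the city, e.g. Berlin") String city,
            @ToolProperty(name = "unit", desc = "Temperature unit, e.g. celsius", required = false) String unit) {
        // A real implementation would query a weather service here.
        return "It is sunny in " + city + (unit != null ? " (" + unit + ")" : "");
    }
}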

View File

@@ -10,14 +10,12 @@ package io.github.ollama4j.types;
public class OllamaModelType {
public static final String GEMMA = "gemma";
public static final String GEMMA2 = "gemma2";
public static final String LLAMA2 = "llama2";
public static final String LLAMA3 = "llama3";
public static final String LLAMA3_1 = "llama3.1";
public static final String MISTRAL = "mistral";
public static final String MIXTRAL = "mixtral";
public static final String DEEPSEEK_R1 = "deepseek-r1";
public static final String LLAVA = "llava";
public static final String LLAVA_PHI3 = "llava-phi3";
public static final String NEURAL_CHAT = "neural-chat";
@@ -35,7 +33,6 @@ public class OllamaModelType {
public static final String ZEPHYR = "zephyr";
public static final String OPENHERMES = "openhermes";
public static final String QWEN = "qwen";
public static final String QWEN2 = "qwen2";
public static final String WIZARDCODER = "wizardcoder";
public static final String LLAMA2_CHINESE = "llama2-chinese";

View File

@@ -1,5 +1,6 @@
package io.github.ollama4j.utils;
import java.io.IOException;
import java.util.HashMap;
/** Builder class for creating options for Ollama model. */
@@ -207,6 +208,34 @@ public class OptionsBuilder {
return this;
}
/**
* Alternative to top_p, and aims to ensure a balance of quality and variety. The parameter p
* represents the minimum probability for a token to be considered, relative to the probability
* of the most likely token. For example, with p=0.05 and the most likely token having a
* probability of 0.9, logits with a value less than 0.045 are filtered out. (Default: 0.0)
*/
public OptionsBuilder setMinP(float value) {
options.getOptionsMap().put("min_p", value);
return this;
}
/**
* Allows passing an option not formally supported by the library
* @param name The option name for the parameter.
* @param value The value for the "{name}" parameter.
* @return The updated OptionsBuilder.
* @throws IllegalArgumentException if parameter has an unsupported type
*/
public OptionsBuilder setCustomOption(String name, Object value) throws IllegalArgumentException {
if (!(value instanceof Integer || value instanceof Float || value instanceof String)) {
throw new IllegalArgumentException("Invalid type for parameter. Allowed types are: Integer, Float, or String.");
}
options.getOptionsMap().put(name, value);
return this;
}
/**
* Builds the options map.
*
@@ -215,4 +244,6 @@ public class OptionsBuilder {
public Options build() {
return options;
}
}
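
For example, the new builder methods compose with the existing ones (a sketch; "num_gpu" is only an illustrative pass-through option name, not one the library validates):

Options options = new OptionsBuilder()
        .setTemperature(0.7f)
        .setMinP(0.05f)                 // keep tokens with at least 5% of the top token's probability
        .setCustomOption("num_gpu", 1)  // pass-through option; only Integer, Float, and String values are accepted
        .build();

Passing any other value type to setCustomOption (e.g. a plain Object) throws an IllegalArgumentException, as exercised by the serialization tests further down.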

View File

@@ -0,0 +1,681 @@
package io.github.ollama4j.integrationtests;
import com.fasterxml.jackson.annotation.JsonProperty;
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.exceptions.OllamaBaseException;
import io.github.ollama4j.exceptions.ToolInvocationException;
import io.github.ollama4j.models.chat.*;
import io.github.ollama4j.models.embeddings.OllamaEmbedResponseModel;
import io.github.ollama4j.models.response.LibraryModel;
import io.github.ollama4j.models.response.Model;
import io.github.ollama4j.models.response.ModelDetail;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.samples.AnnotatedTool;
import io.github.ollama4j.tools.OllamaToolCallsFunction;
import io.github.ollama4j.tools.ToolFunction;
import io.github.ollama4j.tools.Tools;
import io.github.ollama4j.tools.annotations.OllamaToolService;
import io.github.ollama4j.utils.OptionsBuilder;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.MethodOrderer.OrderAnnotation;
import org.junit.jupiter.api.Order;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestMethodOrder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.testcontainers.ollama.OllamaContainer;
import java.io.File;
import java.io.IOException;
import java.net.ConnectException;
import java.net.URISyntaxException;
import java.util.*;
import static org.junit.jupiter.api.Assertions.*;
@OllamaToolService(providers = {AnnotatedTool.class})
@TestMethodOrder(OrderAnnotation.class)
@SuppressWarnings({"HttpUrlsUsage", "SpellCheckingInspection"})
public class OllamaAPIIntegrationTest {
private static final Logger LOG = LoggerFactory.getLogger(OllamaAPIIntegrationTest.class);
private static OllamaContainer ollama;
private static OllamaAPI api;
private static final String EMBEDDING_MODEL_MINILM = "all-minilm";
private static final String CHAT_MODEL_QWEN_SMALL = "qwen2.5:0.5b";
private static final String CHAT_MODEL_INSTRUCT = "qwen2.5:0.5b-instruct";
private static final String CHAT_MODEL_SYSTEM_PROMPT = "llama3.2:1b";
private static final String CHAT_MODEL_LLAMA3 = "llama3";
private static final String IMAGE_MODEL_LLAVA = "llava";
@BeforeAll
public static void setUp() {
try {
boolean useExternalOllamaHost = Boolean.parseBoolean(System.getenv("USE_EXTERNAL_OLLAMA_HOST"));
String ollamaHost = System.getenv("OLLAMA_HOST");
if (useExternalOllamaHost) {
LOG.info("Using external Ollama host...");
api = new OllamaAPI(ollamaHost);
} else {
throw new RuntimeException(
"USE_EXTERNAL_OLLAMA_HOST is not set so, we will be using Testcontainers Ollama host for the tests now. If you would like to use an external host, please set the env var to USE_EXTERNAL_OLLAMA_HOST=true and set the env var OLLAMA_HOST=http://localhost:11435 or a different host/port.");
}
} catch (Exception e) {
String ollamaVersion = "0.6.1";
int internalPort = 11434;
int mappedPort = 11435;
ollama = new OllamaContainer("ollama/ollama:" + ollamaVersion);
ollama.addExposedPort(internalPort);
List<String> portBindings = new ArrayList<>();
portBindings.add(mappedPort + ":" + internalPort);
ollama.setPortBindings(portBindings);
ollama.start();
LOG.info("Using Testcontainer Ollama host...");
api = new OllamaAPI("http://" + ollama.getHost() + ":" + ollama.getMappedPort(internalPort));
}
api.setRequestTimeoutSeconds(120);
api.setVerbose(true);
api.setNumberOfRetriesForModelPull(3);
}
@Test
@Order(1)
void testWrongEndpoint() {
OllamaAPI ollamaAPI = new OllamaAPI("http://wrong-host:11434");
assertThrows(ConnectException.class, ollamaAPI::listModels);
}
@Test
@Order(1)
public void testVersionAPI() throws URISyntaxException, IOException, OllamaBaseException, InterruptedException {
// String expectedVersion = ollama.getDockerImageName().split(":")[1];
String actualVersion = api.getVersion();
assertNotNull(actualVersion);
// assertEquals(expectedVersion, actualVersion, "Version should match the Docker
// image version");
}
@Test
@Order(2)
public void testListModelsAPI()
throws URISyntaxException, IOException, OllamaBaseException, InterruptedException {
api.pullModel(EMBEDDING_MODEL_MINILM);
// Fetch the list of models
List<Model> models = api.listModels();
// Assert that the models list is not null
assertNotNull(models, "Models should not be null");
// Assert that the models list is not empty
assertFalse(models.isEmpty(), "Models list should not be empty");
}
@Test
@Order(2)
void testListModelsFromLibrary()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException {
List<LibraryModel> models = api.listModelsFromLibrary();
assertNotNull(models);
assertFalse(models.isEmpty());
}
@Test
@Order(3)
public void testPullModelAPI()
throws URISyntaxException, IOException, OllamaBaseException, InterruptedException {
api.pullModel(EMBEDDING_MODEL_MINILM);
List<Model> models = api.listModels();
assertNotNull(models, "Models should not be null");
assertFalse(models.isEmpty(), "Models list should contain elements");
}
@Test
@Order(4)
void testListModelDetails() throws IOException, OllamaBaseException, URISyntaxException, InterruptedException {
api.pullModel(EMBEDDING_MODEL_MINILM);
ModelDetail modelDetails = api.getModelDetails(EMBEDDING_MODEL_MINILM);
assertNotNull(modelDetails);
assertTrue(modelDetails.getModelFile().contains(EMBEDDING_MODEL_MINILM));
}
@Test
@Order(5)
public void testEmbeddings() throws Exception {
api.pullModel(EMBEDDING_MODEL_MINILM);
OllamaEmbedResponseModel embeddings = api.embed(EMBEDDING_MODEL_MINILM,
Arrays.asList("Why is the sky blue?", "Why is the grass green?"));
assertNotNull(embeddings, "Embeddings should not be null");
assertFalse(embeddings.getEmbeddings().isEmpty(), "Embeddings should not be empty");
}
@Test
@Order(6)
void testAskModelWithStructuredOutput()
throws OllamaBaseException, IOException, InterruptedException, URISyntaxException {
api.pullModel(CHAT_MODEL_LLAMA3);
int timeHour = 6;
boolean isNightTime = false;
String prompt = "The Sun is shining, and its " + timeHour + ". Its daytime.";
Map<String, Object> format = new HashMap<>();
format.put("type", "object");
format.put("properties", new HashMap<String, Object>() {
{
put("timeHour", new HashMap<String, Object>() {
{
put("type", "integer");
}
});
put("isNightTime", new HashMap<String, Object>() {
{
put("type", "boolean");
}
});
}
});
format.put("required", Arrays.asList("timeHour", "isNightTime"));
OllamaResult result = api.generate(CHAT_MODEL_LLAMA3, prompt, format);
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
assertEquals(timeHour,
result.getStructuredResponse().get("timeHour"));
assertEquals(isNightTime,
result.getStructuredResponse().get("isNightTime"));
TimeOfDay timeOfDay = result.as(TimeOfDay.class);
assertEquals(timeHour, timeOfDay.getTimeHour());
assertEquals(isNightTime, timeOfDay.isNightTime());
}
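// For reference, the `format` map built above serializes to a JSON-Schema-style
// payload roughly like the following (a sketch; field order may vary):
// {
//   "type": "object",
//   "properties": {
//     "timeHour":    { "type": "integer" },
//     "isNightTime": { "type": "boolean" }
//   },
//   "required": ["timeHour", "isNightTime"]
// }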
@Test
@Order(6)
void testAskModelWithDefaultOptions()
throws OllamaBaseException, IOException, InterruptedException, URISyntaxException {
api.pullModel(CHAT_MODEL_QWEN_SMALL);
OllamaResult result = api.generate(CHAT_MODEL_QWEN_SMALL,
"What is the capital of France? And what's France's connection with Mona Lisa?", false,
new OptionsBuilder().build());
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
}
@Test
@Order(7)
void testAskModelWithDefaultOptionsStreamed()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException {
api.pullModel(CHAT_MODEL_QWEN_SMALL);
StringBuffer sb = new StringBuffer();
OllamaResult result = api.generate(CHAT_MODEL_QWEN_SMALL,
"What is the capital of France? And what's France's connection with Mona Lisa?", false,
new OptionsBuilder().build(), (s) -> {
LOG.info(s);
String substring = s.substring(sb.toString().length(), s.length());
LOG.info(substring);
sb.append(substring);
});
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
assertEquals(sb.toString().trim(), result.getResponse().trim());
}
@Test
@Order(8)
void testAskModelWithOptions()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException, ToolInvocationException {
api.pullModel(CHAT_MODEL_INSTRUCT);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(CHAT_MODEL_INSTRUCT);
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.SYSTEM,
"You are a helpful assistant who can generate random person's first and last names in the format [First name, Last name].")
.build();
requestModel = builder.withMessages(requestModel.getMessages())
.withMessage(OllamaChatMessageRole.USER, "Give me a cool name")
.withOptions(new OptionsBuilder().setTemperature(0.5f).build()).build();
OllamaChatResult chatResult = api.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
assertFalse(chatResult.getResponseModel().getMessage().getContent().isEmpty());
}
@Test
@Order(9)
void testChatWithSystemPrompt()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException, ToolInvocationException {
api.pullModel(CHAT_MODEL_SYSTEM_PROMPT);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(CHAT_MODEL_SYSTEM_PROMPT);
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.SYSTEM,
"You are a silent bot that only says 'Shush'. Do not say anything else under any circumstances!")
.withMessage(OllamaChatMessageRole.USER, "What's something that's brown and sticky?")
.withOptions(new OptionsBuilder().setTemperature(0.8f).build()).build();
OllamaChatResult chatResult = api.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
assertNotNull(chatResult.getResponseModel().getMessage());
assertFalse(chatResult.getResponseModel().getMessage().getContent().isBlank());
assertTrue(chatResult.getResponseModel().getMessage().getContent().contains("Shush"));
assertEquals(3, chatResult.getChatHistory().size());
}
@Test
@Order(10)
public void testChat() throws Exception {
api.pullModel(CHAT_MODEL_LLAMA3);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(CHAT_MODEL_LLAMA3);
// Create the initial user question
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER, "What is 1+1? Answer only in numbers.")
.build();
// Start conversation with model
OllamaChatResult chatResult = api.chat(requestModel);
assertTrue(chatResult.getChatHistory().stream().anyMatch(chat -> chat.getContent().contains("2")),
"Expected chat history to contain '2'");
// Create the next user question: the squared value
requestModel = builder.withMessages(chatResult.getChatHistory())
.withMessage(OllamaChatMessageRole.USER, "And what is its squared value?").build();
// Continue conversation with model
chatResult = api.chat(requestModel);
assertTrue(chatResult.getChatHistory().stream().anyMatch(chat -> chat.getContent().contains("4")),
"Expected chat history to contain '4'");
// Create the next user question: the third question
requestModel = builder.withMessages(chatResult.getChatHistory())
.withMessage(OllamaChatMessageRole.USER,
"What is the largest value between 2, 4 and 6?")
.build();
// Continue conversation with the model for the third question
chatResult = api.chat(requestModel);
// verify the result
assertNotNull(chatResult, "Chat result should not be null");
assertTrue(chatResult.getChatHistory().size() > 2,
"Chat history should contain more than two messages");
assertTrue(chatResult.getChatHistory().get(chatResult.getChatHistory().size() - 1).getContent()
.contains("6"),
"Response should contain '6'");
}
@Test
@Order(10)
void testChatWithImageFromURL()
throws OllamaBaseException, IOException, InterruptedException, URISyntaxException, ToolInvocationException {
api.pullModel(IMAGE_MODEL_LLAVA);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(IMAGE_MODEL_LLAVA);
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER, "What's in the picture?",
Collections.emptyList(),
"https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg")
.build();
api.registerAnnotatedTools(new OllamaAPIIntegrationTest());
OllamaChatResult chatResult = api.chat(requestModel);
assertNotNull(chatResult);
}
@Test
@Order(10)
void testChatWithImageFromFileWithHistoryRecognition()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException, ToolInvocationException {
api.pullModel(IMAGE_MODEL_LLAVA);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(IMAGE_MODEL_LLAVA);
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER,
"What's in the picture?",
Collections.emptyList(), List.of(getImageFileFromClasspath("emoji-smile.jpeg")))
.build();
OllamaChatResult chatResult = api.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
builder.reset();
requestModel = builder.withMessages(chatResult.getChatHistory())
.withMessage(OllamaChatMessageRole.USER, "What's the color?").build();
chatResult = api.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
}
@Test
@Order(11)
void testChatWithExplicitToolDefinition()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException, ToolInvocationException {
api.pullModel(CHAT_MODEL_SYSTEM_PROMPT);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(CHAT_MODEL_SYSTEM_PROMPT);
final Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
.functionName("get-employee-details")
.functionDescription("Get employee details from the database")
.toolPrompt(Tools.PromptFuncDefinition.builder().type("function")
.function(Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-employee-details")
.description("Get employee details from the database")
.parameters(Tools.PromptFuncDefinition.Parameters
.builder().type("object")
.properties(new Tools.PropsBuilder()
.withProperty("employee-name",
Tools.PromptFuncDefinition.Property
.builder()
.type("string")
.description("The name of the employee, e.g. John Doe")
.required(true)
.build())
.withProperty("employee-address",
Tools.PromptFuncDefinition.Property
.builder()
.type("string")
.description(
"The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India")
.required(true)
.build())
.withProperty("employee-phone",
Tools.PromptFuncDefinition.Property
.builder()
.type("string")
.description(
"The phone number of the employee. Always return a random value. e.g. 9911002233")
.required(true)
.build())
.build())
.required(List.of("employee-name"))
.build())
.build())
.build())
.toolFunction(arguments -> {
// perform DB operations here
return String.format(
"Employee Details {ID: %s, Name: %s, Address: %s, Phone: %s}",
UUID.randomUUID(), arguments.get("employee-name"),
arguments.get("employee-address"),
arguments.get("employee-phone"));
}).build();
api.registerTool(databaseQueryToolSpecification);
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER,
"Give me the ID of the employee named 'Rahul Kumar'?")
.build();
OllamaChatResult chatResult = api.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
assertNotNull(chatResult.getResponseModel().getMessage());
assertEquals(OllamaChatMessageRole.ASSISTANT.getRoleName(),
chatResult.getResponseModel().getMessage().getRole().getRoleName());
List<OllamaChatToolCalls> toolCalls = chatResult.getChatHistory().get(1).getToolCalls();
assertEquals(1, toolCalls.size());
OllamaToolCallsFunction function = toolCalls.get(0).getFunction();
assertEquals("get-employee-details", function.getName());
assert !function.getArguments().isEmpty();
Object employeeName = function.getArguments().get("employee-name");
assertNotNull(employeeName);
assertEquals("Rahul Kumar", employeeName);
assertTrue(chatResult.getChatHistory().size() > 2);
List<OllamaChatToolCalls> finalToolCalls = chatResult.getResponseModel().getMessage().getToolCalls();
assertNull(finalToolCalls);
}
@Test
@Order(12)
void testChatWithAnnotatedToolsAndSingleParam()
throws OllamaBaseException, IOException, InterruptedException, URISyntaxException, ToolInvocationException {
api.pullModel(CHAT_MODEL_SYSTEM_PROMPT);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(CHAT_MODEL_SYSTEM_PROMPT);
api.registerAnnotatedTools();
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER,
"Compute the most important constant in the world using 5 digits").build();
OllamaChatResult chatResult = api.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
assertNotNull(chatResult.getResponseModel().getMessage());
assertEquals(OllamaChatMessageRole.ASSISTANT.getRoleName(),
chatResult.getResponseModel().getMessage().getRole().getRoleName());
List<OllamaChatToolCalls> toolCalls = chatResult.getChatHistory().get(1).getToolCalls();
assertEquals(1, toolCalls.size());
OllamaToolCallsFunction function = toolCalls.get(0).getFunction();
assertEquals("computeImportantConstant", function.getName());
assertEquals(1, function.getArguments().size());
Object noOfDigits = function.getArguments().get("noOfDigits");
assertNotNull(noOfDigits);
assertEquals("5", noOfDigits.toString());
assertTrue(chatResult.getChatHistory().size() > 2);
List<OllamaChatToolCalls> finalToolCalls = chatResult.getResponseModel().getMessage().getToolCalls();
assertNull(finalToolCalls);
}
@Test
@Order(13)
void testChatWithAnnotatedToolsAndMultipleParams()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException, ToolInvocationException {
api.pullModel(CHAT_MODEL_SYSTEM_PROMPT);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(CHAT_MODEL_SYSTEM_PROMPT);
api.registerAnnotatedTools(new AnnotatedTool());
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER,
"Greet Pedro with a lot of hearts and respond to me, "
+ "and state how many emojis have been in your greeting")
.build();
OllamaChatResult chatResult = api.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
assertNotNull(chatResult.getResponseModel().getMessage());
assertEquals(OllamaChatMessageRole.ASSISTANT.getRoleName(),
chatResult.getResponseModel().getMessage().getRole().getRoleName());
List<OllamaChatToolCalls> toolCalls = chatResult.getChatHistory().get(1).getToolCalls();
assertEquals(1, toolCalls.size());
OllamaToolCallsFunction function = toolCalls.get(0).getFunction();
assertEquals("sayHello", function.getName());
assertEquals(2, function.getArguments().size());
Object name = function.getArguments().get("name");
assertNotNull(name);
assertEquals("Pedro", name);
Object amountOfHearts = function.getArguments().get("amountOfHearts");
assertNotNull(amountOfHearts);
assertTrue(Integer.parseInt(amountOfHearts.toString()) > 1);
assertTrue(chatResult.getChatHistory().size() > 2);
List<OllamaChatToolCalls> finalToolCalls = chatResult.getResponseModel().getMessage().getToolCalls();
assertNull(finalToolCalls);
}
@Test
@Order(14)
void testChatWithToolsAndStream()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException, ToolInvocationException {
api.pullModel(CHAT_MODEL_SYSTEM_PROMPT);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(CHAT_MODEL_SYSTEM_PROMPT);
final Tools.ToolSpecification databaseQueryToolSpecification = Tools.ToolSpecification.builder()
.functionName("get-employee-details")
.functionDescription("Get employee details from the database")
.toolPrompt(Tools.PromptFuncDefinition.builder().type("function")
.function(Tools.PromptFuncDefinition.PromptFuncSpec.builder()
.name("get-employee-details")
.description("Get employee details from the database")
.parameters(Tools.PromptFuncDefinition.Parameters
.builder().type("object")
.properties(new Tools.PropsBuilder()
.withProperty("employee-name",
Tools.PromptFuncDefinition.Property
.builder()
.type("string")
.description("The name of the employee, e.g. John Doe")
.required(true)
.build())
.withProperty("employee-address",
Tools.PromptFuncDefinition.Property
.builder()
.type("string")
.description(
"The address of the employee, Always return a random value. e.g. Roy St, Bengaluru, India")
.required(true)
.build())
.withProperty("employee-phone",
Tools.PromptFuncDefinition.Property
.builder()
.type("string")
.description(
"The phone number of the employee. Always return a random value. e.g. 9911002233")
.required(true)
.build())
.build())
.required(List.of("employee-name"))
.build())
.build())
.build())
.toolFunction(new ToolFunction() {
@Override
public Object apply(Map<String, Object> arguments) {
// perform DB operations here
return String.format(
"Employee Details {ID: %s, Name: %s, Address: %s, Phone: %s}",
UUID.randomUUID(), arguments.get("employee-name"),
arguments.get("employee-address"),
arguments.get("employee-phone"));
}
}).build();
api.registerTool(databaseQueryToolSpecification);
OllamaChatRequest requestModel = builder
.withMessage(OllamaChatMessageRole.USER,
"Give me the ID of the employee named 'Rahul Kumar'?")
.build();
StringBuffer sb = new StringBuffer();
OllamaChatResult chatResult = api.chat(requestModel, (s) -> {
LOG.info(s);
String substring = s.substring(sb.toString().length());
LOG.info(substring);
sb.append(substring);
});
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
assertNotNull(chatResult.getResponseModel().getMessage());
assertNotNull(chatResult.getResponseModel().getMessage().getContent());
assertEquals(sb.toString().trim(), chatResult.getResponseModel().getMessage().getContent().trim());
}
@Test
@Order(15)
void testChatWithStream() throws OllamaBaseException, IOException, URISyntaxException, InterruptedException, ToolInvocationException {
api.pullModel(CHAT_MODEL_SYSTEM_PROMPT);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(CHAT_MODEL_SYSTEM_PROMPT);
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER,
"What is the capital of France? And what's France's connection with Mona Lisa?")
.build();
StringBuffer sb = new StringBuffer();
OllamaChatResult chatResult = api.chat(requestModel, (s) -> {
LOG.info(s);
String substring = s.substring(sb.toString().length(), s.length());
LOG.info(substring);
sb.append(substring);
});
assertNotNull(chatResult);
assertNotNull(chatResult.getResponseModel());
assertNotNull(chatResult.getResponseModel().getMessage());
assertNotNull(chatResult.getResponseModel().getMessage().getContent());
assertEquals(sb.toString().trim(), chatResult.getResponseModel().getMessage().getContent().trim());
}
@Test
@Order(17)
void testAskModelWithOptionsAndImageURLs()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException {
api.pullModel(IMAGE_MODEL_LLAVA);
OllamaResult result = api.generateWithImageURLs(IMAGE_MODEL_LLAVA, "What is in this image?",
List.of("https://upload.wikimedia.org/wikipedia/commons/thumb/a/aa/Noto_Emoji_v2.034_1f642.svg/360px-Noto_Emoji_v2.034_1f642.svg.png"),
new OptionsBuilder().build());
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
}
@Test
@Order(18)
void testAskModelWithOptionsAndImageFiles()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException {
api.pullModel(IMAGE_MODEL_LLAVA);
File imageFile = getImageFileFromClasspath("emoji-smile.jpeg");
try {
OllamaResult result = api.generateWithImageFiles(IMAGE_MODEL_LLAVA, "What is in this image?",
List.of(imageFile),
new OptionsBuilder().build());
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(20)
void testAskModelWithOptionsAndImageFilesStreamed()
throws OllamaBaseException, IOException, URISyntaxException, InterruptedException {
api.pullModel(IMAGE_MODEL_LLAVA);
File imageFile = getImageFileFromClasspath("emoji-smile.jpeg");
StringBuffer sb = new StringBuffer();
OllamaResult result = api.generateWithImageFiles(IMAGE_MODEL_LLAVA, "What is in this image?",
List.of(imageFile),
new OptionsBuilder().build(), (s) -> {
LOG.info(s);
String substring = s.substring(sb.toString().length(), s.length());
LOG.info(substring);
sb.append(substring);
});
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
assertEquals(sb.toString().trim(), result.getResponse().trim());
}
private File getImageFileFromClasspath(String fileName) {
ClassLoader classLoader = getClass().getClassLoader();
return new File(Objects.requireNonNull(classLoader.getResource(fileName)).getFile());
}
}
@Data
@AllArgsConstructor
@NoArgsConstructor
class TimeOfDay {
@JsonProperty("timeHour")
private int timeHour;
@JsonProperty("isNightTime")
private boolean nightTime;
}

View File

@@ -1,395 +0,0 @@
package io.github.ollama4j.integrationtests;
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.exceptions.OllamaBaseException;
import io.github.ollama4j.models.response.ModelDetail;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;
import io.github.ollama4j.models.embeddings.OllamaEmbeddingsRequestBuilder;
import io.github.ollama4j.models.embeddings.OllamaEmbeddingsRequestModel;
import io.github.ollama4j.utils.OptionsBuilder;
import lombok.Data;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Order;
import org.junit.jupiter.api.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.net.ConnectException;
import java.net.URISyntaxException;
import java.net.http.HttpConnectTimeoutException;
import java.util.List;
import java.util.Objects;
import java.util.Properties;
import static org.junit.jupiter.api.Assertions.*;
class TestRealAPIs {
private static final Logger LOG = LoggerFactory.getLogger(TestRealAPIs.class);
OllamaAPI ollamaAPI;
Config config;
private File getImageFileFromClasspath(String fileName) {
ClassLoader classLoader = getClass().getClassLoader();
return new File(Objects.requireNonNull(classLoader.getResource(fileName)).getFile());
}
@BeforeEach
void setUp() {
config = new Config();
ollamaAPI = new OllamaAPI(config.getOllamaURL());
ollamaAPI.setRequestTimeoutSeconds(config.getRequestTimeoutSeconds());
}
@Test
@Order(1)
void testWrongEndpoint() {
OllamaAPI ollamaAPI = new OllamaAPI("http://wrong-host:11434");
assertThrows(ConnectException.class, ollamaAPI::listModels);
}
@Test
@Order(1)
void testEndpointReachability() {
try {
assertNotNull(ollamaAPI.listModels());
} catch (HttpConnectTimeoutException e) {
fail(e.getMessage());
} catch (Exception e) {
fail(e);
}
}
@Test
@Order(2)
void testListModels() {
testEndpointReachability();
try {
assertNotNull(ollamaAPI.listModels());
ollamaAPI.listModels().forEach(System.out::println);
} catch (IOException | OllamaBaseException | InterruptedException | URISyntaxException e) {
fail(e);
}
}
@Test
@Order(2)
void testPullModel() {
testEndpointReachability();
try {
ollamaAPI.pullModel(config.getModel());
boolean found =
ollamaAPI.listModels().stream()
.anyMatch(model -> model.getModel().equalsIgnoreCase(config.getModel()));
assertTrue(found);
} catch (IOException | OllamaBaseException | InterruptedException | URISyntaxException e) {
fail(e);
}
}
@Test
@Order(3)
void testListDetails() {
testEndpointReachability();
try {
ModelDetail modelDetails = ollamaAPI.getModelDetails(config.getModel());
assertNotNull(modelDetails);
System.out.println(modelDetails);
} catch (IOException | OllamaBaseException | InterruptedException | URISyntaxException e) {
fail(e);
}
}
@Test
@Order(3)
void testAskModelWithDefaultOptions() {
testEndpointReachability();
try {
OllamaResult result =
ollamaAPI.generate(
config.getModel(),
"What is the capital of France? And what's France's connection with Mona Lisa?",
false,
new OptionsBuilder().build());
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testAskModelWithDefaultOptionsStreamed() {
testEndpointReachability();
try {
StringBuffer sb = new StringBuffer("");
OllamaResult result = ollamaAPI.generate(config.getModel(),
"What is the capital of France? And what's France's connection with Mona Lisa?",
false,
new OptionsBuilder().build(), (s) -> {
LOG.info(s);
String substring = s.substring(sb.toString().length(), s.length());
LOG.info(substring);
sb.append(substring);
});
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
assertEquals(sb.toString().trim(), result.getResponse().trim());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testAskModelWithOptions() {
testEndpointReachability();
try {
OllamaResult result =
ollamaAPI.generate(
config.getModel(),
"What is the capital of France? And what's France's connection with Mona Lisa?",
true,
new OptionsBuilder().setTemperature(0.9f).build());
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testChat() {
testEndpointReachability();
try {
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(config.getModel());
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER, "What is the capital of France?")
.withMessage(OllamaChatMessageRole.ASSISTANT, "Should be Paris!")
.withMessage(OllamaChatMessageRole.USER, "And what is the second larges city?")
.build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
assertNotNull(chatResult);
assertFalse(chatResult.getResponse().isBlank());
assertEquals(4, chatResult.getChatHistory().size());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testChatWithSystemPrompt() {
testEndpointReachability();
try {
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(config.getModel());
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.SYSTEM,
"You are a silent bot that only says 'NI'. Do not say anything else under any circumstances!")
.withMessage(OllamaChatMessageRole.USER,
"What is the capital of France? And what's France's connection with Mona Lisa?")
.build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
assertNotNull(chatResult);
assertFalse(chatResult.getResponse().isBlank());
assertTrue(chatResult.getResponse().startsWith("NI"));
assertEquals(3, chatResult.getChatHistory().size());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testChatWithStream() {
testEndpointReachability();
try {
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(config.getModel());
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER,
"What is the capital of France? And what's France's connection with Mona Lisa?")
.build();
StringBuffer sb = new StringBuffer("");
OllamaChatResult chatResult = ollamaAPI.chat(requestModel, (s) -> {
LOG.info(s);
String substring = s.substring(sb.toString().length(), s.length());
LOG.info(substring);
sb.append(substring);
});
assertNotNull(chatResult);
assertEquals(sb.toString().trim(), chatResult.getResponse().trim());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testChatWithImageFromFileWithHistoryRecognition() {
testEndpointReachability();
try {
OllamaChatRequestBuilder builder =
OllamaChatRequestBuilder.getInstance(config.getImageModel());
OllamaChatRequest requestModel =
builder.withMessage(OllamaChatMessageRole.USER, "What's in the picture?",
List.of(getImageFileFromClasspath("dog-on-a-boat.jpg"))).build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponse());
builder.reset();
requestModel =
builder.withMessages(chatResult.getChatHistory())
.withMessage(OllamaChatMessageRole.USER, "What's the dogs breed?").build();
chatResult = ollamaAPI.chat(requestModel);
assertNotNull(chatResult);
assertNotNull(chatResult.getResponse());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testChatWithImageFromURL() {
testEndpointReachability();
try {
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(config.getImageModel());
OllamaChatRequest requestModel = builder.withMessage(OllamaChatMessageRole.USER, "What's in the picture?",
"https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg")
.build();
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
assertNotNull(chatResult);
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testAskModelWithOptionsAndImageFiles() {
testEndpointReachability();
File imageFile = getImageFileFromClasspath("dog-on-a-boat.jpg");
try {
OllamaResult result =
ollamaAPI.generateWithImageFiles(
config.getImageModel(),
"What is in this image?",
List.of(imageFile),
new OptionsBuilder().build());
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testAskModelWithOptionsAndImageFilesStreamed() {
testEndpointReachability();
File imageFile = getImageFileFromClasspath("dog-on-a-boat.jpg");
try {
StringBuffer sb = new StringBuffer("");
OllamaResult result = ollamaAPI.generateWithImageFiles(config.getImageModel(),
"What is in this image?", List.of(imageFile), new OptionsBuilder().build(), (s) -> {
LOG.info(s);
String substring = s.substring(sb.toString().length(), s.length());
LOG.info(substring);
sb.append(substring);
});
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
assertEquals(sb.toString().trim(), result.getResponse().trim());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
@Test
@Order(3)
void testAskModelWithOptionsAndImageURLs() {
testEndpointReachability();
try {
OllamaResult result =
ollamaAPI.generateWithImageURLs(
config.getImageModel(),
"What is in this image?",
List.of(
"https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg"),
new OptionsBuilder().build());
assertNotNull(result);
assertNotNull(result.getResponse());
assertFalse(result.getResponse().isEmpty());
} catch (IOException | OllamaBaseException | InterruptedException | URISyntaxException e) {
fail(e);
}
}
@Test
@Order(3)
public void testEmbedding() {
testEndpointReachability();
try {
OllamaEmbeddingsRequestModel request = OllamaEmbeddingsRequestBuilder
.getInstance(config.getModel(), "What is the capital of France?").build();
List<Double> embeddings = ollamaAPI.generateEmbeddings(request);
assertNotNull(embeddings);
assertFalse(embeddings.isEmpty());
} catch (IOException | OllamaBaseException | InterruptedException e) {
fail(e);
}
}
}
@Data
class Config {
private String ollamaURL;
private String model;
private String imageModel;
private int requestTimeoutSeconds;
public Config() {
Properties properties = new Properties();
try (InputStream input =
getClass().getClassLoader().getResourceAsStream("test-config.properties")) {
if (input == null) {
throw new RuntimeException("Sorry, unable to find test-config.properties");
}
properties.load(input);
this.ollamaURL = properties.getProperty("ollama.url");
this.model = properties.getProperty("ollama.model");
this.imageModel = properties.getProperty("ollama.model.image");
this.requestTimeoutSeconds =
Integer.parseInt(properties.getProperty("ollama.request-timeout-seconds"));
} catch (IOException e) {
throw new RuntimeException("Error loading properties", e);
}
}
}

View File

@@ -0,0 +1,21 @@
package io.github.ollama4j.samples;
import io.github.ollama4j.tools.annotations.ToolProperty;
import io.github.ollama4j.tools.annotations.ToolSpec;
import java.math.BigDecimal;
public class AnnotatedTool {
@ToolSpec(desc = "Computes the most important constant all around the globe!")
public String computeImportantConstant(@ToolProperty(name = "noOfDigits",desc = "Number of digits that shall be returned") Integer noOfDigits ){
return BigDecimal.valueOf((long)(Math.random()*1000000L),noOfDigits).toString();
}
@ToolSpec(desc = "Says hello to a friend!")
public String sayHello(@ToolProperty(name = "name",desc = "Name of the friend") String name, Integer someRandomProperty, @ToolProperty(name="amountOfHearts",desc = "amount of heart emojis that should be used", required = false) Integer amountOfHearts) {
String hearts = amountOfHearts != null ? "❤️".repeat(amountOfHearts) : "";
return "Hello " + name +" ("+someRandomProperty+") " + hearts;
}
}

View File

@@ -2,6 +2,11 @@ package io.github.ollama4j.unittests;
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.exceptions.OllamaBaseException;
import io.github.ollama4j.exceptions.RoleNotFoundException;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.embeddings.OllamaEmbedRequestModel;
import io.github.ollama4j.models.embeddings.OllamaEmbedResponseModel;
import io.github.ollama4j.models.request.CustomModelRequest;
import io.github.ollama4j.models.response.ModelDetail;
import io.github.ollama4j.models.response.OllamaAsyncResultStreamer;
import io.github.ollama4j.models.response.OllamaResult;
@@ -14,7 +19,9 @@ import java.io.IOException;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import static org.junit.jupiter.api.Assertions.*;
import static org.mockito.Mockito.*;
class TestMockedAPIs {
@@ -46,12 +53,11 @@ class TestMockedAPIs {
@Test
void testCreateModel() {
OllamaAPI ollamaAPI = Mockito.mock(OllamaAPI.class);
-String model = OllamaModelType.LLAMA2;
-String modelFilePath = "FROM llama2\nSYSTEM You are mario from Super Mario Bros.";
+CustomModelRequest customModelRequest = CustomModelRequest.builder().model("mario").from("llama3.2:latest").system("You are Mario from Super Mario Bros.").build();
try {
-doNothing().when(ollamaAPI).createModelWithModelFileContents(model, modelFilePath);
-ollamaAPI.createModelWithModelFileContents(model, modelFilePath);
-verify(ollamaAPI, times(1)).createModelWithModelFileContents(model, modelFilePath);
+doNothing().when(ollamaAPI).createModel(customModelRequest);
+ollamaAPI.createModel(customModelRequest);
+verify(ollamaAPI, times(1)).createModel(customModelRequest);
} catch (IOException | OllamaBaseException | InterruptedException | URISyntaxException e) {
throw new RuntimeException(e);
}
@@ -97,6 +103,34 @@ class TestMockedAPIs {
}
}
@Test
void testEmbed() {
OllamaAPI ollamaAPI = Mockito.mock(OllamaAPI.class);
String model = OllamaModelType.LLAMA2;
List<String> inputs = List.of("some prompt text");
try {
when(ollamaAPI.embed(model, inputs)).thenReturn(new OllamaEmbedResponseModel());
ollamaAPI.embed(model, inputs);
verify(ollamaAPI, times(1)).embed(model, inputs);
} catch (IOException | OllamaBaseException | InterruptedException e) {
throw new RuntimeException(e);
}
}
@Test
void testEmbedWithEmbedRequestModel() {
OllamaAPI ollamaAPI = Mockito.mock(OllamaAPI.class);
String model = OllamaModelType.LLAMA2;
List<String> inputs = List.of("some prompt text");
try {
when(ollamaAPI.embed(new OllamaEmbedRequestModel(model, inputs))).thenReturn(new OllamaEmbedResponseModel());
ollamaAPI.embed(new OllamaEmbedRequestModel(model, inputs));
verify(ollamaAPI, times(1)).embed(new OllamaEmbedRequestModel(model, inputs));
} catch (IOException | OllamaBaseException | InterruptedException e) {
throw new RuntimeException(e);
}
}
@Test
void testAsk() {
OllamaAPI ollamaAPI = Mockito.mock(OllamaAPI.class);
@@ -161,4 +195,68 @@ class TestMockedAPIs {
ollamaAPI.generateAsync(model, prompt, false);
verify(ollamaAPI, times(1)).generateAsync(model, prompt, false);
}
@Test
void testAddCustomRole() {
OllamaAPI ollamaAPI = mock(OllamaAPI.class);
String roleName = "custom-role";
OllamaChatMessageRole expectedRole = OllamaChatMessageRole.newCustomRole(roleName);
when(ollamaAPI.addCustomRole(roleName)).thenReturn(expectedRole);
OllamaChatMessageRole customRole = ollamaAPI.addCustomRole(roleName);
assertEquals(expectedRole, customRole);
verify(ollamaAPI, times(1)).addCustomRole(roleName);
}
@Test
void testListRoles() {
OllamaAPI ollamaAPI = Mockito.mock(OllamaAPI.class);
OllamaChatMessageRole role1 = OllamaChatMessageRole.newCustomRole("role1");
OllamaChatMessageRole role2 = OllamaChatMessageRole.newCustomRole("role2");
List<OllamaChatMessageRole> expectedRoles = List.of(role1, role2);
when(ollamaAPI.listRoles()).thenReturn(expectedRoles);
List<OllamaChatMessageRole> actualRoles = ollamaAPI.listRoles();
assertEquals(expectedRoles, actualRoles);
verify(ollamaAPI, times(1)).listRoles();
}
@Test
void testGetRoleNotFound() {
OllamaAPI ollamaAPI = mock(OllamaAPI.class);
String roleName = "non-existing-role";
try {
when(ollamaAPI.getRole(roleName)).thenThrow(new RoleNotFoundException("Role not found"));
} catch (RoleNotFoundException exception) {
throw new RuntimeException("Failed to run test: testGetRoleNotFound");
}
try {
ollamaAPI.getRole(roleName);
fail("Expected RoleNotFoundException not thrown");
} catch (RoleNotFoundException exception) {
assertEquals("Role not found", exception.getMessage());
}
try {
verify(ollamaAPI, times(1)).getRole(roleName);
} catch (RoleNotFoundException exception) {
throw new RuntimeException("Failed to run test: testGetRoleNotFound");
}
}
@Test
void testGetRoleFound() {
OllamaAPI ollamaAPI = mock(OllamaAPI.class);
String roleName = "existing-role";
OllamaChatMessageRole expectedRole = OllamaChatMessageRole.newCustomRole(roleName);
try {
when(ollamaAPI.getRole(roleName)).thenReturn(expectedRole);
} catch (RoleNotFoundException exception) {
throw new RuntimeException("Failed to run test: testGetRoleFound");
}
try {
OllamaChatMessageRole actualRole = ollamaAPI.getRole(roleName);
assertEquals(expectedRole, actualRole);
verify(ollamaAPI, times(1)).getRole(roleName);
} catch (RoleNotFoundException exception) {
throw new RuntimeException("Failed to run test: testGetRoleFound");
}
}
}

View File

@@ -1,8 +1,10 @@
package io.github.ollama4j.unittests.jackson;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrowsExactly;
import java.io.File;
import java.util.Collections;
import java.util.List;
import io.github.ollama4j.models.chat.OllamaChatRequest;
@@ -41,7 +43,7 @@ public class TestChatRequestSerialization extends AbstractSerializationTest<Olla
@Test
public void testRequestWithMessageAndImage() {
OllamaChatRequest req = builder.withMessage(OllamaChatMessageRole.USER, "Some prompt",
OllamaChatRequest req = builder.withMessage(OllamaChatMessageRole.USER, "Some prompt", Collections.emptyList(),
List.of(new File("src/test/resources/dog-on-a-boat.jpg"))).build();
String jsonRequest = serialize(req);
assertEqualsAfterUnmarshalling(deserialize(jsonRequest, OllamaChatRequest.class), req);
@@ -59,6 +61,10 @@ public class TestChatRequestSerialization extends AbstractSerializationTest<Olla
.withOptions(b.setSeed(1).build())
.withOptions(b.setTopK(1).build())
.withOptions(b.setTopP(1).build())
.withOptions(b.setMinP(1).build())
.withOptions(b.setCustomOption("cust_float", 1.0f).build())
.withOptions(b.setCustomOption("cust_int", 1).build())
.withOptions(b.setCustomOption("cust_str", "custom").build())
.build();
String jsonRequest = serialize(req);
@@ -72,6 +78,20 @@ public class TestChatRequestSerialization extends AbstractSerializationTest<Olla
assertEquals(1, deserializeRequest.getOptions().get("seed"));
assertEquals(1, deserializeRequest.getOptions().get("top_k"));
assertEquals(1.0, deserializeRequest.getOptions().get("top_p"));
assertEquals(1.0, deserializeRequest.getOptions().get("min_p"));
assertEquals(1.0, deserializeRequest.getOptions().get("cust_float"));
assertEquals(1, deserializeRequest.getOptions().get("cust_int"));
assertEquals("custom", deserializeRequest.getOptions().get("cust_str"));
}
@Test
public void testRequestWithInvalidCustomOption() {
OptionsBuilder b = new OptionsBuilder();
assertThrowsExactly(IllegalArgumentException.class, () -> {
OllamaChatRequest req = builder.withMessage(OllamaChatMessageRole.USER, "Some prompt")
.withOptions(b.setCustomOption("cust_obj", new Object()).build())
.build();
});
}
@Test

View File

@@ -1,36 +1,37 @@
package io.github.ollama4j.unittests.jackson;
import static org.junit.jupiter.api.Assertions.assertEquals;
+import io.github.ollama4j.models.embeddings.OllamaEmbedRequestBuilder;
+import io.github.ollama4j.models.embeddings.OllamaEmbedRequestModel;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
-import io.github.ollama4j.models.embeddings.OllamaEmbeddingsRequestModel;
-import io.github.ollama4j.models.embeddings.OllamaEmbeddingsRequestBuilder;
import io.github.ollama4j.utils.OptionsBuilder;
-public class TestEmbeddingsRequestSerialization extends AbstractSerializationTest<OllamaEmbeddingsRequestModel> {
+public class TestEmbedRequestSerialization extends AbstractSerializationTest<OllamaEmbedRequestModel> {
-private OllamaEmbeddingsRequestBuilder builder;
+private OllamaEmbedRequestBuilder builder;
@BeforeEach
public void init() {
-builder = OllamaEmbeddingsRequestBuilder.getInstance("DummyModel","DummyPrompt");
+builder = OllamaEmbedRequestBuilder.getInstance("DummyModel","DummyPrompt");
}
@Test
public void testRequestOnlyMandatoryFields() {
-OllamaEmbeddingsRequestModel req = builder.build();
+OllamaEmbedRequestModel req = builder.build();
String jsonRequest = serialize(req);
-assertEqualsAfterUnmarshalling(deserialize(jsonRequest,OllamaEmbeddingsRequestModel.class), req);
+assertEqualsAfterUnmarshalling(deserialize(jsonRequest,OllamaEmbedRequestModel.class), req);
}
@Test
public void testRequestWithOptions() {
OptionsBuilder b = new OptionsBuilder();
-OllamaEmbeddingsRequestModel req = builder
+OllamaEmbedRequestModel req = builder
.withOptions(b.setMirostat(1).build()).build();
String jsonRequest = serialize(req);
-OllamaEmbeddingsRequestModel deserializeRequest = deserialize(jsonRequest,OllamaEmbeddingsRequestModel.class);
+OllamaEmbedRequestModel deserializeRequest = deserialize(jsonRequest,OllamaEmbedRequestModel.class);
assertEqualsAfterUnmarshalling(deserializeRequest, req);
assertEquals(1, deserializeRequest.getOptions().get("mirostat"));
}

Binary file not shown (new image added; 5.1 KiB)

View File

@@ -1,4 +1,4 @@
ollama.url=http://localhost:11434
-ollama.model=qwen:0.5b
-ollama.model.image=llava
+ollama.model=llama3.2:1b
+ollama.model.image=llava:latest
ollama.request-timeout-seconds=120