
Ollama4j


A Java library (wrapper/binding) for the Ollama server.

Find more details on the website.



How does it work?

  flowchart LR
    o4j[Ollama4j]
    o[Ollama Server]
    o4j -->|Communicates with| o;
    m[Models]
    subgraph Ollama Deployment
        direction TB
        o -->|Manages| m
    end
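Under the hood, "communicates with" in the diagram above means HTTP calls to the Ollama server's REST API. As a rough, library-free sketch of the kind of request ollama4j issues, here is a plain `java.net.http` example that builds (but does not send) a POST to the `/api/generate` endpoint; the class and helper names here are ours, not part of ollama4j:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class OllamaRequestSketch {
    // JSON body for Ollama's /api/generate endpoint (non-streaming).
    static String generateBody(String model, String prompt) {
        return "{\"model\":\"" + model + "\",\"prompt\":\"" + prompt + "\",\"stream\":false}";
    }

    // Builds (but does not send) the POST request a client would issue.
    static HttpRequest generateRequest(String host, String model, String prompt) {
        return HttpRequest.newBuilder()
                .uri(URI.create(host + "/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(generateBody(model, prompt)))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = generateRequest("http://localhost:11434", "llama3", "Why is the sky blue?");
        System.out.println(req.method() + " " + req.uri());
        // Actually sending it requires a running Ollama server, e.g.:
        // HttpClient.newHttpClient().send(req, HttpResponse.BodyHandlers.ofString());
    }
}
```

ollama4j wraps this kind of plumbing behind typed Java APIs, so you never have to assemble request bodies by hand.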

Requirements

Java

Ollama server (install using any one of the options below)

macOS

https://ollama.com/download/Ollama-darwin.zip

Linux

curl -fsSL https://ollama.com/install.sh | sh

Windows

https://ollama.com/download/OllamaSetup.exe

CPU only

docker run -d -p 11434:11434 \
  -v ollama:/root/.ollama \
  --name ollama \
  ollama/ollama

NVIDIA GPU

docker run -d -p 11434:11434 \
  --gpus=all \
  -v ollama:/root/.ollama \
  --name ollama \
  ollama/ollama

Installation

Note

We are now publishing the artifacts to both Maven Central and GitHub Packages.

Track the releases here and update the dependency version according to your requirements.

For Maven

Using Maven Central

In your Maven project, add this dependency:


<dependency>
    <groupId>io.github.ollama4j</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.78</version>
</dependency>

Using GitHub's Maven Package Repository

  1. Add GitHub Maven Packages repository to your project's pom.xml or your settings.xml:

<repositories>
    <repository>
        <id>github</id>
        <name>GitHub Apache Maven Packages</name>
        <url>https://maven.pkg.github.com/ollama4j/ollama4j</url>
        <releases>
            <enabled>true</enabled>
        </releases>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>
  2. Add the GitHub server entry to your settings.xml (usually available at ~/.m2/settings.xml):

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                      http://maven.apache.org/xsd/settings-1.0.0.xsd">
    <servers>
        <server>
            <id>github</id>
            <username>YOUR-USERNAME</username>
            <password>YOUR-TOKEN</password>
        </server>
    </servers>
</settings>
  3. In your Maven project, add this dependency:

<dependency>
    <groupId>io.github.ollama4j</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.78</version>
</dependency>

For Gradle

  1. Add the dependency to your build.gradle:

dependencies {
    implementation 'io.github.ollama4j:ollama4j:1.0.78'
}
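With the dependency in place, a first call could look roughly like the sketch below. The class and method names (`OllamaAPI`, `ping()`) are taken from the library as we understand it at version 1.0.78 and should be verified against the full API spec on the website; running it also requires a live Ollama server on the default port:

```java
import io.github.ollama4j.OllamaAPI;

public class QuickStart {
    public static void main(String[] args) throws Exception {
        // Point the client at a running Ollama server (default port 11434)
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        // Verify the server is reachable before making model calls
        boolean reachable = ollamaAPI.ping();
        System.out.println("Ollama server reachable: " + reachable);
    }
}
```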

API Spec

Tip

Find the full API specifications on the website.

Development

Build:

make build

Run unit tests:

make unit-tests

Run integration tests:

make integration-tests

Releases

Newer artifacts are published via the GitHub Actions CI workflow when a new release is created from the main branch.

Who's using Ollama4j?

Traction

Star History Chart

Areas of improvement

  • Use Java naming conventions for attributes in the request/response models instead of snake_case (possibly with Jackson's @JsonProperty)
  • Fix deprecated HTTP client code
  • Setup logging
  • Use lombok
  • Update request body creation with Java objects
  • Async APIs for images
  • Support for function calling with models like Mistral
    • generate in sync mode
    • generate in async mode
  • Add custom headers to requests
  • Add additional params for ask APIs such as:
    • options: additional model parameters for the Modelfile such as temperature - Supported params.
    • system: system prompt (overrides what is defined in the Modelfile)
    • template: the full prompt or prompt template (overrides what is defined in the Modelfile)
    • context: the context parameter returned from a previous request, which can be used to keep a short conversational memory
    • stream: Add support for streaming responses from the model
  • Add test cases
  • Handle exceptions better (maybe throw more appropriate exceptions)
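On the `context` item above: Ollama's /api/generate response includes a `context` token array which, when echoed back in the next request, gives the model short-term conversational memory. A stdlib-only sketch of how such a follow-up request body could be assembled (field names per Ollama's REST API; the helper class is hypothetical):

```java
import java.util.List;
import java.util.stream.Collectors;

public class ContextChaining {
    // Builds a follow-up /api/generate body that echoes back the
    // `context` token array returned by the previous response.
    static String followUpBody(String model, String prompt, List<Integer> context) {
        String ctx = context.stream().map(String::valueOf)
                .collect(Collectors.joining(","));
        return "{\"model\":\"" + model + "\",\"prompt\":\"" + prompt
                + "\",\"context\":[" + ctx + "]}";
    }

    public static void main(String[] args) {
        // In real use, the List here would come from the previous
        // response's "context" field rather than being hard-coded.
        String body = followUpBody("llama3", "And in winter?", List.of(1, 2, 3));
        System.out.println(body);
        // {"model":"llama3","prompt":"And in winter?","context":[1,2,3]}
    }
}
```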

Get Involved


Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping with code - any sort of contribution is much appreciated.

References

Credits

The nomenclature and the icon have been adopted from the incredible Ollama project.

Thanks to the amazing contributors

Appreciate my work?

Buy Me A Coffee
