Ollama4j

A Java library (wrapper) for Ollama APIs.

Requirements

Java installed on your development machine, and a running Ollama server to talk to (local or remote - the Docker setup below works fine).

Install

In your Maven project, add this dependency:


<dependency>
    <groupId>io.github.amithkoujalgi</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>

Since this is a snapshot version, you also need to add the OSSRH snapshot repository to your pom.xml so that Maven can resolve the ollama4j artifact:


<repositories>
    <repository>
        <id>ollama4j-from-ossrh</id>
        <url>https://s01.oss.sonatype.org/content/repositories/snapshots</url>
    </repository>
</repositories>

Build

Build your project to resolve the dependencies:

mvn clean install
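
Since the dependency is a snapshot, Maven may hang on to a stale cached copy. If the build fails to resolve ollama4j, force a snapshot update with Maven's standard -U flag:

mvn -U clean install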

You can then use the Ollama Java APIs by importing ollama4j:

import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

Try out the APIs

The simplest way to get started is to use the Ollama Docker setup.

Start the Ollama Docker container:

docker run -v ~/ollama:/root/.ollama -p 11434:11434 ollama/ollama
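
To check that the server is up before wiring in the library, hit the API root; a running Ollama server replies with "Ollama is running":

curl http://localhost:11434/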

Pull a model:

import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Downloads the model onto the Ollama server if it isn't already present
        ollamaAPI.pullModel(OllamaModel.LLAMA2);
    }
}

Find the list of available models from Ollama here.
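
To confirm that the pull worked, you can ask the server which models it holds locally. Here is a minimal sketch assuming the library exposes a listModels() call - check the Javadoc linked below for the exact name and signature:

import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Assumption: listModels() returns the models currently available
        // on the Ollama server
        ollamaAPI.listModels().forEach(System.out::println);
    }
}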

Ask a question to the model with ollama4j

Using sync API:

import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Blocks until the model has generated the complete response
        String response = ollamaAPI.runSync(OllamaModel.LLAMA2, "Who are you?");
        System.out.println(response);
    }
}
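
The sync call is the simplest option when all you need is the final answer; if you want the calling thread free while the model generates, use the async variant below.
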
Using async API:

import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Returns immediately; the callback object tracks the request status
        OllamaAsyncResultCallback ollamaAsyncResultCallback = ollamaAPI.runAsync(OllamaModel.LLAMA2, "Who are you?");
        // Poll until the model has finished generating, sleeping between
        // checks so the loop doesn't spin on the CPU
        while (!ollamaAsyncResultCallback.isComplete()) {
            Thread.sleep(1000);
        }
        System.out.println(ollamaAsyncResultCallback.getResponse());
    }
}
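
Because runAsync hands back the result object immediately, your application can do useful work between polls instead of blocking on the HTTP call.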

You'd then get a response from the model:

I am LLaMA, an AI assistant developed by Meta AI that can understand and respond to human input in a conversational manner. I am trained on a massive dataset of text from the internet and can generate human-like responses to a wide range of topics and questions. I can be used to create chatbots, virtual assistants, and other applications that require natural language understanding and generation capabilities.

Find the full Javadoc (API specifications) here.

Get Involved

Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping with code - any sort of contribution is much appreciated.
