Ollama4j

A Java library (wrapper) for Ollama APIs.
Installation
In your Maven project, add this dependency to your pom.xml:
<dependency>
    <groupId>io.github.amithkoujalgi</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>
Since this version is a snapshot, you also need to add the Maven repository to pull the ollama4j library from. Include this in your pom.xml:
<repositories>
    <repository>
        <id>ollama4j-from-ossrh</id>
        <url>https://s01.oss.sonatype.org/content/repositories/snapshots</url>
    </repository>
</repositories>
Build:
Build your project to resolve the dependencies:
mvn clean install -U
You can then use the Ollama Java APIs by importing ollama4j:
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;
Try out the APIs
For the simplest way to get started, I prefer to use the Ollama Docker setup.
Start the Ollama docker container:
docker run -v ~/ollama:/root/.ollama -p 11434:11434 ollama/ollama
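Before pulling any models, you can confirm the server is reachable; the root endpoint of a running Ollama server replies with "Ollama is running". This is a minimal check using the standard Java 11 HttpClient, not part of ollama4j:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaHealthCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/"))
                .GET()
                .build();
        // A healthy Ollama server answers with HTTP 200 and "Ollama is running"
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + ": " + response.body());
    }
}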
Pull a model:
public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Downloads the model onto the Ollama server if it is not already present
        ollamaAPI.pullModel(OllamaModel.LLAMA2);
    }
}
Find the list of available models from Ollama here.
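Once a model has been pulled, you can also check which models are present on the server. This is a rough sketch that assumes ollama4j exposes a listModels() method whose entries have a getName() accessor; check the Javadoc linked below if the actual signature differs:
public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Print the name of every model currently available on the Ollama server
        ollamaAPI.listModels().forEach(model -> System.out.println(model.getName()));
    }
}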
Ask a question to the model with ollama4j
Using sync API:
public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Blocks until the model has generated the complete response
        String response = ollamaAPI.runSync(OllamaModel.LLAMA2, "Who are you?");
        System.out.println(response);
    }
}
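Since the sync call blocks until the model answers, a single OllamaAPI instance can be reused for several questions in a row. Here is a small sketch built only from the runSync call above plus a standard Scanner loop:
import java.util.Scanner;

public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        Scanner scanner = new Scanner(System.in);
        System.out.print("Ask something (or 'exit'): ");
        while (scanner.hasNextLine()) {
            String question = scanner.nextLine();
            if ("exit".equalsIgnoreCase(question)) {
                break;
            }
            // Each call blocks until the model has produced its full answer
            System.out.println(ollamaAPI.runSync(OllamaModel.LLAMA2, question));
            System.out.print("Ask something (or 'exit'): ");
        }
    }
}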
Using async API:
public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Returns immediately; the response is generated in the background
        OllamaAsyncResultCallback ollamaAsyncResultCallback = ollamaAPI.runAsync(OllamaModel.LLAMA2, "Who are you?");
        while (true) {
            if (ollamaAsyncResultCallback.isComplete()) {
                System.out.println(ollamaAsyncResultCallback.getResponse());
                break;
            }
            // Check for completion once a second instead of busy-waiting
            Thread.sleep(1000);
        }
    }
}
You'd then get a response from the model:
I am LLaMA, an AI assistant developed by Meta AI that can understand and respond to human input in a conversational manner. I am trained on a massive dataset of text from the internet and can generate human-like responses to a wide range of topics and questions. I can be used to create chatbots, virtual assistants, and other applications that require natural language understanding and generation capabilities.
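Instead of polling inline, you can wrap the wait in a small helper. The sketch below uses only the isComplete() and getResponse() calls from the async example; the awaitResponse helper and its timeout parameter are just an illustration, not part of the library:
public class AsyncExample {
    // Polls once per second until the result is ready or maxWaitSeconds is reached;
    // returns null if the response did not arrive in time.
    static String awaitResponse(OllamaAsyncResultCallback callback, int maxWaitSeconds) throws InterruptedException {
        for (int waited = 0; waited < maxWaitSeconds; waited++) {
            if (callback.isComplete()) {
                return callback.getResponse();
            }
            Thread.sleep(1000);
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        OllamaAsyncResultCallback callback = ollamaAPI.runAsync(OllamaModel.LLAMA2, "Who are you?");
        System.out.println(awaitResponse(callback, 120));
    }
}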
Try asking a question on general topics:
public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        String prompt = "Give me a list of world cup cricket teams.";
        String response = ollamaAPI.ask(OllamaModelType.LLAMA2, prompt);
        System.out.println(response);
    }
}
You'd then get a response from the model:
Certainly! Here are the 10 teams that have qualified for the ICC World Cup in cricket:
- Australia
- Bangladesh
- England and Wales
- India
- Ireland
- New Zealand
- Pakistan
- South Africa
- Sri Lanka
- West Indies
These teams will compete in the ICC World Cup 2019, which will be held in England and Wales from May 30 to July 14, 2019.
Try asking for a database query for your data schema:
public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Wraps the question with a sample database schema bundled in the library
        String prompt = SamplePrompts.getSampleDatabasePromptWithQuestion("List all customer names who have bought one or more products");
        String response = ollamaAPI.ask(OllamaModelType.SQLCODER, prompt);
        System.out.println(response);
    }
}
Note: Here I've used a sample prompt containing a database schema from within this library for demonstration purposes.
You'd then get a response from the model:
SELECT customers.name
FROM sales
JOIN customers ON sales.customer_id = customers.customer_id
GROUP BY customers.name;
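If your schema differs from the sample bundled with the library, you can assemble the prompt yourself and pass it to the same ask(...) call. The schema and question below are made-up placeholders for illustration:
public class Main {
    public static void main(String[] args) throws Exception {
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Hypothetical schema; replace with your own DDL
        String schema = "CREATE TABLE customers (customer_id INT, name VARCHAR(100));\n"
                + "CREATE TABLE orders (order_id INT, customer_id INT, amount DECIMAL);";
        String question = "What is the total order amount per customer?";
        String prompt = "Given the following database schema:\n" + schema
                + "\nWrite an SQL query to answer this question:\n" + question;
        String response = ollamaAPI.ask(OllamaModelType.SQLCODER, prompt);
        System.out.println(response);
    }
}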
API Spec
Find the full Javadoc (API specifications) here.
Areas of improvement
- Use Java naming conventions for attributes in the request/response models instead of snake_case (possibly with Jackson's @JsonProperty; see the sketch below)
- Fix deprecated HTTP client code
- Set up logging
- Add test cases
- Handle exceptions better (maybe throw more appropriate exceptions)
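For the first item, this is roughly what mapping a snake_case API field onto a conventionally named Java attribute with Jackson could look like; the field name is illustrative and not taken from the library's actual model classes:
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ModelInfo {
    // Maps the snake_case JSON key to a camelCase Java field
    @JsonProperty("modified_at")
    private String modifiedAt;

    public String getModifiedAt() {
        return modifiedAt;
    }

    public void setModifiedAt(String modifiedAt) {
        this.modifiedAt = modifiedAt;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"modified_at\": \"2023-11-01T10:00:00Z\"}";
        ModelInfo info = new ObjectMapper().readValue(json, ModelInfo.class);
        System.out.println(info.getModifiedAt());
    }
}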
Get Involved
Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping with code - any sort of contribution is much appreciated.