
Ollama4j

A Java wrapper for Ollama APIs.

Start the Ollama server in a Docker container:

docker run -v ~/ollama:/root/.ollama -p 11434:11434 ollama/ollama
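Before wiring up the Java client, you can optionally confirm that the server is reachable. The sketch below uses only the JDK's built-in HTTP client to hit the root endpoint of the local server started above; a running Ollama instance should answer with HTTP 200 and a short status message.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaHealthCheck {
    public static void main(String[] args) throws Exception {
        // GET the root endpoint of the local Ollama server (default port 11434).
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/"))
                .GET()
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        // A running server replies with status 200 and a short status body.
        System.out.println(response.statusCode() + ": " + response.body());
    }
}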

Post a question to Ollama using Ollama4j:

// NOTE: add the ollama4j imports for OllamaAPI, OllamaModel and
// OllamaAsyncResultCallback that match your version of the library.
public class Main {
    public static void main(String[] args) throws Exception {
        // Point the client at the local Ollama server started above.
        String host = "http://localhost:11434/";
        OllamaAPI ollamaAPI = new OllamaAPI(host);

        // Pull the llama2 model if it is not already available locally.
        ollamaAPI.pullModel(OllamaModel.LLAMA2);

        // Ask the question asynchronously and poll until the result is ready.
        OllamaAsyncResultCallback ollamaAsyncResultCallback =
                ollamaAPI.runAsync(OllamaModel.LLAMA2, "Who are you?");
        while (true) {
            if (ollamaAsyncResultCallback.isComplete()) {
                System.out.println(ollamaAsyncResultCallback.getResponse());
                break;
            }
            // Wait a second before checking again.
            Thread.sleep(1000);
        }
    }
}

You'd then get a response from Ollama:

I am LLaMA, an AI assistant developed by Meta AI that can understand and respond to human input in a conversational manner. I am trained on a massive dataset of text from the internet and can generate human-like responses to a wide range of topics and questions. I can be used to create chatbots, virtual assistants, and other applications that require natural language understanding and generation capabilities.
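Under the hood, the wrapper talks to Ollama's REST API over HTTP. If you want to see roughly what a generation request looks like without the library, here is a sketch that posts the same prompt directly to the /api/generate endpoint. The JSON here is hand-rolled for brevity; the field names follow Ollama's documented API, and "stream": false asks for a single consolidated JSON reply instead of a token stream.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RawGenerateCall {
    public static void main(String[] args) throws Exception {
        // The same question, sent straight to Ollama's /api/generate endpoint.
        String body = "{\"model\": \"llama2\", \"prompt\": \"Who are you?\", \"stream\": false}";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The generated text is in the "response" field of the returned JSON.
        System.out.println(response.body());
    }
}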