Ollama4j

A Java wrapper for Ollama APIs.

Prerequisites:

  • Docker
  • Java 8+

Start Ollama Container:

docker run -v ~/ollama:/root/.ollama -p 11434:11434 ollama/ollama
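Before calling the API, you can confirm the container is reachable. The sketch below is a hypothetical helper (not part of Ollama4j) that issues a plain HTTP GET against the Ollama root endpoint, which a running server answers with HTTP 200:

```java
// Sketch: check that the Ollama server is reachable before using the API.
// Hypothetical helper, not part of Ollama4j; assumes the default host/port.
import java.net.HttpURLConnection;
import java.net.URL;

public class OllamaHealthCheck {
    public static boolean isOllamaUp(String host, int timeoutMillis) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(host).openConnection();
            conn.setConnectTimeout(timeoutMillis);
            conn.setReadTimeout(timeoutMillis);
            int status = conn.getResponseCode(); // 200 when the server is up
            conn.disconnect();
            return status == 200;
        } catch (Exception e) {
            return false; // connection refused or timed out: server not reachable
        }
    }
}
```

For example, `OllamaHealthCheck.isOllamaUp("http://localhost:11434/", 2000)` should return true once the container above has started.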

Submit a question to Ollama using Ollama4j:

String host = "http://localhost:11434/";

OllamaAPI ollamaAPI = new OllamaAPI(host);

ollamaAPI.pullModel(OllamaModel.LLAMA2); // downloads the model if it is not already present

OllamaAsyncResultCallback ollamaAsyncResultCallback =
        ollamaAPI.runAsync(OllamaModel.LLAMA2, "Who are you?");

// Poll until the asynchronous call completes. Thread.sleep can throw
// InterruptedException, so declare or handle it in the enclosing method.
while (!ollamaAsyncResultCallback.isComplete()) {
    Thread.sleep(1000);
}
System.out.println(ollamaAsyncResultCallback.getResponse());
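The polling loop above can be factored into a small reusable helper. This is a sketch of one way to do it (a hypothetical utility, not part of Ollama4j), parameterized over a completion check and a result supplier:

```java
// Sketch: a generic poll-until-complete helper (hypothetical, not part of Ollama4j).
import java.util.function.BooleanSupplier;
import java.util.function.Supplier;

public class PollUtil {
    // Sleeps pollMillis between checks until done reports true,
    // then returns the value produced by result.
    public static <T> T awaitResult(BooleanSupplier done, Supplier<T> result, long pollMillis)
            throws InterruptedException {
        while (!done.getAsBoolean()) {
            Thread.sleep(pollMillis);
        }
        return result.get();
    }
}
```

With such a helper, the example above would reduce to something like `PollUtil.awaitResult(ollamaAsyncResultCallback::isComplete, ollamaAsyncResultCallback::getResponse, 1000)`.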

You'd then get a response from Ollama similar to:

I am LLaMA, an AI assistant developed by Meta AI that can understand and respond to human input in a conversational manner. I am trained on a massive dataset of text from the internet and can generate human-like responses to a wide range of topics and questions. I can be used to create chatbots, virtual assistants, and other applications that require natural language understanding and generation capabilities.