Class OllamaAPI
java.lang.Object
io.github.amithkoujalgi.ollama4j.core.OllamaAPI
The base Ollama API class.
-
Constructor Summary

Constructors:
OllamaAPI(String host)
    Instantiates the Ollama API.
-
Method Summary

OllamaChatResult chat(OllamaChatRequestModel request)
    Ask a question to a model using an OllamaChatRequestModel.
OllamaChatResult chat(OllamaChatRequestModel request, OllamaStreamHandler streamHandler)
    Ask a question to a model using an OllamaChatRequestModel.
OllamaChatResult chat(String model, List<OllamaChatMessage> messages)
    Ask a question to a model based on a given message stack (i.e. a chat history).
void createModelWithFilePath(String modelName, String modelFilePath)
    Create a custom model from a model file.
void createModelWithModelFileContents(String modelName, String modelFileContents)
    Create a custom model from a model file.
void deleteModel(String modelName, boolean ignoreIfNotPresent)
    Delete a model from the Ollama server.
OllamaResult generate(String model, String prompt, Options options)
    Convenience method to call the Ollama API without streaming responses.
OllamaResult generate(String model, String prompt, Options options, OllamaStreamHandler streamHandler)
    Generate a response for a question to a model running on the Ollama server.
OllamaAsyncResultCallback generateAsync(String model, String prompt)
    Generate a response for a question to a model running on the Ollama server and get a callback handle that can be used to check for status and get the response from the model later.
List<Double> generateEmbeddings(OllamaEmbeddingsRequestModel modelRequest)
    Generate embeddings using an OllamaEmbeddingsRequestModel.
List<Double> generateEmbeddings(String model, String prompt)
    Generate embeddings for a given text from a model.
OllamaResult generateWithImageFiles(String model, String prompt, List<File> imageFiles, Options options)
    Convenience method to call the Ollama API without streaming responses.
OllamaResult generateWithImageFiles(String model, String prompt, List<File> imageFiles, Options options, OllamaStreamHandler streamHandler)
    With one or more image files, ask a question to a model running on the Ollama server.
OllamaResult generateWithImageURLs(String model, String prompt, List<String> imageURLs, Options options)
    Convenience method to call the Ollama API without streaming responses.
OllamaResult generateWithImageURLs(String model, String prompt, List<String> imageURLs, Options options, OllamaStreamHandler streamHandler)
    With one or more image URLs, ask a question to a model running on the Ollama server.
ModelDetail getModelDetails(String modelName)
    Gets model details from the Ollama server.
List<Model> listModels()
    List available models from the Ollama server.
boolean ping()
    API to check the reachability of the Ollama server.
void pullModel(String modelName)
    Pull a model on the Ollama server from the list of available models.
void setBasicAuth(String username, String password)
    Set basic authentication for accessing an Ollama server that's behind a reverse-proxy/gateway.
void setRequestTimeoutSeconds(long requestTimeoutSeconds)
    Set request timeout in seconds.
void setVerbose(boolean verbose)
    Set/unset logging of responses.
-
Constructor Details
-
OllamaAPI
public OllamaAPI(String host)
Instantiates the Ollama API.
Parameters:
host - the host address of the Ollama server
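A minimal instantiation sketch (the host URL below is a placeholder; point it at wherever your Ollama server listens, with 11434 being Ollama's default port):

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class InstantiateExample {
    public static void main(String[] args) {
        // Adjust host and port to your own Ollama deployment.
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        System.out.println("Client created");
    }
}
```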
-
-
Method Details
-
setRequestTimeoutSeconds
public void setRequestTimeoutSeconds(long requestTimeoutSeconds)
Set request timeout in seconds. Default is 3 seconds.
Parameters:
requestTimeoutSeconds - the request timeout in seconds
-
setVerbose
public void setVerbose(boolean verbose)
Set/unset logging of responses.
Parameters:
verbose - true/false
-
setBasicAuth
public void setBasicAuth(String username, String password)
Set basic authentication for accessing an Ollama server that's behind a reverse-proxy/gateway.
Parameters:
username - the username
password - the password
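The three setters above are typically applied right after construction. A sketch (the timeout and credentials are placeholder values):

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class ConfigExample {
    public static void main(String[] args) {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        ollamaAPI.setRequestTimeoutSeconds(60); // raise from the 3-second default for slower models
        ollamaAPI.setVerbose(true);             // log responses
        // Only needed when the server sits behind a reverse-proxy/gateway:
        ollamaAPI.setBasicAuth("admin", "changeme");
    }
}
```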
-
ping
public boolean ping()
API to check the reachability of the Ollama server.
Returns:
true if the server is reachable, false otherwise
-
listModels
public List<Model> listModels() throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
List available models from the Ollama server.
Returns:
the list of models
- Throws:
OllamaBaseException
IOException
InterruptedException
URISyntaxException
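ping() and listModels() combine naturally into a quick health check. A sketch, assuming the Model class lives under the core models package and exposes a getName() accessor (verify both against your ollama4j version):

```java
import java.util.List;
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;
import io.github.amithkoujalgi.ollama4j.core.models.Model; // package path assumed

public class ListModelsExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        if (ollamaAPI.ping()) {
            List<Model> models = ollamaAPI.listModels();
            for (Model model : models) {
                System.out.println(model.getName()); // accessor name assumed
            }
        }
    }
}
```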
-
pullModel
public void pullModel(String modelName) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
Pull a model on the Ollama server from the list of available models.
Parameters:
modelName - the name of the model
Throws:
OllamaBaseException
IOException
URISyntaxException
InterruptedException
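A pull sketch ("llama2" is just an example model name):

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class PullModelExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        // Blocks until the model has been pulled onto the server.
        ollamaAPI.pullModel("llama2");
    }
}
```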
-
getModelDetails
public ModelDetail getModelDetails(String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException
Gets model details from the Ollama server.
Parameters:
modelName - the name of the model
Returns:
the model details
- Throws:
IOException
OllamaBaseException
InterruptedException
URISyntaxException
-
createModelWithFilePath
public void createModelWithFilePath(String modelName, String modelFilePath) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
Create a custom model from a model file. Read more about custom model file creation in the Ollama documentation.
Parameters:
modelName - the name of the custom model to be created
modelFilePath - the path to the model file that exists on the Ollama server
Throws:
IOException
InterruptedException
OllamaBaseException
URISyntaxException
-
createModelWithModelFileContents
public void createModelWithModelFileContents(String modelName, String modelFileContents) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
Create a custom model from a model file. Read more about custom model file creation in the Ollama documentation.
Parameters:
modelName - the name of the custom model to be created
modelFileContents - the contents of the model file
Throws:
IOException
InterruptedException
OllamaBaseException
URISyntaxException
-
deleteModel
public void deleteModel(String modelName, boolean ignoreIfNotPresent) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
Delete a model from the Ollama server.
Parameters:
modelName - the name of the model to be deleted
ignoreIfNotPresent - ignore errors if the specified model is not present on the Ollama server
Throws:
IOException
InterruptedException
OllamaBaseException
URISyntaxException
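createModelWithModelFileContents(...) and deleteModel(...) pair up for short-lived custom models. A sketch (the model name and Modelfile contents are illustrative):

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class CustomModelExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        String modelFileContents = "FROM llama2\nSYSTEM You are a concise assistant.";
        ollamaAPI.createModelWithModelFileContents("my-model", modelFileContents);
        // ... use the model ...
        // true: don't fail if the model was already removed.
        ollamaAPI.deleteModel("my-model", true);
    }
}
```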
-
generateEmbeddings
public List<Double> generateEmbeddings(String model, String prompt) throws IOException, InterruptedException, OllamaBaseException
Generate embeddings for a given text from a model.
Parameters:
model - the name of the model to generate embeddings from
prompt - the text to generate embeddings for
Returns:
the embeddings
- Throws:
IOException
InterruptedException
OllamaBaseException
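An embeddings sketch ("llama2" is a placeholder model name):

```java
import java.util.List;
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class EmbeddingsExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        List<Double> embeddings =
                ollamaAPI.generateEmbeddings("llama2", "Why is the sky blue?");
        System.out.println("dimensions: " + embeddings.size());
    }
}
```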
-
generateEmbeddings
public List<Double> generateEmbeddings(OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
Generate embeddings using an OllamaEmbeddingsRequestModel.
Parameters:
modelRequest - request for the '/api/embeddings' endpoint
Returns:
the embeddings
- Throws:
IOException
InterruptedException
OllamaBaseException
-
generate
public OllamaResult generate(String model, String prompt, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
Generate a response for a question to a model running on the Ollama server. This is a sync/blocking call.
Parameters:
model - the Ollama model to ask the question to
prompt - the prompt/question text
options - the Options object; see the Ollama documentation for more details on the available options
streamHandler - optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false.
Returns:
an OllamaResult that includes the response text and the time taken for the response
- Throws:
OllamaBaseException
IOException
InterruptedException
-
generate
public OllamaResult generate(String model, String prompt, Options options) throws OllamaBaseException, IOException, InterruptedException
Convenience method to call the Ollama API without streaming responses. Uses generate(String, String, Options, OllamaStreamHandler).
-
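A blocking-generation sketch, assuming Options can be built with an OptionsBuilder and OllamaResult exposes getResponse() (both names and package paths are assumptions; verify against your ollama4j version):

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;
import io.github.amithkoujalgi.ollama4j.core.models.OllamaResult;  // package path assumed
import io.github.amithkoujalgi.ollama4j.core.utils.OptionsBuilder; // package path assumed

public class GenerateExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        OllamaResult result = ollamaAPI.generate(
                "llama2", "Why is the sky blue?", new OptionsBuilder().build());
        System.out.println(result.getResponse()); // accessor name assumed
    }
}
```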
generateAsync
public OllamaAsyncResultCallback generateAsync(String model, String prompt)
Generate a response for a question to a model running on the Ollama server and get a callback handle that can be used to check for status and get the response from the model later. This is an async/non-blocking call.
Parameters:
model - the Ollama model to ask the question to
prompt - the prompt/question text
Returns:
the Ollama async result callback handle
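An async sketch; the methods on the returned handle (isComplete(), getResponse()) and its package path are assumptions about the OllamaAsyncResultCallback API, so check the actual class before relying on them:

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;
import io.github.amithkoujalgi.ollama4j.core.models.OllamaAsyncResultCallback; // package path assumed

public class GenerateAsyncExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        OllamaAsyncResultCallback handle =
                ollamaAPI.generateAsync("llama2", "Why is the sky blue?");
        // Poll until the model has finished responding (method names assumed).
        while (!handle.isComplete()) {
            Thread.sleep(100);
        }
        System.out.println(handle.getResponse());
    }
}
```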
-
generateWithImageFiles
public OllamaResult generateWithImageFiles(String model, String prompt, List<File> imageFiles, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
With one or more image files, ask a question to a model running on the Ollama server. This is a sync/blocking call.
Parameters:
model - the Ollama model to ask the question to
prompt - the prompt/question text
imageFiles - the list of image files to use for the question
options - the Options object; see the Ollama documentation for more details on the available options
streamHandler - optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false.
Returns:
an OllamaResult that includes the response text and the time taken for the response
- Throws:
OllamaBaseException
IOException
InterruptedException
-
generateWithImageFiles
public OllamaResult generateWithImageFiles(String model, String prompt, List<File> imageFiles, Options options) throws OllamaBaseException, IOException, InterruptedException
Convenience method to call the Ollama API without streaming responses. Uses generateWithImageFiles(String, String, List, Options, OllamaStreamHandler).
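An image-question sketch (the "llava" model name and the file path are placeholders; OptionsBuilder, OllamaResult.getResponse(), and the package paths are assumptions as above):

```java
import java.io.File;
import java.util.List;
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;
import io.github.amithkoujalgi.ollama4j.core.models.OllamaResult;  // package path assumed
import io.github.amithkoujalgi.ollama4j.core.utils.OptionsBuilder; // package path assumed

public class ImageExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        OllamaResult result = ollamaAPI.generateWithImageFiles(
                "llava",                                  // a multimodal model
                "What's in this picture?",
                List.of(new File("/path/to/image.jpg")),  // placeholder path
                new OptionsBuilder().build());
        System.out.println(result.getResponse()); // accessor name assumed
    }
}
```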
-
generateWithImageURLs
public OllamaResult generateWithImageURLs(String model, String prompt, List<String> imageURLs, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
With one or more image URLs, ask a question to a model running on the Ollama server. This is a sync/blocking call.
Parameters:
model - the Ollama model to ask the question to
prompt - the prompt/question text
imageURLs - the list of image URLs to use for the question
options - the Options object; see the Ollama documentation for more details on the available options
streamHandler - optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false.
Returns:
an OllamaResult that includes the response text and the time taken for the response
- Throws:
OllamaBaseException
IOException
InterruptedException
URISyntaxException
-
generateWithImageURLs
public OllamaResult generateWithImageURLs(String model, String prompt, List<String> imageURLs, Options options) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
Convenience method to call the Ollama API without streaming responses. Uses generateWithImageURLs(String, String, List, Options, OllamaStreamHandler).
-
chat
public OllamaChatResult chat(String model, List<OllamaChatMessage> messages) throws OllamaBaseException, IOException, InterruptedException
Ask a question to a model based on a given message stack (i.e. a chat history). Creates a synchronous call to the 'api/chat' endpoint.
Parameters:
model - the Ollama model to ask the question to
messages - chat history / message stack to send to the model
Returns:
an OllamaChatResult containing the API response and the message history, including the newly acquired assistant response
Throws:
OllamaBaseException - if any response code other than 200 is returned
IOException - if the response stream cannot be read
InterruptedException - if the server is not reachable or network issues happen
-
chat
public OllamaChatResult chat(OllamaChatRequestModel request) throws OllamaBaseException, IOException, InterruptedException
Ask a question to a model using an OllamaChatRequestModel. This can be constructed using an OllamaChatRequestBuilder.
Hint: the OllamaChatRequestModel#getStream() property is not implemented.
Parameters:
request - request object to be sent to the server
Returns:
an OllamaChatResult containing the API response and the message history
Throws:
OllamaBaseException - if any response code other than 200 is returned
IOException - if the response stream cannot be read
InterruptedException - if the server is not reachable or network issues happen
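A chat sketch using the builder mentioned above; the getInstance(...), withMessage(...), and OllamaChatMessageRole names, the chat package path, and the getResponse() accessor are all assumptions about the builder and result APIs:

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;
import io.github.amithkoujalgi.ollama4j.core.models.chat.*; // package path assumed

public class ChatExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        OllamaChatRequestModel request = OllamaChatRequestBuilder
                .getInstance("llama2")                                   // builder API assumed
                .withMessage(OllamaChatMessageRole.USER, "Why is the sky blue?")
                .build();
        OllamaChatResult chatResult = ollamaAPI.chat(request);
        System.out.println(chatResult.getResponse()); // accessor name assumed
    }
}
```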
-
chat
public OllamaChatResult chat(OllamaChatRequestModel request, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
Ask a question to a model using an OllamaChatRequestModel. This can be constructed using an OllamaChatRequestBuilder.
Hint: the OllamaChatRequestModel#getStream() property is not implemented.
Parameters:
request - request object to be sent to the server
streamHandler - callback handler to handle the last message from the stream (caution: all previous messages from the stream will be concatenated)
Returns:
an OllamaChatResult containing the API response and the message history
Throws:
OllamaBaseException - if any response code other than 200 is returned
IOException - if the response stream cannot be read
InterruptedException - if the server is not reachable or network issues happen
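A streaming-chat sketch, assuming OllamaStreamHandler is a functional interface over the streamed text so a lambda can be passed (note the caution above: each callback receives the concatenation of all stream messages so far); the builder names and package path are assumptions as before:

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;
import io.github.amithkoujalgi.ollama4j.core.models.chat.*; // package path assumed

public class ChatStreamExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        OllamaChatRequestModel request = OllamaChatRequestBuilder
                .getInstance("llama2")                                   // builder API assumed
                .withMessage(OllamaChatMessageRole.USER, "Tell me a short story.")
                .build();
        // Print the concatenated response each time a stream message arrives.
        OllamaChatResult chatResult = ollamaAPI.chat(request,
                (message) -> System.out.println(message)); // lambda use assumed
    }
}
```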
-