updated docs
@@ -8,7 +8,7 @@ This API lets you ask questions along with the image files to the LLMs.
 These APIs correlate to
 the [completion](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) APIs.
 
-:::caution
+:::note
 
 Executing this on an Ollama server running in CPU mode will take longer to generate a response. Hence, GPU mode is
 recommended.
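As a hedged illustration of the API this page documents, the sketch below asks a question about an image file from Java. The package name, the `askWithImageFiles` method, and the `llava` model are assumptions not taken from this commit; check the ollama4j API docs for the exact signatures.

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

import java.io.File;
import java.util.List;

public class Main {
    public static void main(String[] args) throws Exception {
        // Assumed local Ollama server endpoint (default port).
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        // Image prompts can be slow, especially when the server runs in CPU mode.
        ollamaAPI.setRequestTimeoutSeconds(120);

        // Ask a question along with an image file; "llava" is an illustrative
        // multimodal model, and askWithImageFiles is an assumed method name.
        var result = ollamaAPI.askWithImageFiles(
                "llava",
                "What is in this picture?",
                List.of(new File("/path/to/image.jpg")));

        System.out.println(result.getResponse());
    }
}
```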
@@ -30,17 +30,17 @@ public class Main {
 
 You will get a response similar to:
 
-```json
+```javascript
 [
-  0.5670403838157654,
-  0.009260174818336964,
-  0.23178744316101074,
-  -0.2916173040866852,
-  -0.8924556970596313,
-  0.8785552978515625,
-  -0.34576427936553955,
-  0.5742510557174683,
-  -0.04222835972905159,
-  -0.137906014919281
+    0.5670403838157654,
+    0.009260174818336964,
+    0.23178744316101074,
+    -0.2916173040866852,
+    -0.8924556970596313,
+    0.8785552978515625,
+    -0.34576427936553955,
+    0.5742510557174683,
+    -0.04222835972905159,
+    -0.137906014919281
 ]
 ```
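For context, here is a hedged sketch of how an embedding like the one above might be requested from Java. The package name, the `generateEmbeddings` method, and the `llama2` model are assumptions; verify them against the ollama4j version you use.

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

import java.util.List;

public class Main {
    public static void main(String[] args) throws Exception {
        // Assumed local Ollama server endpoint (default port).
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");

        // generateEmbeddings is an assumed method name; "llama2" is an illustrative model.
        // Each number in the returned list is one dimension of the embedding vector,
        // matching the array shown in the response above.
        List<Double> embedding =
                ollamaAPI.generateEmbeddings("llama2", "Here is an article about llamas...");

        embedding.forEach(System.out::println);
    }
}
```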
@@ -2,10 +2,38 @@
 sidebar_position: 1
 ---
 
-# Intro
+# Introduction
 
 Let's get started with **Ollama4j**.
 
+## 🦙 What is Ollama?
+
+[Ollama](https://ollama.ai/) is an advanced AI tool that allows users to easily set up and run large language models
+locally (in CPU and GPU
+modes). With Ollama, users can leverage powerful language models such as Llama 2 and even customize and create their own
+models.
+
+## 👨‍💻 Why Ollama4j?
+
+Ollama4j was built for the simple purpose of integrating Ollama with Java applications.
+
+```mermaid
+  flowchart LR
+    o4j[Ollama4j]
+    o[Ollama Server]
+    o4j -->|Communicates with| o;
+    m[Models]
+    p[Your Java Project]
+    subgraph Your Java Environment
+        direction TB
+        p -->|Uses| o4j
+    end
+    subgraph Ollama Setup
+        direction TB
+        o -->|Manages| m
+    end
+```
+
 ## Getting Started
 
 ### What you'll need
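To make the diagram above concrete, here is a minimal getting-started sketch, assuming the default Ollama port and an availability-check method on `OllamaAPI` (both are assumptions rather than part of this commit):

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class Main {
    public static void main(String[] args) throws Exception {
        // Assumed default Ollama endpoint; change it if your server runs elsewhere.
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");

        // ping() is an assumed availability check: your Java project uses Ollama4j,
        // and Ollama4j communicates with the Ollama server that manages the models.
        boolean reachable = ollamaAPI.ping();
        System.out.println("Ollama server reachable: " + reachable);
    }
}
```

The flow mirrors the flowchart: the Java project calls Ollama4j, and Ollama4j communicates with the Ollama server, which manages the models.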