Refactor and reorganize API docs structure

Moved and renamed several API documentation files for better organization, updated sidebar positions, and merged image generation docs. Added logging documentation and updated Makefile commands for docs build and serve. Improved clarity and consistency in API doc titles and structure.
Amith Koujalgi 2025-09-17 10:34:50 +05:30
parent fc1f842f6b
commit 329381b1ee
16 changed files with 77 additions and 52 deletions

View File

@@ -30,10 +30,10 @@ list-releases:
 	--silent | jq -r '.components[].version'
 docs-build:
-	npm i --prefix docs && npm run build --prefix docs
+	cd ./docs && npm install --prefix && npm run build
 docs-serve:
-	npm i --prefix docs && npm run start --prefix docs
+	cd ./docs && npm install && npm run start
 start-cpu:
 	docker run -it -v ~/ollama:/root/.ollama -p 11434:11434 ollama/ollama

View File

@@ -1,8 +1,8 @@
 ---
-sidebar_position: 2
+sidebar_position: 3
 ---
-# Set Basic Authentication
+# Basic Auth
 This API lets you set the basic authentication for the Ollama client. This would help in scenarios where
 Ollama server would be setup behind a gateway/reverse proxy with basic auth.
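The renamed Basic Auth page describes setting credentials on the client. A minimal usage sketch, assuming the ollama4j `OllamaAPI` client exposes `setBasicAuth(username, password)` and that the host URL and credentials below are placeholders:

```java
import io.github.ollama4j.OllamaAPI;

public class BasicAuthExample {
    public static void main(String[] args) {
        // Client pointed at a gateway/reverse proxy that fronts the Ollama server
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        // Assumed API: credentials are attached as an HTTP Basic Authorization header on each request
        ollamaAPI.setBasicAuth("username", "password");
    }
}
```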

View File

@@ -1,8 +1,8 @@
 ---
-sidebar_position: 2
+sidebar_position: 4
 ---
-# Set Bearer Authentication
+# Bearer Auth
 This API lets you set the bearer authentication for the Ollama client. This would help in scenarios where
 Ollama server would be setup behind a gateway/reverse proxy with bearer auth.
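The Bearer Auth page is the token-based counterpart. A minimal sketch, assuming the client exposes `setBearerAuth(token)` and using a placeholder token:

```java
import io.github.ollama4j.OllamaAPI;

public class BearerAuthExample {
    public static void main(String[] args) {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        // Assumed API: the token is sent as an "Authorization: Bearer <token>" header on each request
        ollamaAPI.setBearerAuth("your-api-token");
    }
}
```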

View File

@@ -0,0 +1,26 @@
+---
+sidebar_position: 7
+---
+# Logging
+### Using with SLF4J and Logback
+Add a `logback.xml` file to your `src/main/resources` folder with the following content:
+```xml
+<configuration>
+  <root level="DEBUG">
+    <appender-ref ref="STDOUT"/>
+  </root>
+  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
+    <encoder>
+      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n</pattern>
+    </encoder>
+  </appender>
+</configuration>
+```
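The new Logging page only shows the Logback configuration; to see it take effect, a plain SLF4J snippet (standard SLF4J API, nothing ollama4j-specific) is enough:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingCheck {
    private static final Logger logger = LoggerFactory.getLogger(LoggingCheck.class);

    public static void main(String[] args) {
        // With the logback.xml above on the classpath, DEBUG and above are printed to the console
        logger.debug("ollama4j logging is wired up");
    }
}
```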

View File

@@ -1,5 +1,5 @@
 ---
-sidebar_position: 3
+sidebar_position: 5
 ---
 # Ping

View File

@@ -1,5 +1,5 @@
 ---
-sidebar_position: 10
+sidebar_position: 2
 ---
 # Prompt Builder
@@ -51,6 +51,7 @@ public class Main {
 You will get a response similar to:
+:::tip[LLM Response]
 ```go
 package main
@@ -71,4 +72,5 @@ func readFile(fileName string) {
 fmt.Println(f.String())
 }
 }
 ```
+:::

View File

@@ -1,5 +1,5 @@
 ---
-sidebar_position: 4
+sidebar_position: 5
 ---
 # PS

View File

@@ -2,7 +2,9 @@
 sidebar_position: 2
 ---
-# Set Request Timeout
+# Timeouts
+## Set Request Timeout
 This API lets you set the request timeout for the Ollama client.
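A minimal sketch of the request-timeout setting described on this page, assuming the client method is `setRequestTimeoutSeconds(seconds)`:

```java
import io.github.ollama4j.OllamaAPI;

public class TimeoutExample {
    public static void main(String[] args) {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        // Assumed API: requests taking longer than 10 seconds will fail with a timeout
        ollamaAPI.setRequestTimeoutSeconds(10);
    }
}
```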

View File

@@ -1,5 +1,5 @@
 ---
-sidebar_position: 2
+sidebar_position: 6
 ---
 import CodeEmbed from '@site/src/components/CodeEmbed';

View File

@@ -1,5 +1,5 @@
 ---
-sidebar_position: 5
+sidebar_position: 1
 ---
 import CodeEmbed from '@site/src/components/CodeEmbed';

View File

@@ -1,5 +1,5 @@
 ---
-sidebar_position: 2
+sidebar_position: 3
 ---
 import CodeEmbed from '@site/src/components/CodeEmbed';

View File

@@ -1,33 +0,0 @@
----
-sidebar_position: 4
----
-import CodeEmbed from '@site/src/components/CodeEmbed';
-# Generate with Image URLs
-This API lets you ask questions along with the image files to the LLMs.
-This API corresponds to
-the [completion](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) API.
-:::note
-Executing this on Ollama server running in CPU-mode will take longer to generate response. Hence, GPU-mode is
-recommended.
-:::
-## Ask (Sync)
-Passing the link of this image the following code:
-![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)
-<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageURL.java" />
-You will get a response similar to:
-::::tip[LLM Response]
-This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
-be enjoying its time outdoors, perhaps on a lake.
-::::

View File

@@ -1,5 +1,5 @@
 ---
-sidebar_position: 3
+sidebar_position: 4
 ---
 import CodeEmbed from '@site/src/components/CodeEmbed';
@@ -27,6 +27,34 @@ If you have this image downloaded and you pass the path to the downloaded image
 You will get a response similar to:
+::::tip[LLM Response]
+This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
+be enjoying its time outdoors, perhaps on a lake.
+::::
+# Generate with Image URLs
+This API lets you ask questions along with the image files to the LLMs.
+This API corresponds to
+the [completion](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) API.
+:::note
+Executing this on Ollama server running in CPU-mode will take longer to generate response. Hence, GPU-mode is
+recommended.
+:::
+## Ask (Sync)
+Passing the link of this image the following code:
+![Img](https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg)
+<CodeEmbed src="https://raw.githubusercontent.com/ollama4j/ollama4j-examples/refs/heads/main/src/main/java/io/github/ollama4j/examples/GenerateWithImageURL.java" />
+You will get a response similar to:
 ::::tip[LLM Response]
 This image features a white boat with brown cushions, where a dog is sitting on the back of the boat. The dog seems to
 be enjoying its time outdoors, perhaps on a lake.
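The merged section points to the embedded GenerateWithImageURL.java example for the full code; the sketch below only illustrates the call shape and assumes a method along the lines of `generateWithImageURLs(model, prompt, imageUrls, options)` plus the import paths shown, so treat the linked example as authoritative:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OptionsBuilder;

import java.util.List;

public class GenerateWithImageUrlSketch {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        // Assumed call shape: a vision-capable model, a prompt, and one or more image URLs
        OllamaResult result = ollamaAPI.generateWithImageURLs(
                "llava",
                "What is in this image?",
                List.of("https://t3.ftcdn.net/jpg/02/96/63/80/360_F_296638053_0gUVA4WVBKceGsIr7LNqRWSnkusi07dq.jpg"),
                new OptionsBuilder().build());
        System.out.println(result.getResponse());
    }
}
```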

View File

@@ -1,5 +1,5 @@
 ---
-sidebar_position: 6
+sidebar_position: 5
 ---
 import CodeEmbed from '@site/src/components/CodeEmbed';

View File

@@ -1,11 +1,11 @@
 ---
-sidebar_position: 1
+sidebar_position: 2
 ---
 import CodeEmbed from '@site/src/components/CodeEmbed';
 import TypewriterTextarea from '@site/src/components/TypewriterTextarea';
-# Generate (Sync)
+# Generate
 This API lets you ask questions to the LLMs in a synchronous way.
 This API corresponds to
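The retitled Generate page still documents the synchronous completion call. A minimal sketch, assuming the `generate(model, prompt, raw, options)` overload and the import paths below (model name is a placeholder):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OptionsBuilder;

public class GenerateExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        // Assumed call shape: blocks until the model finishes generating
        OllamaResult result = ollamaAPI.generate(
                "llama3",
                "Why is the sky blue?",
                false, // raw = false: let Ollama apply the model's prompt template
                new OptionsBuilder().build());
        System.out.println(result.getResponse());
    }
}
```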

View File

@ -1,5 +1,5 @@
{ {
"label": "APIs - Model Management", "label": "APIs - Manage Models",
"position": 2, "position": 2,
"link": { "link": {
"type": "generated-index", "type": "generated-index",