
---
sidebar_position: 5
---

import CodeEmbed from '@site/src/components/CodeEmbed';

# Generate Embeddings

Generate embeddings from a model.

### Using `embed()`
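A minimal sketch of calling `embed()` with a model name and a list of input texts. The class and method names (`OllamaAPI`, `embed`, `OllamaEmbedResponseModel`) follow the API shape described in the ollama4j README, and the host URL and model name (`all-minilm`) are placeholders; verify both against the version of ollama4j you are using:

```java
import java.util.Arrays;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.embeddings.OllamaEmbedResponseModel;

public class GenerateEmbeddings {

    public static void main(String[] args) throws Exception {
        // Point the client at a locally running Ollama server (default port).
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

        // Generate embeddings for two input texts using an embedding model.
        OllamaEmbedResponseModel response = ollamaAPI.embed(
                "all-minilm",
                Arrays.asList("Why is the sky blue?", "Why is the grass green?"));

        // One embedding vector (a list of doubles) per input text.
        System.out.println(response.getEmbeddings());
    }
}
```

Each input string produces one vector, so the printed result is a list of lists of doubles, similar to the sample response below.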

::::tip[LLM Response]

```json
[
  [
    0.010000081,
    -0.0017487297,
    0.050126992,
    0.04694895,
    0.055186987,
    0.008570699,
    0.10545243,
    -0.02591801,
    0.1296789,
  ],
  [
    -0.009868476,
    0.060335685,
    0.025288988,
    -0.0062160683,
    0.07281043,
    0.017217565,
    0.090314455,
    -0.051715206,
  ]
]
```

::::

You can also use `OllamaEmbedRequestModel` to specify options such as seed, temperature, etc., to apply when generating embeddings.
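A sketch of the request-object variant, assuming `OllamaEmbedRequestModel` accepts a model name and input list and exposes a setter for a model-options map built with `OptionsBuilder` (as in other ollama4j request types); the option values here are arbitrary illustrations, and the exact builder method names may differ across ollama4j versions:

```java
import java.util.Arrays;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.embeddings.OllamaEmbedRequestModel;
import io.github.ollama4j.models.embeddings.OllamaEmbedResponseModel;
import io.github.ollama4j.utils.OptionsBuilder;

public class GenerateEmbeddingsWithOptions {

    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

        // Build the request explicitly so model options can be attached.
        OllamaEmbedRequestModel request = new OllamaEmbedRequestModel(
                "all-minilm",
                Arrays.asList("Why is the sky blue?", "Why is the grass green?"));

        // Illustrative option values; any option the model supports
        // (seed, temperature, ...) can be set the same way.
        request.setOptions(new OptionsBuilder()
                .setSeed(42)
                .setTemperature(0.5f)
                .build()
                .getOptionsMap());

        OllamaEmbedResponseModel response = ollamaAPI.embed(request);
        System.out.println(response.getEmbeddings());
    }
}
```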

You will get a response similar to:

::::tip[LLM Response]

```json
[
  [
    0.010000081,
    -0.0017487297,
    0.050126992,
    0.04694895,
    0.055186987,
    0.008570699,
    0.10545243,
    -0.02591801,
    0.1296789,
  ],
  [
    -0.009868476,
    0.060335685,
    0.025288988,
    -0.0062160683,
    0.07281043,
    0.017217565,
    0.090314455,
    -0.051715206,
  ]
]
```

::::