Class OllamaApiClient

Namespace
OllamaSharp
Assembly
OllamaSharp.dll

The default client for convenient access to the Ollama API. See https://github.com/jmorganca/ollama/blob/main/docs/api.md for the underlying API documentation.

public class OllamaApiClient : IOllamaApiClient, IChatClient, IEmbeddingGenerator<string, Embedding<float>>, IEmbeddingGenerator, IDisposable
Inheritance
OllamaApiClient
Implements
IOllamaApiClient
IChatClient
IEmbeddingGenerator<string, Embedding<float>>
IEmbeddingGenerator
IDisposable
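
As a quick start, the sketch below constructs a client against a local Ollama instance and checks that it is reachable. The endpoint http://localhost:11434 and the model name "llama3" are placeholders, not part of this reference.

using System;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434", "llama3");

// Check whether the Ollama server is reachable before sending requests.
if (await client.IsRunningAsync())
    Console.WriteLine($"Connected to Ollama {await client.GetVersionAsync()}");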

Constructors

OllamaApiClient(Configuration)

Creates a new instance of the Ollama API client.

public OllamaApiClient(OllamaApiClient.Configuration config)

Parameters

config OllamaApiClient.Configuration

The configuration for the Ollama API client.

OllamaApiClient(HttpClient, string, JsonSerializerContext?)

Creates a new instance of the Ollama API client.

public OllamaApiClient(HttpClient client, string defaultModel = "", JsonSerializerContext? jsonSerializerContext = null)

Parameters

client HttpClient

The HTTP client to access the Ollama API with.

defaultModel string

The default model that should be used with Ollama.

jsonSerializerContext JsonSerializerContext

The JSON serializer context for source generation (optional, for NativeAOT scenarios).

Exceptions

ArgumentNullException
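
A sketch of reusing a pre-configured HttpClient, for example one produced by IHttpClientFactory; the base address and model name are placeholders:

using System;
using System.Net.Http;
using OllamaSharp;

var httpClient = new HttpClient
{
    BaseAddress = new Uri("http://localhost:11434"),
    Timeout = TimeSpan.FromMinutes(5) // generous timeout for long-running generations
};

var client = new OllamaApiClient(httpClient, defaultModel: "llama3");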

OllamaApiClient(string, string)

Creates a new instance of the Ollama API client.

public OllamaApiClient(string uriString, string defaultModel = "")

Parameters

uriString string

The URI of the Ollama API endpoint.

defaultModel string

The default model that should be used with Ollama.
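
The simplest construction path is from an endpoint string; the address and model names are placeholders:

using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434", "llama3");

// The default model can be changed later through the SelectedModel property.
client.SelectedModel = "mistral";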

OllamaApiClient(Uri, string)

Creates a new instance of the Ollama API client.

public OllamaApiClient(Uri uri, string defaultModel = "")

Parameters

uri Uri

The URI of the Ollama API endpoint.

defaultModel string

The default model that should be used with Ollama.

Properties

Config

Gets the current configuration of the API client.

public OllamaApiClient.Configuration Config { get; }

Property Value

OllamaApiClient.Configuration

DefaultRequestHeaders

Gets the default request headers that are sent to the Ollama API.

public Dictionary<string, string> DefaultRequestHeaders { get; }

Property Value

Dictionary<string, string>
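
A sketch of adding a header that is sent with every request, for example for a reverse proxy expecting a bearer token; the token value is a placeholder:

using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");
client.DefaultRequestHeaders["Authorization"] = "Bearer <your-token>";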

IncomingJsonSerializerOptions

Gets the serializer options used for deserializing HTTP responses.

public JsonSerializerOptions IncomingJsonSerializerOptions { get; }

Property Value

JsonSerializerOptions

OutgoingJsonSerializerOptions

Gets the serializer options for outgoing web requests like Post or Delete.

public JsonSerializerOptions OutgoingJsonSerializerOptions { get; }

Property Value

JsonSerializerOptions

SelectedModel

Gets or sets the name of the model to run requests on.

public string SelectedModel { get; set; }

Property Value

string

Uri

Gets the endpoint URI used by the API client.

public Uri Uri { get; }

Property Value

Uri

Methods

ChatAsync(ChatRequest, CancellationToken)

Sends a request to the /api/chat endpoint and streams the response of the chat.

public IAsyncEnumerable<ChatResponseStream?> ChatAsync(ChatRequest request, CancellationToken cancellationToken = default)

Parameters

request ChatRequest

The request to send to Ollama.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<ChatResponseStream>

An asynchronous enumerable that yields ChatResponseStream items, each representing a message in the chat response stream; items may be null once the stream has completed.

Remarks

This is the method that calls the Ollama endpoint /api/chat. You usually do not want to use it manually; to implement a fully interactive chat, use the Chat class ("new Chat(...)") instead.
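
A streaming sketch; the OllamaSharp.Models.Chat namespace and the Model, Messages, Role, Content and Message property names are assumptions based on typical OllamaSharp usage, not taken from this reference:

using System;
using OllamaSharp;
using OllamaSharp.Models.Chat;

var client = new OllamaApiClient("http://localhost:11434");

var request = new ChatRequest
{
    Model = "llama3", // assumed property name
    Messages = new[]
    {
        new Message { Role = ChatRole.User, Content = "Why is the sky blue?" } // assumed shape
    }
};

// Each streamed item carries a fragment of the assistant's answer.
await foreach (var response in client.ChatAsync(request))
    Console.Write(response?.Message?.Content);

For an interactive conversation that keeps track of previous messages, the Chat class mentioned in the remarks is the more convenient option.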

CopyModelAsync(CopyModelRequest, CancellationToken)

Sends a request to the /api/copy endpoint to copy a model.

public Task CopyModelAsync(CopyModelRequest request, CancellationToken cancellationToken = default)

Parameters

request CopyModelRequest

The parameters required to copy a model.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task

CreateModelAsync(CreateModelRequest, CancellationToken)

Sends a request to the /api/create endpoint to create a model.

public IAsyncEnumerable<CreateModelResponse?> CreateModelAsync(CreateModelRequest request, CancellationToken cancellationToken = default)

Parameters

request CreateModelRequest

The request object containing the model details.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<CreateModelResponse>

An asynchronous enumerable of the model creation status.

DeleteModelAsync(DeleteModelRequest, CancellationToken)

Sends a request to the /api/delete endpoint to delete a model.

public Task DeleteModelAsync(DeleteModelRequest request, CancellationToken cancellationToken = default)

Parameters

request DeleteModelRequest

The request containing the model to delete.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task

Dispose()

Releases the resources used by the OllamaApiClient instance and disposes the HTTP client if it was created internally.

public void Dispose()

EmbedAsync(EmbedRequest, CancellationToken)

Sends a request to the /api/embed endpoint to generate embeddings.

public Task<EmbedResponse> EmbedAsync(EmbedRequest request, CancellationToken cancellationToken = default)

Parameters

request EmbedRequest

The parameters to generate embeddings for.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<EmbedResponse>

A task that represents the asynchronous operation. The task result contains the EmbedResponse.
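
A sketch of requesting embeddings; the OllamaSharp.Models namespace and the Model, Input and Embeddings property names are assumptions:

using System;
using System.Collections.Generic;
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient("http://localhost:11434");

var response = await client.EmbedAsync(new EmbedRequest
{
    Model = "all-minilm",                        // assumed property name
    Input = new List<string> { "Hello, world!" } // assumed property name
});

Console.WriteLine($"Embeddings returned: {response.Embeddings.Count}"); // assumed property name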

GenerateAsync(GenerateRequest, CancellationToken)

Streams completion responses from the /api/generate endpoint on the Ollama API based on the provided request.

public IAsyncEnumerable<GenerateResponseStream?> GenerateAsync(GenerateRequest request, CancellationToken cancellationToken = default)

Parameters

request GenerateRequest

The request containing the parameters for the completion.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<GenerateResponseStream>

An asynchronous enumerable of GenerateResponseStream.
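
A streaming completion sketch; the Model, Prompt and Response property names are assumptions:

using System;
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient("http://localhost:11434");

var request = new GenerateRequest
{
    Model = "llama3",                      // assumed property name
    Prompt = "Write a haiku about winter." // assumed property name
};

await foreach (var response in client.GenerateAsync(request))
    Console.Write(response?.Response);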

GetVersionAsync(CancellationToken)

Gets the version of Ollama.

public Task<string> GetVersionAsync(CancellationToken cancellationToken = default)

Parameters

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<string>

A task that represents the asynchronous operation. The task result contains the Version.

IsBlobExistsAsync(string, CancellationToken)

Checks whether the file blob (Binary Large Object) used to create a model exists on the server. This checks your Ollama server and not ollama.com.

public Task<bool> IsBlobExistsAsync(string digest, CancellationToken cancellationToken = default)

Parameters

digest string

The expected SHA256 digest of the file.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<bool>

A task that represents the asynchronous operation. The task result indicates whether the blob exists on the server.

IsRunningAsync(CancellationToken)

Sends a query to check whether the Ollama API is running.

public Task<bool> IsRunningAsync(CancellationToken cancellationToken = default)

Parameters

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<bool>

A task that represents the asynchronous operation. The task result contains a boolean indicating whether the API is running.
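
A sketch of guarding calls behind a reachability check; the endpoint is a placeholder:

using System;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

if (!await client.IsRunningAsync())
{
    Console.Error.WriteLine("Ollama is not reachable at the configured endpoint.");
    return;
}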

ListLocalModelsAsync(CancellationToken)

Sends a request to the /api/tags endpoint to get all models that are available locally.

public Task<IEnumerable<Model>> ListLocalModelsAsync(CancellationToken cancellationToken = default)

Parameters

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<IEnumerable<Model>>

A task that represents the asynchronous operation. The task result contains a collection of Model.
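
A sketch of listing the locally available models; the Name property on Model is an assumption:

using System;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

foreach (var model in await client.ListLocalModelsAsync())
    Console.WriteLine(model.Name); // assumed property name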

ListRunningModelsAsync(CancellationToken)

Sends a request to the /api/ps endpoint to get the running models.

public Task<IEnumerable<RunningModel>> ListRunningModelsAsync(CancellationToken cancellationToken = default)

Parameters

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<IEnumerable<RunningModel>>

A task that represents the asynchronous operation. The task result contains a collection of RunningModel.

PullModelAsync(PullModelRequest, CancellationToken)

Sends a request to the /api/pull endpoint to pull a new model.

public IAsyncEnumerable<PullModelResponse?> PullModelAsync(PullModelRequest request, CancellationToken cancellationToken = default)

Parameters

request PullModelRequest

The request specifying the model name and whether to use an insecure connection.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<PullModelResponse>

An asynchronous enumerable of PullModelResponse objects representing the status of the model pull operation.
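
A sketch of pulling a model and reporting progress; the Model property on PullModelRequest and the Status and Percent properties on PullModelResponse are assumptions:

using System;
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient("http://localhost:11434");

await foreach (var status in client.PullModelAsync(new PullModelRequest { Model = "llama3" }))
    Console.WriteLine($"{status?.Status} ({status?.Percent:0.0} %)"); // assumed property names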

PushBlobAsync(string, byte[], CancellationToken)

Pushes a file to the Ollama server to create a "blob" (Binary Large Object).

public Task PushBlobAsync(string digest, byte[] bytes, CancellationToken cancellationToken = default)

Parameters

digest string

The expected SHA256 digest of the file.

bytes byte[]

The raw bytes of the file.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task
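
A sketch that hashes a local file, uploads it as a blob and verifies it with IsBlobExistsAsync afterwards; the file path is a placeholder and the "sha256:<hex>" digest format is an assumption based on the Ollama API documentation:

using System;
using System.IO;
using System.Security.Cryptography;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

var bytes = await File.ReadAllBytesAsync("model.gguf"); // placeholder path
var digest = "sha256:" + Convert.ToHexString(SHA256.HashData(bytes)).ToLowerInvariant(); // assumed digest format

await client.PushBlobAsync(digest, bytes);
Console.WriteLine(await client.IsBlobExistsAsync(digest));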

PushModelAsync(PushModelRequest, CancellationToken)

Pushes a model to the Ollama API endpoint.

public IAsyncEnumerable<PushModelResponse?> PushModelAsync(PushModelRequest request, CancellationToken cancellationToken = default)

Parameters

request PushModelRequest

The request containing the model information to push.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<PushModelResponse>

An asynchronous enumerable of push status updates; enumerate it to follow the progress of the push operation.

SendToOllamaAsync(HttpRequestMessage, OllamaRequest?, HttpCompletionOption, CancellationToken)

Sends an HTTP request message to the Ollama API.

protected virtual Task<HttpResponseMessage> SendToOllamaAsync(HttpRequestMessage requestMessage, OllamaRequest? ollamaRequest, HttpCompletionOption completionOption, CancellationToken cancellationToken)

Parameters

requestMessage HttpRequestMessage

The HTTP request message to send.

ollamaRequest OllamaRequest

The request containing custom HTTP request headers.

completionOption HttpCompletionOption

When the operation should complete (as soon as a response is available or after reading the whole response content).

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<HttpResponseMessage>

ShowModelAsync(ShowModelRequest, CancellationToken)

Sends a request to the /api/show endpoint to show the information of a model.

public Task<ShowModelResponse> ShowModelAsync(ShowModelRequest request, CancellationToken cancellationToken = default)

Parameters

request ShowModelRequest

The request containing the name of the model to get the information for.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<ShowModelResponse>

A task that represents the asynchronous operation. The task result contains the ShowModelResponse.