
Interface IOllamaApiClient

Namespace
OllamaSharp
Assembly
OllamaSharp.dll

Interface for the Ollama API client.

public interface IOllamaApiClient

Properties

SelectedModel

Gets or sets the name of the model to run requests on.

string SelectedModel { get; set; }

Property Value

string

Uri

Gets the endpoint URI used by the API client.

Uri Uri { get; }

Property Value

Uri

Methods

ChatAsync(ChatRequest, CancellationToken)

Sends a request to the /api/chat endpoint and streams the response of the chat.

IAsyncEnumerable<ChatResponseStream?> ChatAsync(ChatRequest request, CancellationToken cancellationToken = default)

Parameters

request ChatRequest

The request to send to Ollama.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<ChatResponseStream>

An asynchronous enumerable that yields ChatResponseStream items. Each item represents a message in the chat response stream; items may be null, and enumeration completes when the stream ends.

Remarks

This method calls the Ollama endpoint /api/chat directly. You will rarely want to do this manually; to implement a fully interactive chat, use the Chat class ("new Chat(...)") instead.
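For illustration, a minimal sketch of calling ChatAsync directly. It assumes the concrete OllamaApiClient implementation, a local Ollama server on the default port, and request property names (Model, Messages) as found in typical OllamaSharp versions; the model name is illustrative.

```csharp
using OllamaSharp;
using OllamaSharp.Models.Chat;

// Assumes a local Ollama server on the default port.
var client = new OllamaApiClient(new Uri("http://localhost:11434"));

var request = new ChatRequest
{
    Model = "llama3",
    Messages = new List<Message> { new(ChatRole.User, "Why is the sky blue?") }
};

// Stream the answer chunk by chunk as it arrives.
await foreach (var stream in client.ChatAsync(request))
    Console.Write(stream?.Message?.Content);
```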

CopyModelAsync(CopyModelRequest, CancellationToken)

Sends a request to the /api/copy endpoint to copy a model.

Task CopyModelAsync(CopyModelRequest request, CancellationToken cancellationToken = default)

Parameters

request CopyModelRequest

The parameters required to copy a model.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task

CreateModelAsync(CreateModelRequest, CancellationToken)

Sends a request to the /api/create endpoint to create a model.

IAsyncEnumerable<CreateModelResponse?> CreateModelAsync(CreateModelRequest request, CancellationToken cancellationToken = default)

Parameters

request CreateModelRequest

The request object containing the model details.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<CreateModelResponse>

An asynchronous enumerable of the model creation status.
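A hedged sketch of creating a derived model and printing the creation status as it streams in. The request property names (Model, From, System) are assumptions about CreateModelRequest and may differ between versions.

```csharp
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Derive a new model from an existing one; property names are assumptions.
var request = new CreateModelRequest
{
    Model = "mario",
    From = "llama3",
    System = "You are Mario from Super Mario Bros."
};

// Each streamed item reports the current creation step.
await foreach (var status in client.CreateModelAsync(request))
    Console.WriteLine(status?.Status);
```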

DeleteModelAsync(DeleteModelRequest, CancellationToken)

Sends a request to the /api/delete endpoint to delete a model.

Task DeleteModelAsync(DeleteModelRequest request, CancellationToken cancellationToken = default)

Parameters

request DeleteModelRequest

The request containing the model to delete.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task

EmbedAsync(EmbedRequest, CancellationToken)

Sends a request to the /api/embed endpoint to generate embeddings.

Task<EmbedResponse> EmbedAsync(EmbedRequest request, CancellationToken cancellationToken = default)

Parameters

request EmbedRequest

The parameters to generate embeddings for.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<EmbedResponse>

A task that represents the asynchronous operation. The task result contains the EmbedResponse with the generated embeddings.
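A minimal sketch of generating embeddings for several strings in one call. The request property names (Model, Input) and the shape of EmbedResponse.Embeddings are assumptions about typical OllamaSharp versions.

```csharp
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Input is assumed to accept a list of strings, one embedding per entry.
var response = await client.EmbedAsync(new EmbedRequest
{
    Model = "all-minilm",
    Input = new List<string> { "The sky is blue", "Grass is green" }
});

// Print the dimensionality of each returned vector.
foreach (var vector in response.Embeddings)
    Console.WriteLine(vector.Length);
```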

GenerateAsync(GenerateRequest, CancellationToken)

Sends a request to the /api/generate endpoint and streams the completion responses.

IAsyncEnumerable<GenerateResponseStream?> GenerateAsync(GenerateRequest request, CancellationToken cancellationToken = default)

Parameters

request GenerateRequest

The request containing the parameters for the completion.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<GenerateResponseStream>

An asynchronous enumerable of GenerateResponseStream.
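A sketch of a raw (non-chat) completion, assuming the concrete OllamaApiClient implementation and GenerateRequest property names (Model, Prompt) as found in typical OllamaSharp versions:

```csharp
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

var request = new GenerateRequest
{
    Model = "llama3",
    Prompt = "Write a haiku about rivers."
};

// Each streamed item carries the next chunk of the completion text.
await foreach (var stream in client.GenerateAsync(request))
    Console.Write(stream?.Response);
```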

GetVersionAsync(CancellationToken)

Gets the version of Ollama.

Task<string> GetVersionAsync(CancellationToken cancellationToken = default)

Parameters

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<string>

A task that represents the asynchronous operation. The task result contains the Version.

IsBlobExistsAsync(string, CancellationToken)

Ensures that a file blob (Binary Large Object) used to create a model exists on the server. This checks your local Ollama server, not ollama.com.

Task<bool> IsBlobExistsAsync(string digest, CancellationToken cancellationToken = default)

Parameters

digest string

The expected SHA256 digest of the file.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<bool>

A task that represents the asynchronous operation. The task result contains a boolean indicating whether the blob exists on the server.

IsRunningAsync(CancellationToken)

Sends a query to check whether the Ollama API is running.

Task<bool> IsRunningAsync(CancellationToken cancellationToken = default)

Parameters

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<bool>

A task that represents the asynchronous operation. The task result contains a boolean indicating whether the API is running.

ListLocalModelsAsync(CancellationToken)

Sends a request to the /api/tags endpoint to get all models that are available locally.

Task<IEnumerable<Model>> ListLocalModelsAsync(CancellationToken cancellationToken = default)

Parameters

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<IEnumerable<Model>>

A task that represents the asynchronous operation. The task result contains a collection of Model.
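A short sketch of listing the locally available models. The Model property names used here (Name, Size) are assumptions about the response type.

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Print every model known to the local Ollama server.
foreach (var model in await client.ListLocalModelsAsync())
    Console.WriteLine($"{model.Name} ({model.Size} bytes)");
```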

ListRunningModelsAsync(CancellationToken)

Sends a request to the /api/ps endpoint to get the running models.

Task<IEnumerable<RunningModel>> ListRunningModelsAsync(CancellationToken cancellationToken = default)

Parameters

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<IEnumerable<RunningModel>>

A task that represents the asynchronous operation. The task result contains a collection of RunningModel.

PullModelAsync(PullModelRequest, CancellationToken)

Sends a request to the /api/pull endpoint to pull a new model.

IAsyncEnumerable<PullModelResponse?> PullModelAsync(PullModelRequest request, CancellationToken cancellationToken = default)

Parameters

request PullModelRequest

The request specifying the model name and whether to use an insecure connection.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<PullModelResponse>

An asynchronous enumerable of PullModelResponse objects representing the status of the model pull operation.
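A hedged sketch of pulling a model while reporting progress. The request property (Model) and the response properties (Status, Percent) are assumptions about PullModelRequest and PullModelResponse; the model name is illustrative.

```csharp
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Each streamed item reports the download status and progress.
await foreach (var status in client.PullModelAsync(new PullModelRequest { Model = "llama3" }))
    Console.WriteLine($"{status?.Status} {status?.Percent:0.0}%");
```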

PushBlobAsync(string, byte[], CancellationToken)

Pushes a file to the Ollama server to create a "blob" (Binary Large Object).

Task PushBlobAsync(string digest, byte[] bytes, CancellationToken cancellationToken = default)

Parameters

digest string

The expected SHA256 digest of the file.

bytes byte[]

The byte data of the file.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task
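A sketch combining IsBlobExistsAsync and PushBlobAsync: compute the file's SHA256 digest, then upload the blob only if the server does not already have it. The file name and the "sha256:<hex>" digest format are assumptions for illustration.

```csharp
using System.Security.Cryptography;
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Compute the SHA256 digest of the file in the assumed "sha256:<hex>" format.
var bytes = await File.ReadAllBytesAsync("model.gguf");
var digest = "sha256:" + Convert.ToHexString(SHA256.HashData(bytes)).ToLowerInvariant();

// Upload only if the server does not already have this blob.
if (!await client.IsBlobExistsAsync(digest))
    await client.PushBlobAsync(digest, bytes);
```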

PushModelAsync(PushModelRequest, CancellationToken)

Sends a request to the /api/push endpoint to push a model.

IAsyncEnumerable<PushModelResponse?> PushModelAsync(PushModelRequest request, CancellationToken cancellationToken = default)

Parameters

request PushModelRequest

The request containing the model information to push.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<PushModelResponse>

An asynchronous enumerable of push status updates.

ShowModelAsync(ShowModelRequest, CancellationToken)

Sends a request to the /api/show endpoint to show the information of a model.

Task<ShowModelResponse> ShowModelAsync(ShowModelRequest request, CancellationToken cancellationToken = default)

Parameters

request ShowModelRequest

The request containing the name of the model to get the information for.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<ShowModelResponse>

A task that represents the asynchronous operation. The task result contains the ShowModelResponse.
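A minimal sketch of retrieving a model's details. The request property (Model) and the response property (Template) are assumptions about ShowModelRequest and ShowModelResponse.

```csharp
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Fetch and print the prompt template of a locally available model.
var info = await client.ShowModelAsync(new ShowModelRequest { Model = "llama3" });
Console.WriteLine(info.Template);
```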