Class OllamaApiClientExtensions

Namespace
OllamaSharp
Assembly
OllamaSharp.dll

Extension methods to simplify the usage of the IOllamaApiClient.

public static class OllamaApiClientExtensions
Inheritance
object → OllamaApiClientExtensions
Methods

CopyModelAsync(IOllamaApiClient, string, string, CancellationToken)

Sends a request to the /api/copy endpoint to copy a model.

public static Task CopyModelAsync(this IOllamaApiClient client, string source, string destination, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

source string

The name of the existing model to copy.

destination string

The name the copied model should get.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task

A task that represents the asynchronous operation.
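A minimal usage sketch (the base URL and model names are illustrative):

```csharp
using OllamaSharp;

// Connect to a locally running Ollama server (default port).
var client = new OllamaApiClient("http://localhost:11434");

// Duplicate an existing model under a new name.
await client.CopyModelAsync("llama3", "llama3-backup");
```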

DeleteModelAsync(IOllamaApiClient, string, CancellationToken)

Sends a request to the /api/delete endpoint to delete a model.

public static Task DeleteModelAsync(this IOllamaApiClient client, string model, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

model string

The name of the model to delete.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task

A task that represents the asynchronous operation.
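A minimal usage sketch (the base URL and model name are illustrative):

```csharp
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

// Remove a model that is no longer needed.
await client.DeleteModelAsync("llama3-backup");
```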

EmbedAsync(IOllamaApiClient, string, CancellationToken)

Sends a request to the /api/embed endpoint to generate embeddings for the currently selected model.

public static Task<EmbedResponse> EmbedAsync(this IOllamaApiClient client, string input, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

input string

The input text to generate embeddings for.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<EmbedResponse>

An EmbedResponse containing the generated embeddings.
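Because this overload targets the currently selected model, set SelectedModel before calling it. A sketch (the base URL and model name are illustrative):

```csharp
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");
client.SelectedModel = "all-minilm"; // an embedding-capable model

// response.Embeddings holds the resulting embedding vectors.
var response = await client.EmbedAsync("The sky is blue because of Rayleigh scattering.");
```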

GenerateAsync(IOllamaApiClient, string, ConversationContext?, CancellationToken)

Sends a request to the /api/generate endpoint to get a completion and streams the returned chunks as they arrive, so they can be used to update the user interface in real time.

public static IAsyncEnumerable<GenerateResponseStream?> GenerateAsync(this IOllamaApiClient client, string prompt, ConversationContext? context = null, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

prompt string

The prompt to generate a completion for.

context ConversationContext

The context that keeps the conversation history for a chat-like exchange. Pass the context returned by an earlier call to continue the same conversation; can be null for the first call.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<GenerateResponseStream>

An async enumerable that can be used to iterate over the streamed responses. See GenerateResponseStream.
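A streaming sketch; passing null for the context starts a fresh conversation (the base URL and model name are illustrative):

```csharp
using System;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");
client.SelectedModel = "llama3";

// Print each streamed chunk as it arrives.
await foreach (var chunk in client.GenerateAsync("Why is the sky blue?", context: null))
    Console.Write(chunk?.Response);
```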

PullModelAsync(IOllamaApiClient, string, CancellationToken)

Sends a request to the /api/pull endpoint to pull a new model.

public static IAsyncEnumerable<PullModelResponse?> PullModelAsync(this IOllamaApiClient client, string model, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

model string

The name of the model to pull.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<PullModelResponse>

An async enumerable that can be used to iterate over the streamed responses. See PullModelResponse.
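A sketch that reports download progress while pulling; the Percent property is assumed from PullModelResponse, and the base URL and model name are illustrative:

```csharp
using System;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

// Report progress as the model is downloaded.
await foreach (var status in client.PullModelAsync("llama3"))
    Console.WriteLine($"{status?.Status} ({status?.Percent:0.0}%)");
```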

PushBlobAsync(IOllamaApiClient, byte[], CancellationToken)

Push a file to the Ollama server to create a "blob" (Binary Large Object).

public static Task PushBlobAsync(this IOllamaApiClient client, byte[] bytes, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

bytes byte[]

The bytes data of the file.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task

A task that represents the asynchronous operation.
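A minimal usage sketch (the base URL and file path are illustrative):

```csharp
using System.IO;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

// Upload a local file's contents as a blob on the server.
var bytes = await File.ReadAllBytesAsync("weights.gguf");
await client.PushBlobAsync(bytes);
```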

PushModelAsync(IOllamaApiClient, string, CancellationToken)

Sends a request to the /api/push endpoint to push a new model.

public static IAsyncEnumerable<PushModelResponse?> PushModelAsync(this IOllamaApiClient client, string name, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

name string

The name of the model to push.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

IAsyncEnumerable<PushModelResponse>

An async enumerable that can be used to iterate over the streamed responses. See PushModelResponse.
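A minimal usage sketch (the base URL and model name are illustrative):

```csharp
using System;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

// Push a model to a registry and log each status update.
await foreach (var status in client.PushModelAsync("mynamespace/mymodel:latest"))
    Console.WriteLine(status?.Status);
```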

RequestModelUnloadAsync(IOllamaApiClient, string, CancellationToken)

Sends a request to the /api/generate endpoint with keep_alive set to 0 to immediately unload a model from memory.

public static Task RequestModelUnloadAsync(this IOllamaApiClient client, string model, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

model string

The name of the model to unload.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task
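A minimal usage sketch (the base URL and model name are illustrative):

```csharp
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

// Free the memory held by a loaded model right away.
await client.RequestModelUnloadAsync("llama3");
```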

ShowModelAsync(IOllamaApiClient, string, CancellationToken)

Sends a request to the /api/show endpoint to show the information of a model.

public static Task<ShowModelResponse> ShowModelAsync(this IOllamaApiClient client, string model, CancellationToken cancellationToken = default)

Parameters

client IOllamaApiClient

The client used to execute the command.

model string

The name of the model to get the information for.

cancellationToken CancellationToken

The token to cancel the operation with.

Returns

Task<ShowModelResponse>

A task that represents the asynchronous operation. The task result contains the ShowModelResponse with the model information.
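A minimal usage sketch; the Template property is assumed from ShowModelResponse, and the base URL and model name are illustrative:

```csharp
using System;
using OllamaSharp;

var client = new OllamaApiClient("http://localhost:11434");

// Inspect a model's metadata, e.g. its prompt template.
var info = await client.ShowModelAsync("llama3");
Console.WriteLine(info.Template);
```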