Class OllamaApiClientExtensions

Namespace: OllamaSharp
Assembly: OllamaSharp.dll

Extension methods to simplify the usage of the IOllamaApiClient.

public static class OllamaApiClientExtensions

Inheritance: OllamaApiClientExtensions
Methods
CopyModelAsync(IOllamaApiClient, string, string, CancellationToken)
Sends a request to the /api/copy endpoint to copy a model.
public static Task CopyModelAsync(this IOllamaApiClient client, string source, string destination, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- source (string): The name of the existing model to copy.
- destination (string): The name the copied model should get.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- Task
A task that represents the asynchronous operation.
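A minimal usage sketch (the server URL and model names are illustrative, and a running Ollama instance is assumed):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Duplicate an installed model under a new name.
await client.CopyModelAsync("llama3", "llama3-backup");
```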
DeleteModelAsync(IOllamaApiClient, string, CancellationToken)
Sends a request to the /api/delete endpoint to delete a model.
public static Task DeleteModelAsync(this IOllamaApiClient client, string model, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- model (string): The name of the model to delete.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- Task
A task that represents the asynchronous operation.
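A minimal usage sketch (the model name is illustrative):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Remove a model by name.
await client.DeleteModelAsync("llama3-backup");
```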
EmbedAsync(IOllamaApiClient, string, CancellationToken)
Sends a request to the /api/embed endpoint to generate embeddings for the currently selected model.
public static Task<EmbedResponse> EmbedAsync(this IOllamaApiClient client, string input, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- input (string): The input text to generate embeddings for.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- Task<EmbedResponse>
A task that represents the asynchronous operation. The task result contains the EmbedResponse with the embeddings.
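Because the endpoint uses the client's currently selected model, a sketch might look like this (the SelectedModel assignment and the Embeddings property shape are assumptions based on typical OllamaSharp usage; the model name is illustrative):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));
client.SelectedModel = "all-minilm"; // an embedding-capable model

var response = await client.EmbedAsync("The sky is blue because of Rayleigh scattering.");
// Embeddings is assumed to hold one vector per input string.
Console.WriteLine(response.Embeddings[0].Length);
```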
GenerateAsync(IOllamaApiClient, string, ConversationContext?, CancellationToken)
Sends a request to the /api/generate endpoint to get a completion and streams the response chunks back as they arrive, which can be used to update the user interface in real time.
public static IAsyncEnumerable<GenerateResponseStream?> GenerateAsync(this IOllamaApiClient client, string prompt, ConversationContext? context = null, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- prompt (string): The prompt to generate a completion for.
- context (ConversationContext?): The context that keeps the conversation history for chat-like interactions. Pass the context returned from an earlier call when the calls belong to the same conversation. May be null initially.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- IAsyncEnumerable<GenerateResponseStream>
An async enumerable that can be used to iterate over the streamed responses. See GenerateResponseStream.
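A streaming sketch using await foreach (the SelectedModel assignment and the Response property on the chunk type are assumptions based on typical OllamaSharp usage):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));
client.SelectedModel = "llama3"; // illustrative model name

ConversationContext? context = null;
await foreach (var chunk in client.GenerateAsync("Why is the sky blue?", context))
{
    // Each chunk carries a partial completion; write it as it arrives.
    Console.Write(chunk?.Response);
}
```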
PullModelAsync(IOllamaApiClient, string, CancellationToken)
Sends a request to the /api/pull endpoint to pull a new model.
public static IAsyncEnumerable<PullModelResponse?> PullModelAsync(this IOllamaApiClient client, string model, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- model (string): The name of the model to pull.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- IAsyncEnumerable<PullModelResponse>
An async enumerable that can be used to iterate over the streamed responses. See PullModelResponse.
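A sketch that reports download progress while pulling (the Status and Percent property names are assumptions; adjust them to the actual PullModelResponse shape):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

await foreach (var status in client.PullModelAsync("llama3"))
{
    // Print each streamed progress update.
    Console.WriteLine($"{status?.Status} {status?.Percent:0.0}%");
}
```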
PushBlobAsync(IOllamaApiClient, byte[], CancellationToken)
Pushes a file to the Ollama server to create a "blob" (Binary Large Object).
public static Task PushBlobAsync(this IOllamaApiClient client, byte[] bytes, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- bytes (byte[]): The byte data of the file.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- Task
A task that represents the asynchronous operation.
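A minimal usage sketch (the file path is illustrative, and the example assumes the whole file fits in memory):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Upload a local model file as a blob.
var bytes = await File.ReadAllBytesAsync("model.gguf");
await client.PushBlobAsync(bytes);
```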
PushModelAsync(IOllamaApiClient, string, CancellationToken)
Sends a request to the /api/push endpoint to push a new model.
public static IAsyncEnumerable<PushModelResponse?> PushModelAsync(this IOllamaApiClient client, string name, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- name (string): The name of the model to push.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- IAsyncEnumerable<PushModelResponse>
An async enumerable that can be used to iterate over the streamed responses. See PushModelResponse.
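A streaming sketch (the namespaced model name is illustrative, and the Status property name is an assumption):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Pushing typically requires a model name namespaced to a registry account.
await foreach (var status in client.PushModelAsync("myuser/llama3-custom"))
{
    Console.WriteLine(status?.Status);
}
```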
RequestModelUnloadAsync(IOllamaApiClient, string, CancellationToken)
Sends a request to the /api/generate endpoint with keep_alive set to 0 to immediately unload a model from memory.
public static Task RequestModelUnloadAsync(this IOllamaApiClient client, string model, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- model (string): The name of the model to unload.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- Task
A task that represents the asynchronous operation.
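A minimal usage sketch (the model name is illustrative):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Free the memory held by a loaded model without waiting for keep_alive to expire.
await client.RequestModelUnloadAsync("llama3");
```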
ShowModelAsync(IOllamaApiClient, string, CancellationToken)
Sends a request to the /api/show endpoint to show the information of a model.
public static Task<ShowModelResponse> ShowModelAsync(this IOllamaApiClient client, string model, CancellationToken cancellationToken = default)
Parameters
- client (IOllamaApiClient): The client used to execute the command.
- model (string): The name of the model to get the information for.
- cancellationToken (CancellationToken): The token to cancel the operation with.
Returns
- Task<ShowModelResponse>
A task that represents the asynchronous operation. The task result contains the ShowModelResponse with the model information.
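A minimal usage sketch (the model name is illustrative, and the Template property on ShowModelResponse is an assumption based on the fields the /api/show endpoint typically returns):

```csharp
using OllamaSharp;

var client = new OllamaApiClient(new Uri("http://localhost:11434"));

var info = await client.ShowModelAsync("llama3");
// Inspect one of the returned fields, e.g. the prompt template.
Console.WriteLine(info.Template);
```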