LlmEx.Providers.OllamaClient (LlmEx v0.1.0)
Client for communicating with a local LLM server (Ollama). Provides functionality to send prompts and stream responses back to a specified process.
Summary
Functions
Converts a Message struct to the local LLM format.
Streams a chat response from the local LLM server to the given pid.
Converts a Tool struct to the local LLM format.
Functions
Converts a Message struct to the local LLM format.
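The docs do not show the conversion itself, but the Ollama chat API expects each message as a map with role and content keys. A minimal sketch, assuming a Message struct with :role and :content fields (the struct fields and module name here are illustrative, not confirmed by these docs):

```elixir
defmodule MessageSketch do
  # Hypothetical stand-in for LlmEx's Message struct.
  defstruct role: :user, content: ""

  # Converts the struct to the map shape the Ollama chat API expects,
  # e.g. %{role: "user", content: "Hello"}.
  def to_ollama(%__MODULE__{role: role, content: content}) do
    %{role: to_string(role), content: content}
  end
end
```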
Streams a chat response from the local LLM server to the given pid.
Parameters

- messages - List of Message structs representing the conversation history
- message_id - ID for the response message
- pid - The process ID to stream the response to
- opts - Options for the request:
  - :model - The model to use (default: llama3.2)
  - :temperature - Temperature parameter for response randomness
  - :tools - List of Tool structs available to the LLM
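The docs describe the pattern (a client streams response chunks to a given pid) without showing the message shapes. A self-contained sketch of that pattern, where the {:chunk, message_id, text} and {:done, message_id} tuples are assumptions for illustration, not the library's actual protocol:

```elixir
defmodule StreamSketch do
  # Simulates streaming a chat response: each chunk is sent to `pid`,
  # tagged with `message_id` so the receiver can correlate responses.
  def stream_chat(_messages, message_id, pid, _opts \\ []) do
    for text <- ["Hello", ", ", "world"] do
      send(pid, {:chunk, message_id, text})
    end

    send(pid, {:done, message_id})
    :ok
  end

  # Receives chunks for `message_id` until :done, accumulating the text.
  def collect(message_id, acc \\ "") do
    receive do
      {:chunk, ^message_id, text} -> collect(message_id, acc <> text)
      {:done, ^message_id} -> acc
    end
  end
end
```

Sending to a pid rather than returning the full response lets the caller (often a LiveView or GenServer) render partial output as it arrives.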
Converts a Tool struct to the local LLM format.
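The Ollama chat API accepts tools as maps of the form %{type: "function", function: %{name, description, parameters}}. A sketch of the conversion, assuming a Tool struct with :name, :description, and :parameters fields (the struct shape is an assumption; only the target Ollama format is taken from the Ollama API):

```elixir
defmodule ToolSketch do
  # Hypothetical stand-in for LlmEx's Tool struct.
  defstruct [:name, :description, parameters: %{}]

  # Wraps the struct in the "function" tool envelope the Ollama API expects.
  def to_ollama(%__MODULE__{} = tool) do
    %{
      type: "function",
      function: %{
        name: tool.name,
        description: tool.description,
        parameters: tool.parameters
      }
    }
  end
end
```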