LlmEx.Providers.OllamaClient (LlmEx v0.1.0)

Client for communicating with a local Ollama LLM server. Sends prompts and streams responses back to a specified process.

Summary

Functions

Converts a Message struct to the local LLM format.

Streams a chat response from the local LLM server to the given pid.

Converts a Tool struct to the local LLM format.

Functions

message_to_provider_format(message)

Converts a Message struct to the local LLM format.
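
The exact field mapping is not shown on this page, but the conversion is essentially a struct-to-map translation. A minimal sketch, assuming the Message struct exposes :role and :content fields (both field names are assumptions) and that Ollama's chat API expects string-keyed "role"/"content" maps:

    # Hypothetical sketch -- Message field names are assumed,
    # not taken from the actual LlmEx source.
    def message_to_provider_format(%LlmEx.Message{role: role, content: content}) do
      %{
        "role" => to_string(role),
        "content" => content
      }
    end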

stream_chat(messages, message_id, pid, opts \\ [])

Streams a chat response from the local LLM server to the given pid.

Parameters

  • messages - List of Message structs representing the conversation history
  • message_id - ID for the response message
  • pid - The process ID to stream the response to
  • opts - Options for the request
    • :model - The model to use (default: "llama3.2")
    • :temperature - Sampling temperature controlling response randomness
    • :tools - List of Tool structs available to the LLM
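
A hedged usage sketch follows. The options and defaults come from the list above; the shape of the messages delivered to pid is not documented here, so the {:chunk, ...}/{:done, ...} tuples below are placeholders for illustration only:

    # Stream a reply to the calling process (message shapes assumed).
    messages = [%LlmEx.Message{role: :user, content: "Why is the sky blue?"}]

    LlmEx.Providers.OllamaClient.stream_chat(messages, "msg-123", self(),
      model: "llama3.2",
      temperature: 0.7
    )

    # Drain the stream; these tuple shapes are hypothetical placeholders.
    loop = fn loop ->
      receive do
        {:chunk, "msg-123", text} ->
          IO.write(text)
          loop.(loop)

        {:done, "msg-123"} ->
          IO.puts("")
      end
    end

    loop.(loop)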

tool_to_provider_format(tool)

Converts a Tool struct to the local LLM format.
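
As with messages, the conversion is presumably a struct-to-map translation. A sketch assuming the Tool struct exposes :name, :description, and :parameters fields (assumed names) and that Ollama accepts the common "function" tool shape:

    # Hypothetical sketch -- Tool field names and the target JSON shape
    # are assumptions, not confirmed by this page.
    def tool_to_provider_format(tool) do
      %{
        "type" => "function",
        "function" => %{
          "name" => tool.name,
          "description" => tool.description,
          "parameters" => tool.parameters
        }
      }
    end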