API Reference Omni v0.1.1

Modules

A Provider represents an LLM provider service. By fully implementing the Provider behaviour, a module can add support for any available LLM backend.

Provider implementation for the Anthropic Messages API. Use this Provider to chat with any of the Claude 3 models.

Provider implementation for the Google Gemini API. Use this Provider to chat with any of the Gemini models.

Provider implementation for Ollama, using the Ollama Chat API. Use this Provider to chat with virtually any local and open model (Llama 3, Mistral, Gemma, and many more).

An alternative Provider implementation for Ollama, using the Ollama Completion API. This Provider is preferred when you need fine-grained control over the prompt templates, which isn't possible using the normal chat API.

Provider implementation for the OpenAI Chat API. Use this Provider to chat with any of the GPT models.
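As a rough sketch of how a library structured around swappable Providers is typically used (the `Omni.generate/2` function name, the `:ollama` provider atom, and the option names below are assumptions for illustration, not confirmed by this page), chatting through one backend might look like:

```elixir
# Hypothetical usage sketch — Omni.generate/2, the :ollama provider atom,
# and the :model/:messages options are assumed, not taken from this page.
{:ok, response} =
  Omni.generate(:ollama,
    model: "llama3",
    messages: [
      %{role: "user", content: "Why is the sky blue?"}
    ]
  )
```

Because every backend implements the same Provider behaviour, switching from a local Ollama model to a hosted one should, in principle, only require changing the provider identifier and model name.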