Ragex.AI.Provider.Ollama
(Ragex v0.13.0)
Ollama provider for running local LLMs.
Supports a local Ollama server running models such as llama2, mistral, and codellama.
Configuration
config :ragex, :ai_providers,
  ollama: [
    endpoint: "http://localhost:11434",
    model: "codellama",
    options: [
      temperature: 0.7
    ]
  ]
No API Key Required
Ollama runs locally, so no API key is needed.
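If the endpoint differs between environments, one option is to resolve it at runtime. The sketch below would live in config/runtime.exs; the RAGEX_OLLAMA_ENDPOINT variable name is an assumption for illustration, not something Ragex defines.

# config/runtime.exs — a minimal sketch; RAGEX_OLLAMA_ENDPOINT is a
# hypothetical environment variable name, not defined by Ragex.
import Config

config :ragex, :ai_providers,
  ollama: [
    endpoint: System.get_env("RAGEX_OLLAMA_ENDPOINT", "http://localhost:11434"),
    model: "codellama"
  ]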
Supported Models
Any model installed in your local Ollama instance:
llama2 - Meta's Llama 2
mistral - Mistral AI's model
codellama - Code-specialized Llama
phi - Microsoft's small language model
And many more from https://ollama.ai/library
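To see which models your local server actually exposes, you can query Ollama's /api/tags endpoint (part of Ollama's HTTP API) directly from IEx. This sketch is independent of Ragex and assumes the Jason library is available for JSON decoding.

# Lists the models installed in the local Ollama instance.
# Uses OTP's built-in :httpc client; Jason is an assumed dependency.
:inets.start()

{:ok, {{_, 200, _}, _headers, body}} =
  :httpc.request(:get, {~c"http://localhost:11434/api/tags", []}, [], [])

body
|> List.to_string()
|> Jason.decode!()
|> Map.fetch!("models")
|> Enum.map(& &1["name"])
|> IO.inspect(label: "installed models")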
Installation
Install Ollama from https://ollama.ai and pull a model:
ollama pull codellama
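After pulling a model, you can confirm it responds before wiring it into Ragex by posting to Ollama's /api/generate endpoint (part of Ollama's documented HTTP API). A minimal smoke test from IEx, again assuming Jason is available:

# Sends a one-off, non-streaming prompt to the pulled model as a smoke test.
:inets.start()

payload =
  Jason.encode!(%{model: "codellama", prompt: "Say hello in one sentence.", stream: false})

{:ok, {{_, 200, _}, _headers, body}} =
  :httpc.request(
    :post,
    {~c"http://localhost:11434/api/generate", [], ~c"application/json", payload},
    [],
    []
  )

body |> List.to_string() |> Jason.decode!() |> Map.fetch!("response") |> IO.puts()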