HyperLLM.Provider.Ollama (hyper_llm v0.5.0)
Provider implementation for Ollama.
https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion
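For orientation, the chat endpoint this provider wraps can be exercised directly; a minimal sketch using Req (the Req dependency and the "llama3.2" model name are assumptions for illustration, not part of hyper_llm):

    # Call Ollama's /api/chat endpoint directly, per the Ollama API docs linked above.
    # Req and the "llama3.2" model name are assumptions for illustration.
    response =
      Req.post!("http://localhost:11434/api/chat",
        json: %{
          model: "llama3.2",
          messages: [%{role: "user", content: "Hello!"}],
          stream: false
        }
      )

    # Non-streaming responses carry the assistant reply under "message".
    response.body["message"]["content"]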
Configuration
api_url - The URL for the Ollama API. Defaults to http://localhost:11434.

api_key - The API key for the Ollama API. Defaults to "ollama".
    config :hyper_llm,
      ollama: [
        api_url: "http://localhost:11434",
        api_key: "ollama"
      ]
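In production these values are often supplied at runtime rather than hard-coded; a minimal sketch for config/runtime.exs, where the environment variable names are assumptions, not something hyper_llm defines:

    # config/runtime.exs
    import Config

    config :hyper_llm,
      ollama: [
        # Hypothetical variable names; the defaults match the values above.
        api_url: System.get_env("OLLAMA_API_URL", "http://localhost:11434"),
        api_key: System.get_env("OLLAMA_API_KEY", "ollama")
      ]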
Summary
Functions
Ollama supports all models.
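That is, the provider does not restrict model names, since Ollama can serve whatever model has been pulled locally. A hypothetical sketch of such a check (the module and function name are illustrative, not hyper_llm's actual API):

    defmodule MyApp.AcceptAnyModel do
      # Illustrative only: a model check that accepts any model string,
      # mirroring "Ollama supports all models."
      def model_supported?(model) when is_binary(model), do: true
    end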