HyperLLM.Providers.Ollama (hyper_llm v0.4.0)
Provider implementation for Ollama.

See the Ollama chat completion API documentation: https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion
Configuration:

```elixir
config :hyper_llm,
  ollama: [
    api_url: "http://localhost:11434",
    api_key: "ollama"
  ]
```
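These values are ordinary application configuration, so they can be inspected at runtime with standard Elixir calls. A minimal sketch, using only the keys shown in the block above:

```elixir
# Fetch the Ollama provider settings from application config.
Application.get_env(:hyper_llm, :ollama)
#=> [api_url: "http://localhost:11434", api_key: "ollama"]
```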
Functions
Ollama supports all models: the provider does not restrict model names, so any model available on your local Ollama server can be used.
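For context, the endpoint this provider wraps is Ollama's `/api/chat`, documented at the link above. A minimal sketch of a direct request, using the Req HTTP client purely for illustration (this is not necessarily the client hyper_llm uses internally, and `"llama3.2"` is just an example model that must already be pulled locally):

```elixir
# Direct call to the Ollama chat completion endpoint that this provider wraps.
# Assumes a local Ollama server on the default port and Req as a dependency
# ({:req, "~> 0.5"}).
response =
  Req.post!("http://localhost:11434/api/chat",
    json: %{
      model: "llama3.2",
      messages: [%{role: "user", content: "Why is the sky blue?"}],
      stream: false
    }
  )

response.body["message"]["content"]
#=> "The sky appears blue because..."
```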