HyperLLM.Provider.Ollama (hyper_llm v0.6.0)
Provider implementation for Ollama.
https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion
Configuration
api_url - The URL for the Ollama API. Defaults to http://localhost:11434.

api_key - The API key for the Ollama API. Defaults to ollama.
config :hyper_llm,
  ollama: [
    api_url: "http://localhost:11434",
    api_key: "ollama"
  ]
Summary
Functions
@spec completion(
        HyperLLM.Provider.completion_params(),
        HyperLLM.Provider.completion_config()
      ) :: {:ok, binary()} | {:error, binary()}
See HyperLLM.Chat.completion/3 for more information.
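As a rough illustration, a direct call to this provider's completion/2 might look like the sketch below. The exact shapes of completion_params and completion_config are not documented on this page, so the keys used here (model, messages, and the config keywords) are assumptions inferred from the configuration section above, not a confirmed API.

# Sketch only: assumes a local Ollama server with the "llama3.2" model
# pulled, and assumes the params/config shapes guessed below.
params = %{
  model: "llama3.2",
  messages: [%{role: "user", content: "Hello!"}]
}

config = [api_url: "http://localhost:11434", api_key: "ollama"]

case HyperLLM.Provider.Ollama.completion(params, config) do
  {:ok, response} -> IO.puts(response)
  {:error, reason} -> IO.puts("completion failed: #{inspect(reason)}")
end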
Ollama supports all models: the provider does not restrict the model name, so any model available to your Ollama server can be requested.
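In practice the workflow is to pull a model with the Ollama CLI and then reference the same tag from Elixir. A hedged sketch follows; the model tag is a placeholder, and the lack of an enforced allow-list is an assumption based on the "supports all models" note above.

# Terminal: fetch a model first (tag is an example).
#   $ ollama pull mistral
#
# Then reference the same tag from Elixir.
{:ok, reply} =
  HyperLLM.Provider.Ollama.completion(
    %{model: "mistral", messages: [%{role: "user", content: "Ping"}]},
    api_url: "http://localhost:11434"
  )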