HyperLLM.Provider.Ollama (hyper_llm v0.6.0)

Provider implementation for Ollama.

https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion

Configuration

api_url - The URL for the Ollama API. Defaults to http://localhost:11434.

api_key - The API key for the Ollama API. Defaults to "ollama".

config :hyper_llm, 
  ollama: [
    api_url: "http://localhost:11434",
    api_key: "ollama"
  ]
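
The same keys can also be set at runtime, for example from environment variables. A minimal sketch, assuming a standard config/runtime.exs and hypothetical OLLAMA_URL / OLLAMA_API_KEY variables (neither is prescribed by the library):

# config/runtime.exs (hypothetical)
import Config

config :hyper_llm,
  ollama: [
    api_url: System.get_env("OLLAMA_URL", "http://localhost:11434"),
    api_key: System.get_env("OLLAMA_API_KEY", "ollama")
  ]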

Summary

Functions

completion(params, config \\ [])
See HyperLLM.Chat.completion/3 for more information.

model_supported?(_)
Ollama supports all models.

Functions

completion(params, config \\ [])

@spec completion(
  HyperLLM.Provider.completion_params(),
  HyperLLM.Provider.completion_config()
) ::
  {:ok, binary()} | {:error, binary()}

See HyperLLM.Chat.completion/3 for more information.
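A hedged usage sketch, assuming the OpenAI-style chat shape used elsewhere in HyperLLM (a :model string plus a list of :messages maps) and that the second argument accepts the same keys as the application config; the authoritative shapes are HyperLLM.Provider.completion_params() and HyperLLM.Provider.completion_config():

{:ok, response} =
  HyperLLM.Provider.Ollama.completion(
    %{
      model: "llama3.2",
      messages: [
        %{role: "user", content: "Why is the sky blue?"}
      ]
    },
    # Per-call config; falls back to the :hyper_llm application config.
    api_url: "http://localhost:11434",
    api_key: "ollama"
  )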

model_supported?(_)

Ollama supports all models.
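
Because every model name is accepted (the argument is ignored, as the _ pattern suggests), the check presumably returns true for any input:

iex> HyperLLM.Provider.Ollama.model_supported?("llama3.2")
true

iex> HyperLLM.Provider.Ollama.model_supported?("any-model-name")
true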