Ollama (Ollama v0.1.0)
Ollama is a nifty little tool for running large language models locally, and this is a nifty little library for working with Ollama in Elixir.
Highlights
- API client fully implementing the Ollama API: Ollama.API
- Server module implementing OpenAI-compatible completion and chat endpoints, proxying through to Ollama (coming soon)
Installation
The package can be installed by adding ollama to your list of dependencies in mix.exs:
```elixir
def deps do
  [
    {:ollama, "~> 0.1"}
  ]
end
```
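Once installed, usage might look like the sketch below. It assumes a local Ollama server running on the default port and a model (here "llama2", an illustrative choice) already pulled with `ollama pull`; the exact function signatures may differ, so consult the Ollama.API module docs for the authoritative API.

```elixir
# Sketch: create a client for the local Ollama server, then request a
# completion. The model name "llama2" is an assumption; any locally
# pulled model works.
api = Ollama.API.new()

case Ollama.API.completion(api, model: "llama2", prompt: "Why is the sky blue?") do
  {:ok, res} -> IO.inspect(res)
  {:error, reason} -> IO.inspect(reason, label: "request failed")
end
```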