Module ollama_handler

Ollama Handler Library: a generic library to interact with the Ollama API in a simple and flexible way.

Description

Ollama Handler Library: a generic library to interact with the Ollama API in a simple and flexible way. It supports both a default configuration and a custom configuration per request. Environment variables can be used to override the default settings.

Data Types

config()

config() = #{endpoint => string(),
             chat_endpoint => string(),
             model => binary(),
             stream => boolean(),
             temperature => float(),
             max_tokens => integer(),
             system_prompt => binary(),
             additional_options => map()}
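For example, a fully populated config() map might look like the following (the values shown are illustrative; the actual fallbacks are listed under get_env_config/0):

```erlang
%% Illustrative config() map; any subset of keys may be supplied.
Config = #{endpoint => "http://localhost:11434/api/generate",
           chat_endpoint => "http://localhost:11434/api/chat",
           model => <<"llama2">>,
           stream => false,
           temperature => 0.7,
           max_tokens => 1000,
           system_prompt => <<"You are a concise assistant.">>,
           additional_options => #{}}.
```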

message()

message() = #{role => binary(), content => binary()}

messages()

messages() = [message()]

ollama_result()

ollama_result() = {ok, binary()} | {error, term()}

Function Index

chat/1 Chat completion using messages format with default/environment configuration.
chat/2 Chat completion using messages format with custom configuration.
default_config/0 Get default hardcoded configuration.
format_prompt/2 Format a prompt template with given arguments.
generate/1 Generate text using a simple prompt with default/environment configuration.
generate/2 Generate text using a simple prompt with custom configuration.
generate_with_context/2 Generate text with additional context using default/environment configuration.
generate_with_context/3 Generate text with additional context using custom configuration.
get_env_config/0 Get configuration from environment variables with fallback to defaults.
merge_config/2 Merge two configurations, with the second one taking precedence.
print_result/1 Print the result of an Ollama operation to stdout.

Function Details

chat/1

chat(Messages::messages()) -> ollama_result()

Chat completion using messages format with default/environment configuration.
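For example (a usage sketch; assumes a local Ollama server is reachable):

```erlang
%% Build a messages() list and handle the ollama_result() return value.
Messages = [#{role => <<"system">>, content => <<"You are a helpful assistant.">>},
            #{role => <<"user">>,   content => <<"Why is the sky blue?">>}],
case ollama_handler:chat(Messages) of
    {ok, Reply}     -> io:format("~s~n", [Reply]);
    {error, Reason} -> io:format("chat failed: ~p~n", [Reason])
end.
```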

chat/2

chat(Messages::messages(), Config::config()) -> ollama_result()

Chat completion using messages format with custom configuration.
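For example, overriding only some settings for a single request (a usage sketch; the model name is illustrative):

```erlang
%% Sketch: override only the model and temperature for this request.
Config = #{model => <<"mistral">>, temperature => 0.2},
Messages = [#{role => <<"user">>, content => <<"List three Erlang OTP behaviours.">>}],
Result = ollama_handler:chat(Messages, Config).
```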

default_config/0

default_config() -> config()

Get default hardcoded configuration.

format_prompt/2

format_prompt(Template::string(), Args::list()) -> binary()

Format a prompt template with given arguments. Similar to io_lib:format/2, but returns a binary.
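For example (a usage sketch; the template and arguments are illustrative):

```erlang
%% Sketch: build a binary prompt from a template, io_lib:format-style.
Prompt = ollama_handler:format_prompt("Summarize the following in ~p words:~n~s",
                                      [50, "Erlang is a concurrent language."]),
true = is_binary(Prompt).
```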

generate/1

generate(Prompt::string() | binary()) -> ollama_result()

Generate text using a simple prompt with default/environment configuration.
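For example (a usage sketch; assumes a local Ollama server is reachable):

```erlang
case ollama_handler:generate(<<"Write a haiku about Erlang processes.">>) of
    {ok, Text}      -> io:format("~s~n", [Text]);
    {error, Reason} -> io:format("generate failed: ~p~n", [Reason])
end.
```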

generate/2

generate(Prompt::string() | binary(), Config::config()) -> ollama_result()

Generate text using a simple prompt with custom configuration.
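For example, limiting the response length for a single call (a usage sketch):

```erlang
%% Sketch: cap the response length for this call only.
Config = #{max_tokens => 64},
Result = ollama_handler:generate("Define tail recursion in one sentence.", Config).
```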

generate_with_context/2

generate_with_context(Context::string() | binary(), Prompt::string() | binary()) -> ollama_result()

Generate text with additional context using default/environment configuration.

generate_with_context/3

generate_with_context(Context::string() | binary(), Prompt::string() | binary(), Config::config()) -> ollama_result()

Generate text with additional context using custom configuration.
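For example (a usage sketch covering the three-argument form; the context, prompt, and config values are illustrative):

```erlang
Context = <<"Erlang processes are lightweight and share no memory.">>,
Prompt  = <<"Explain why this makes fault isolation easier.">>,
Config  = #{model => <<"llama2">>, temperature => 0.3},
Result  = ollama_handler:generate_with_context(Context, Prompt, Config).
```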

get_env_config/0

get_env_config() -> config()

Get configuration from environment variables with fallback to defaults.

Environment variables:
- OLLAMA_ENDPOINT: Ollama API endpoint (default: http://localhost:11434/api/generate)
- OLLAMA_CHAT_ENDPOINT: Ollama Chat API endpoint (default: http://localhost:11434/api/chat)
- OLLAMA_MODEL: Model name to use (default: llama2)
- OLLAMA_TEMPERATURE: Temperature for generation (default: 0.7)
- OLLAMA_MAX_TOKENS: Maximum tokens to generate (default: 1000)
- OLLAMA_STREAM: Whether to stream responses (default: false)
- OLLAMA_SYSTEM_PROMPT: System prompt to use
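For example, setting overrides from within the Erlang VM before reading the configuration (a usage sketch; the values are illustrative):

```erlang
%% Sketch: override defaults via environment variables, then read the config.
os:putenv("OLLAMA_MODEL", "mistral"),
os:putenv("OLLAMA_TEMPERATURE", "0.2"),
Config = ollama_handler:get_env_config().
%% Config now reflects the overridden model and temperature;
%% unset variables fall back to the defaults listed above.
```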

merge_config/2

merge_config(BaseConfig::config(), OverrideConfig::config()) -> config()

Merge two configurations, with the second one taking precedence.
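For example (a usage sketch; the override values are illustrative):

```erlang
Base     = ollama_handler:default_config(),
Override = #{model => <<"mistral">>, temperature => 0.2},
Merged   = ollama_handler:merge_config(Base, Override),
%% Keys present in Override win; all other keys come from Base.
<<"mistral">> = maps:get(model, Merged).
```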

print_result/1

print_result(Result::ollama_result()) -> ok | error

Print the result of an Ollama operation to stdout. Returns 'ok' if successful, 'error' otherwise.
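For example (a usage sketch; assumes a local Ollama server is reachable):

```erlang
%% Sketch: print a result and capture the ok | error status.
Status = ollama_handler:print_result(ollama_handler:generate(<<"Say hello.">>)).
%% Status is ok on success, error otherwise.
```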


Generated by EDoc