GenAI.ThreadProtocol protocol (GenAI Core v0.1.0)

Summary

Types

t()

All the types that implement this protocol.

Functions

Run inference.

stream(context, handler)
Start inference using a streaming handler.

with_api_key(context, provider, api_key)
Specify an API key for a provider.

with_api_org(context, provider, api_org)
Specify an API org for a provider.

with_message(context, message, options)
Add a message to the conversation.

with_messages(context, messages, options)
Add a list of messages to the conversation.

with_model(context, model)
Specify a model or a model picker.

with_safety_setting(context, safety_setting, threshold)

with_setting(context, setting, value)
Set a hyperparameter option.

with_tool(context, tool)

with_tools(context, tools)

Types

@type t() :: term()

All the types that implement this protocol.

Functions

Run inference.

This function performs the following steps:

  • Picks the appropriate model and hyperparameters based on the provided context and settings.
  • Performs any necessary pre-processing, such as RAG (Retrieval-Augmented Generation) or message consolidation.
  • Runs inference on the selected model with the prepared input.
  • Returns the inference result.
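Taken together, these steps usually sit at the end of a pipeline of `with_*` calls. The sketch below is hypothetical: the `GenAI.new_context/0` constructor and the final `GenAI.run/1` call are assumed names, since only the `with_*` protocol functions are documented here.

```elixir
# Hypothetical pipeline; `GenAI.new_context/0` and `GenAI.run/1` are assumed
# names -- only the `with_*` protocol functions are documented on this page.
{:ok, result} =
  GenAI.new_context()
  |> GenAI.ThreadProtocol.with_model(Model.smartest())
  |> GenAI.ThreadProtocol.with_message(%{role: :user, content: "Hello!"}, [])
  |> GenAI.run()
```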

stream(context, handler)

Start inference using a streaming handler.

If the selected model does not support streaming, the handler will be called with the final inference result.
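For example, a handler might print chunks as they arrive. The message shapes matched below (`{:chunk, delta}` and `{:done, result}`) are assumptions for illustration, not a documented contract:

```elixir
handler = fn
  # Partial output as it streams in (assumed message shape).
  {:chunk, delta} -> IO.write(delta)
  # Final result; also the only call made when streaming is unsupported.
  {:done, result} -> IO.inspect(result)
end

GenAI.ThreadProtocol.stream(context, handler)
```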


with_api_key(context, provider, api_key)

Specify an API key for a provider.


with_api_org(context, provider, api_org)

Specify an API org for a provider.
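Both credential helpers chain naturally onto a context. The `:openai` provider atom and the org id below are illustrative assumptions:

```elixir
# Provider atom and org id are illustrative assumptions.
context
|> GenAI.ThreadProtocol.with_api_key(:openai, System.fetch_env!("OPENAI_API_KEY"))
|> GenAI.ThreadProtocol.with_api_org(:openai, "org-1234")
```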


with_message(context, message, options)

Add a message to the conversation.


with_messages(context, messages, options)

Add a list of messages to the conversation.
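A sketch of both helpers together; the `%{role: ..., content: ...}` message shape is an assumed convention, and `[]` is passed for the undocumented options:

```elixir
context
# Add a single system message, no options.
|> GenAI.ThreadProtocol.with_message(%{role: :system, content: "Answer tersely."}, [])
# Seed the conversation with several prior turns in one call.
|> GenAI.ThreadProtocol.with_messages(
  [
    %{role: :user, content: "What is OTP?"},
    %{role: :assistant, content: "Erlang's runtime and standard libraries."},
    %{role: :user, content: "Expand on supervision trees."}
  ],
  []
)
```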


with_model(context, model)

Specify a model or a model picker.

This function sets the model to be used for inference. You can provide a specific model, or a picker such as Model.smartest() that dynamically selects the best model at inference time based on the context and available providers.

Examples:

  • Model.smartest() - This will select the "smartest" available model at inference time, based on factors like performance and capabilities.
  • Model.cheapest(params: :best_effort) - This will select the cheapest available model that can handle the given parameters and context size.
  • CustomProvider.custom_model - This allows you to use a custom model from a user-defined provider.
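The three selection styles above look like this in code (a sketch only; the context value and provider modules are taken from the examples):

```elixir
# Picker resolved at inference time.
context |> GenAI.ThreadProtocol.with_model(Model.smartest())

# Cheapest model able to satisfy the given parameters and context size.
context |> GenAI.ThreadProtocol.with_model(Model.cheapest(params: :best_effort))

# Model from a user-defined provider.
context |> GenAI.ThreadProtocol.with_model(CustomProvider.custom_model())
```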

with_safety_setting(context, safety_setting, threshold)

Set a provider safety setting and its threshold.


with_setting(context, setting, value)

Set a hyperparameter option.

Some options are model-specific. The value can be a literal or a picker function that dynamically determines the best value based on the context and model.

Examples:

  • Parameter.required(name, value) - This sets a required parameter with the specified name and value.
  • Gemini.best_temperature_for(:chain_of_thought) - This uses a picker function to determine the best temperature for the Gemini provider when using the "chain of thought" prompting technique.
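For instance, a literal value next to a picker-function value (the `:temperature` setting name is an assumption; `Gemini.best_temperature_for/1` comes from the example above):

```elixir
# Literal value.
context |> GenAI.ThreadProtocol.with_setting(:temperature, 0.2)

# Picker function, resolved against the context and selected model at
# inference time.
context
|> GenAI.ThreadProtocol.with_setting(
  :temperature,
  Gemini.best_temperature_for(:chain_of_thought)
)
```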

with_tool(context, tool)

Add a tool to the conversation.


with_tools(context, tools)

Add a list of tools to the conversation.
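A sketch of registering tools; the tool shape shown is purely an assumption, since the expected structure is not documented here:

```elixir
# Hypothetical tool definition -- the expected shape is an assumption.
weather_tool = %{
  name: "get_weather",
  description: "Look up current weather for a city.",
  parameters: %{city: :string}
}

# Register one tool, or several at once.
context |> GenAI.ThreadProtocol.with_tool(weather_tool)
context |> GenAI.ThreadProtocol.with_tools([weather_tool])
```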