GenAI (GenAI Core v0.2.0)

Summary

Functions

Run inference, returning the chat completion and updated thread state.

Run inference in streaming mode. Any interstitial (dynamic) messages will be sent to the stream handler via the interstitial handler.

Set API Key or API Key constraint for inference. @todo we will need per-model keys for Ollama and Hugging Face.

Set API Org or API Org constraint for inference.

Append message to thread. @note Message may be dynamic/generated.

Append messages to thread. @note Messages may be dynamic/generated.

Set model or model selector constraint for inference.

Set safety setting for inference. @note Only fully supported by Gemini; backwards compatibility can be enabled via prompting but will be less reliable.

Set setting or setting selector constraint for inference.

Set inference setting.

Set settings or setting selector constraints for inference.

Override streaming handler module.

Set tool for inference.

Set tools for inference.

Functions

chat(context_type \\ :default, options \\ nil)

Creates a new chat context.
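A minimal, hypothetical pipeline sketch combining `chat/2` with the builder functions documented below; the message map shape, the model placeholder, and the return shape of `run/1` are assumptions, not taken verbatim from this reference:

```elixir
# Illustrative only: `some_model` is a placeholder and the message map
# shape is an assumption; consult the provider modules for real structures.
{:ok, completion, updated_thread} =
  GenAI.chat()
  |> GenAI.with_model(some_model)
  |> GenAI.with_message(%{role: :user, content: "Hello!"})
  |> GenAI.run()
```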

execute(thread_context, command, context, options \\ nil)

Execute command.

# Notes

Used, for example, to retrieve a full report of a thread with an optimization-loop or data-loop command. Under normal processing, grid search loops that are not final/accepted are omitted from the response and a linear thread is returned. Execute mode, however, returns a graph of all runs (or metadata, depending on options) along with the grid search configuration.
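As a hedged sketch of the note above, retrieving the full run graph for a grid-search thread might look like the following; the `:report` command atom and the option keys are assumptions, not documented values:

```elixir
# Hypothetical: command atom and options are illustrative only.
{:ok, report, thread} =
  GenAI.execute(thread_context, :report, context, include: :metadata)
```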

report(thread_context, context, options \\ nil)

Shorthand for executing a report command.

run(thread_context)

Run inference, returning the chat completion and updated thread state.

run(thread_context, context, options \\ nil)

stream(thread_context, context, options \\ nil)

Run inference in streaming mode. Any interstitial (dynamic) messages will be sent to the stream handler via the interstitial handler.
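A streaming sketch, assuming a handler module has been set via `with_stream_handler/3`; the handler module name and the return shape of `stream/3` are assumptions:

```elixir
# MyApp.StreamHandler is a hypothetical module implementing the
# streaming-handler contract; interstitial messages are routed to it.
{:ok, thread} =
  thread_context
  |> GenAI.with_stream_handler(MyApp.StreamHandler)
  |> GenAI.stream(context)
```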

with_api_key(thead_context, provider, api_key)

Set API Key or API Key constraint for inference. @todo we will need per-model keys for Ollama and Hugging Face.
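Per-provider credentials might be supplied as below; the `:openai` provider atom and the environment-variable name are assumptions for illustration:

```elixir
# Hypothetical provider atom; credentials are read from the environment.
thread_context =
  GenAI.chat()
  |> GenAI.with_api_key(:openai, System.get_env("OPENAI_API_KEY"))
```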

with_api_org(thead_context, provider, api_org)

Set API Org or API Org constraint for inference.

with_message(thead_context, message, options \\ nil)

Append message to thread. @note Message may be dynamic/generated.

with_messages(thead_context, messages, options \\ nil)

Append messages to thread. @note Messages may be dynamic/generated.
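Appending several messages at once might look like this; the message map shape is an assumption, not defined in this reference:

```elixir
# Illustrative message maps; the real message struct may differ.
messages = [
  %{role: :system, content: "You are a terse assistant."},
  %{role: :user, content: "Summarize the previous discussion."}
]

thread_context = GenAI.with_messages(thread_context, messages)
```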

with_model(thead_context, model)

Set model or model selector constraint for inference.

with_model_setting(thead_context, model_setting)

See GenAI.ThreadProtocol.with_model_setting/2.

with_model_setting(thead_context, model, setting, value)

See GenAI.ThreadProtocol.with_model_setting/4.

with_provider_setting(thead_context, provider_setting)

See GenAI.ThreadProtocol.with_provider_setting/2.

with_provider_setting(thead_context, provider, setting, value)

See GenAI.ThreadProtocol.with_provider_setting/4.

with_provider_settings(thead_context, provider_settings)

See GenAI.ThreadProtocol.with_provider_settings/2.

with_provider_settings(thead_context, provider, provider_settings)

See GenAI.ThreadProtocol.with_provider_settings/3.

with_safety_setting(thead_context, safety_setting_object)

See GenAI.ThreadProtocol.with_safety_setting/2.

with_safety_setting(thead_context, safety_setting, threshold)

Set safety setting for inference. @note Only fully supported by Gemini; backwards compatibility can be enabled via prompting but will be less reliable.

with_setting(thead_context, setting_object)

Set setting or setting selector constraint for inference.

with_setting(thead_context, setting, value)

Set inference setting.
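Common inference settings might be applied as below; the specific setting atoms (`:temperature`, `:max_tokens`) are assumptions about what providers accept, not values documented here:

```elixir
# Hypothetical setting atoms for illustration.
thread_context =
  thread_context
  |> GenAI.with_setting(:temperature, 0.2)
  |> GenAI.with_setting(:max_tokens, 512)
```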

with_settings(thead_context, setting_object)

Set settings or setting selector constraints for inference.

with_stream_handler(context, handler, options \\ nil)

Override streaming handler module.

with_tool(thead_context, tool)

Set tool for inference.

with_tools(thead_context, tools)

Set tools for inference.
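A hedged sketch of registering tools for inference; the tool map shape (a JSON-Schema-style parameter spec) is an assumption, not defined in this reference:

```elixir
# Hypothetical tool definition; the real tool struct may differ.
weather_tool = %{
  name: "get_weather",
  description: "Look up the current weather for a city",
  parameters: %{
    type: "object",
    properties: %{city: %{type: "string"}},
    required: ["city"]
  }
}

thread_context = GenAI.with_tools(thread_context, [weather_tool])
```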