Alchemind.OpenAI (AlchemindOpenAI v0.1.0-rc.1)

OpenAI provider implementation for the Alchemind LLM interface.

This module implements the Alchemind behaviour for interacting with OpenAI's API.

Summary

Functions

complete(client, messages, callback_or_opts \\ [], opts \\ [])

Completes a conversation using OpenAI's API. Streaming is not supported by this provider; see the note below.

Parameters

  • client: OpenAI client created with new/1
  • messages: List of messages in the conversation
  • callback_or_opts: Streaming callback (not supported by this provider; see the note below) or a keyword list of options
  • opts: Additional options for the completion request (when callback is provided)

Options

  • :model - OpenAI model to use (required unless specified in client)
  • :temperature - Controls randomness (0.0 to 2.0)
  • :max_tokens - Maximum number of tokens to generate

Examples

Using model in options:

iex> {:ok, client} = Alchemind.OpenAI.new(api_key: "sk-...")
iex> messages = [
...>   %{role: :system, content: "You are a helpful assistant."},
...>   %{role: :user, content: "Hello, world!"}
...> ]
iex> Alchemind.OpenAI.complete(client, messages, model: "gpt-4o", temperature: 0.7)

Using default model from client:

iex> {:ok, client} = Alchemind.OpenAI.new(api_key: "sk-...", model: "gpt-4o")
iex> messages = [
...>   %{role: :system, content: "You are a helpful assistant."},
...>   %{role: :user, content: "Hello, world!"}
...> ]
iex> Alchemind.OpenAI.complete(client, messages, temperature: 0.7)

Note: Streaming is not supported in the direct OpenAI implementation. Use OpenAILangChain for streaming support.
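As a rough sketch of consuming the result (assuming the success tuple wraps a map mirroring OpenAI's chat completion shape, with a choices list whose entries carry a message; the field names here are illustrative and not guaranteed by this module):

iex> {:ok, %{choices: [%{message: %{content: reply}} | _]}} =
...>   Alchemind.OpenAI.complete(client, messages, model: "gpt-4o")
iex> reply
"Hello! How can I assist you today?"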

complete_chat(client_resource, messages, model)

create_client(api_key, base_url)

new(opts \\ [])

Creates a new OpenAI client.

Options

  • :api_key - OpenAI API key (required)
  • :base_url - API base URL (default: https://api.openai.com/v1)
  • :model - Default model to use (optional, can be overridden in complete calls)

Examples

iex> Alchemind.OpenAI.new(api_key: "sk-...")
{:ok, <Rust client resource>}

iex> Alchemind.OpenAI.new(api_key: "sk-...", model: "gpt-4o")
{:ok, <Rust client resource>}

Returns

  • {:ok, client} - OpenAI client
  • {:error, reason} - Error with reason
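
As one possible pattern, the client can be built from an environment variable at runtime and the error tuple handled explicitly (reading OPENAI_API_KEY is a convention assumed here, not something this module requires):

iex> case Alchemind.OpenAI.new(api_key: System.get_env("OPENAI_API_KEY"), model: "gpt-4o") do
...>   {:ok, client} -> client
...>   {:error, reason} -> raise "could not create OpenAI client: #{inspect(reason)}"
...> end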

process_completion_chunk(client_resource, messages, model, pid, ref)

speech(client, input, opts \\ [])

Converts text to speech using OpenAI's API.

Parameters

  • client: OpenAI client created with new/1
  • input: Text to convert to speech
  • opts: Options for the speech request

Options

  • :model - OpenAI text-to-speech model to use (default: "gpt-4o-mini-tts")
  • :voice - Voice to use (default: "alloy")
  • :response_format - Format of the audio (default: "mp3")
  • :speed - Speed of the generated audio (optional)

Examples

iex> {:ok, client} = Alchemind.OpenAI.new(api_key: "sk-...")
iex> Alchemind.OpenAI.speech(client, "Hello, world!", voice: "echo")
{:ok, <<binary audio data>>}

Returns

  • {:ok, audio_binary} - Successful speech generation with audio binary
  • {:error, reason} - Error with reason
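
The returned binary can be written directly to disk, for example (a sketch; the output file name is arbitrary):

iex> {:ok, audio} = Alchemind.OpenAI.speech(client, "Hello, world!", voice: "alloy")
iex> File.write!("hello.mp3", audio)
:ok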

text_to_speech(client_resource, input, opts)

transcribe(client, audio_binary, opts \\ [])

Transcribes audio to text using OpenAI's API.

Parameters

  • client: OpenAI client created with new/1
  • audio_binary: Binary audio data
  • opts: Options for the transcription request

Options

  • :model - OpenAI transcription model to use (default: "whisper-1")
  • :language - Language of the audio (default: nil, auto-detect)
  • :prompt - Optional text to guide the model's transcription
  • :response_format - Format of the transcript (default: "json")
  • :temperature - Controls randomness (0.0 to 1.0, default: 0)

Examples

iex> {:ok, client} = Alchemind.OpenAI.new(api_key: "sk-...")
iex> audio_binary = File.read!("audio.mp3")
iex> Alchemind.OpenAI.transcribe(client, audio_binary, language: "en")
{:ok, "This is a transcription of the audio."}

Returns

  • {:ok, text} - Successful transcription with text
  • {:error, reason} - Error with reason
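
A small sketch of handling both outcomes with a case expression (the file path is illustrative):

iex> audio = File.read!("meeting.mp3")
iex> case Alchemind.OpenAI.transcribe(client, audio, language: "en") do
...>   {:ok, text} -> IO.puts(text)
...>   {:error, reason} -> IO.puts("transcription failed: #{inspect(reason)}")
...> end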

transcribe_audio(client_resource, audio_binary, opts)