MistralClient.API.Chat (mistralex_ai v0.1.0)


Chat completions API for the Mistral AI client.

This module provides functions for creating chat completions, both streaming and non-streaming, with support for tools, function calling, and structured outputs.

Features

  • Standard chat completions
  • Streaming chat completions
  • Tool/function calling support
  • Structured output parsing
  • Temperature and sampling controls
  • Token usage tracking

Usage

# Basic chat completion
{:ok, response} = MistralClient.API.Chat.complete([
  %{role: "user", content: "Hello, how are you?"}
])

# Chat with options
{:ok, response} = MistralClient.API.Chat.complete(
  [%{role: "user", content: "Hello!"}],
  %{model: "mistral-large-latest", temperature: 0.7}
)

# Streaming chat
MistralClient.API.Chat.stream([
  %{role: "user", content: "Tell me a story"}
], fn chunk ->
  content = get_in(chunk, ["choices", Access.at(0), "delta", "content"])
  if content, do: IO.write(content)
end)

Summary

Functions

complete(config, request)
Create a chat completion.

complete(messages, options \\ %{}, client \\ nil)
Create a chat completion (legacy interface).

parse(messages, response_format, options \\ %{}, client \\ nil)
Create a chat completion with structured output parsing.

stream(config, request, callback \\ nil)
Create a streaming chat completion.

with_tools(messages, tools, options \\ %{}, client \\ nil)
Create a chat completion with tool/function calling.

Types

message()

@type message() :: MistralClient.Models.Message.t() | map()

options()

@type options() :: %{
  model: String.t(),
  temperature: float() | nil,
  max_tokens: integer() | nil,
  top_p: float() | nil,
  stream: boolean() | nil,
  tools: list() | nil,
  tool_choice: String.t() | map() | nil,
  response_format: map() | nil,
  safe_prompt: boolean() | nil,
  random_seed: integer() | nil
}

Functions

complete(config, request)

@spec complete(keyword() | MistralClient.Client.t(), map()) ::
  {:ok, MistralClient.Models.ChatCompletion.t()} | {:error, Exception.t()}

Create a chat completion.

Parameters

  • config - Configuration keyword list or Client struct
  • request - Request map with messages and options

Request Options

  • :model - Model to use (default: "mistral-large-latest")
  • :messages - List of message maps (required)
  • :temperature - Sampling temperature (0.0 to 1.0)
  • :max_tokens - Maximum tokens to generate
  • :top_p - Nucleus sampling parameter
  • :tools - List of available tools/functions
  • :tool_choice - Tool choice strategy
  • :response_format - Structured output format
  • :safe_prompt - Enable safety filtering
  • :random_seed - Random seed for reproducibility

Examples

config = [api_key: "your-api-key"]
request = %{
  "messages" => [%{"role" => "user", "content" => "Hello!"}],
  "model" => "mistral-tiny"
}
{:ok, response} = MistralClient.API.Chat.complete(config, request)
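On success, the result wraps a MistralClient.Models.ChatCompletion.t(). As a sketch of handling it, assuming the struct mirrors the Mistral API's JSON shape (a choices list whose first entry holds a message with content — field names not confirmed by this module's docs):

```elixir
# Sketch: extract the assistant's reply from a completion result.
# The :choices / :message / :content field names are assumed from the
# Mistral API's JSON response shape and may differ in this library.
case MistralClient.API.Chat.complete(config, request) do
  {:ok, %{choices: [%{message: %{content: content}} | _]}} ->
    IO.puts(content)

  {:error, error} ->
    IO.inspect(error, label: "chat completion failed")
end
```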

complete(messages, options \\ %{}, client \\ nil)

@spec complete([message()], options(), MistralClient.Client.t() | nil) ::
  {:ok, MistralClient.Models.ChatCompletion.t()} | {:error, Exception.t()}

Create a chat completion (legacy interface).

Parameters

  • messages - List of message maps or structs
  • options - Optional parameters for the completion
  • client - HTTP client (optional, uses default if not provided)

Examples

{:ok, response} = MistralClient.API.Chat.complete([
  %{role: "user", content: "What is the capital of France?"}
])

parse(messages, response_format, options \\ %{}, client \\ nil)

@spec parse([message()], map(), options(), MistralClient.Client.t() | nil) ::
  {:ok, MistralClient.Models.ChatCompletion.t()} | {:error, Exception.t()}

Create a chat completion with structured output parsing.

Parameters

  • messages - List of message maps or structs
  • response_format - JSON schema for structured output
  • options - Optional parameters for the completion
  • client - HTTP client (optional, uses default if not provided)

Examples

schema = %{
  type: "object",
  properties: %{
    name: %{type: "string"},
    age: %{type: "integer"}
  },
  required: ["name", "age"]
}

{:ok, response} = MistralClient.API.Chat.parse(
  [%{role: "user", content: "Extract: John is 25 years old"}],
  schema
)
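With a JSON-schema response_format, the model's reply should be a JSON document conforming to the schema. One way to decode it into a plain map — a sketch that assumes a JSON library such as Jason is available and that the completion exposes the same choices/message/content layout as the Mistral API's JSON (neither is confirmed by this module's docs):

```elixir
# Sketch: decode structured output into a map.
# Assumes Jason is a dependency and the reply content is the JSON
# document described by the schema above.
with {:ok, %{choices: [%{message: %{content: json}} | _]}} <-
       MistralClient.API.Chat.parse(
         [%{role: "user", content: "Extract: John is 25 years old"}],
         schema
       ),
     {:ok, data} <- Jason.decode(json) do
  # data is now a map with "name" and "age" keys per the schema
  {:ok, data}
end
```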

stream(config, request, callback \\ nil)

@spec stream(keyword() | MistralClient.Client.t(), map(), function() | nil) ::
  {:ok, Enumerable.t()} | :ok | {:error, Exception.t()}

Create a streaming chat completion.

Parameters

  • config - Configuration keyword list or Client struct
  • request - Request map with messages and options
  • callback - Function invoked with each chunk; when omitted, {:ok, stream} is returned with a lazy enumerable instead

Examples

config = [api_key: "your-api-key"]
request = %{
  "messages" => [%{"role" => "user", "content" => "Tell me a story"}],
  "model" => "mistral-tiny"
}
{:ok, stream} = MistralClient.API.Chat.stream(config, request)
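When no callback is given, the returned stream is a lazy Enumerable.t() that yields one chunk at a time. A sketch of draining it, reusing the chunk shape shown in the Usage section above:

```elixir
# Sketch: consume the stream, printing each content delta as it arrives.
{:ok, stream} = MistralClient.API.Chat.stream(config, request)

stream
|> Stream.each(fn chunk ->
  content = get_in(chunk, ["choices", Access.at(0), "delta", "content"])
  if content, do: IO.write(content)
end)
|> Stream.run()
```

Nothing is requested from the API until the stream is actually run, so the pipeline can be composed with further Stream transformations first.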

with_tools(messages, tools, options \\ %{}, client \\ nil)

@spec with_tools([message()], [map()], options(), MistralClient.Client.t() | nil) ::
  {:ok, MistralClient.Models.ChatCompletion.t()} | {:error, Exception.t()}

Create a chat completion with tool/function calling.

Parameters

  • messages - List of message maps or structs
  • tools - List of available tools/functions
  • options - Optional parameters for the completion
  • client - HTTP client (optional, uses default if not provided)

Tool Format

tools = [
  %{
    type: "function",
    function: %{
      name: "get_weather",
      description: "Get current weather for a location",
      parameters: %{
        type: "object",
        properties: %{
          location: %{type: "string", description: "City name"}
        },
        required: ["location"]
      }
    }
  }
]

Examples

{:ok, response} = MistralClient.API.Chat.with_tools(
  [%{role: "user", content: "What's the weather in Paris?"}],
  tools
)
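When the model elects to call a tool, the completion carries tool-call entries instead of (or alongside) plain text. A sketch of inspecting them — the :tool_calls, :function, and :arguments field names are assumed from the Mistral chat API's JSON shape and are not confirmed by this module's docs:

```elixir
# Sketch: dispatch on a tool call requested by the model.
{:ok, response} =
  MistralClient.API.Chat.with_tools(
    [%{role: "user", content: "What's the weather in Paris?"}],
    tools
  )

case response.choices do
  [%{message: %{tool_calls: [call | _]}} | _] ->
    # Arguments arrive as a raw JSON string to be decoded and passed
    # to your own implementation of the named function.
    IO.inspect(call.function.name, label: "tool requested")
    IO.inspect(call.function.arguments, label: "raw JSON arguments")

  _ ->
    IO.puts("model answered directly, no tool call")
end
```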