MistralClient.API.Agents (mistralex_ai v0.1.0)
Agents API for the Mistral AI client.
This module provides functions for creating agent completions, both streaming and non-streaming, with support for tools, function calling, and agent-specific configurations.
Features
- Agent-based chat completions
- Streaming agent completions
- Tool/function calling support for agents
- Agent-specific configurations
- Temperature and sampling controls
- Token usage tracking
Usage
# Basic agent completion
{:ok, response} = MistralClient.API.Agents.complete(
"agent-123",
[%{role: "user", content: "Hello, how are you?"}]
)
# Agent completion with options
{:ok, response} = MistralClient.API.Agents.complete(
"agent-123",
[%{role: "user", content: "Hello!"}],
%{temperature: 0.7, max_tokens: 100}
)
# Streaming agent completion
MistralClient.API.Agents.stream(
"agent-123",
[%{role: "user", content: "Tell me a story"}],
fn chunk ->
content = get_in(chunk, ["choices", Access.at(0), "delta", "content"])
if content, do: IO.write(content)
end
)
Summary
Functions
Create an agent completion.
Create an agent completion (legacy interface).
Create a streaming agent completion.
Create a streaming agent completion (legacy interface).
Create an agent completion with tool/function calling.
Types
@type message() :: MistralClient.Models.Message.t() | map()
@type options() :: %{
  temperature: float() | nil,
  max_tokens: integer() | nil,
  top_p: float() | nil,
  stream: boolean() | nil,
  tools: list() | nil,
  tool_choice: String.t() | map() | nil,
  response_format: map() | nil,
  random_seed: integer() | nil,
  stop: String.t() | [String.t()] | nil,
  presence_penalty: float() | nil,
  frequency_penalty: float() | nil,
  n: integer() | nil,
  prediction: map() | nil,
  parallel_tool_calls: boolean() | nil
}
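For illustration, a sketch of an options map built from the fields above, assuming keys you do not need can simply be omitted:
options = %{
  temperature: 0.7,
  max_tokens: 256,
  stop: ["END"],
  parallel_tool_calls: false
}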
Functions
@spec complete(keyword() | MistralClient.Client.t() | MistralClient.Config.t(), map()) :: {:ok, MistralClient.Models.ChatCompletion.t()} | {:error, Exception.t()}
Create an agent completion.
Parameters
config
- Configuration keyword list or Client struct
request
- Request map with agent_id, messages and options
Request Options
:agent_id
- Agent ID to use for completion (required)
:messages
- List of message maps (required)
:temperature
- Sampling temperature (0.0 to 2.0)
:max_tokens
- Maximum tokens to generate
:top_p
- Nucleus sampling parameter
:tools
- List of available tools/functions
:tool_choice
- Tool choice strategy
:response_format
- Structured output format
:random_seed
- Random seed for reproducibility
:stop
- Stop sequences
:presence_penalty
- Presence penalty (-2.0 to 2.0)
:frequency_penalty
- Frequency penalty (-2.0 to 2.0)
:n
- Number of completions to return
:prediction
- Prediction configuration
:parallel_tool_calls
- Enable parallel tool calls
Examples
config = [api_key: "your-api-key"]
request = %{
"agent_id" => "agent-123",
"messages" => [%{"role" => "user", "content" => "Hello!"}]
}
{:ok, response} = MistralClient.API.Agents.complete(config, request)
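A sketch of a request that also carries options, assuming string keys are accepted for them the same way as for "agent_id" and "messages"; the result is inspected rather than pattern-matched because the exact fields of the ChatCompletion struct are not shown here:
request = %{
  "agent_id" => "agent-123",
  "messages" => [%{"role" => "user", "content" => "Summarize our refund policy."}],
  "temperature" => 0.2,
  "max_tokens" => 200
}

case MistralClient.API.Agents.complete(config, request) do
  {:ok, response} ->
    # Inspect the completion; fields depend on MistralClient.Models.ChatCompletion.
    IO.inspect(response)

  {:error, error} ->
    IO.inspect(error, label: "agent completion failed")
end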
@spec complete(String.t(), [message()], options(), MistralClient.Client.t() | nil) :: {:ok, MistralClient.Models.ChatCompletion.t()} | {:error, Exception.t()}
Create an agent completion (legacy interface).
Parameters
agent_id
- Agent ID to use for completion
messages
- List of message maps or structs
options
- Optional parameters for the completion
client
- HTTP client (optional, uses default if not provided)
Examples
{:ok, response} = MistralClient.API.Agents.complete(
"agent-123",
[%{role: "user", content: "What is the capital of France?"}]
)
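Options are passed as the third argument. A minimal sketch using a couple of fields from options():
{:ok, response} = MistralClient.API.Agents.complete(
  "agent-123",
  [%{role: "user", content: "Give me three haiku ideas."}],
  %{temperature: 0.9, max_tokens: 150}
)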
@spec stream( keyword() | MistralClient.Client.t() | MistralClient.Config.t(), map(), function() | nil ) :: {:ok, Enumerable.t()} | :ok | {:error, Exception.t()}
Create a streaming agent completion.
Parameters
config
- Configuration keyword list or Client struct
request
- Request map with agent_id, messages and options
callback
- Function to handle each chunk (optional, returns stream if not provided)
Examples
config = [api_key: "your-api-key"]
request = %{
"agent_id" => "agent-123",
"messages" => [%{"role" => "user", "content" => "Tell me a story"}]
}
{:ok, stream} = MistralClient.API.Agents.stream(config, request)
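When no callback is given, the call returns an enumerable of chunks. A sketch of consuming it, assuming each chunk is shaped like the maps handed to callbacks elsewhere in this module:
{:ok, stream} = MistralClient.API.Agents.stream(config, request)

stream
|> Stream.each(fn chunk ->
  # Print the incremental content of the first choice, if present.
  content = get_in(chunk, ["choices", Access.at(0), "delta", "content"])
  if content, do: IO.write(content)
end)
|> Stream.run()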
@spec stream_legacy( String.t(), [message()], function(), options(), MistralClient.Client.t() | nil ) :: :ok | {:error, Exception.t()}
Create a streaming agent completion (legacy interface).
Parameters
agent_id
- Agent ID to use for completion
messages
- List of message maps or structs
callback
- Function to handle each chunk
options
- Optional parameters for the completion
client
- HTTP client (optional, uses default if not provided)
Examples
MistralClient.API.Agents.stream_legacy(
"agent-123",
[%{role: "user", content: "Tell me a story"}],
fn chunk ->
content = get_in(chunk, ["choices", Access.at(0), "delta", "content"])
if content, do: IO.write(content)
end
)
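Options go in the fourth argument, after the callback. A minimal sketch reusing the same callback with sampling controls:
callback = fn chunk ->
  content = get_in(chunk, ["choices", Access.at(0), "delta", "content"])
  if content, do: IO.write(content)
end

MistralClient.API.Agents.stream_legacy(
  "agent-123",
  [%{role: "user", content: "Tell me a story"}],
  callback,
  %{temperature: 0.3, max_tokens: 500}
)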
@spec with_tools( String.t(), [message()], [map()], options(), MistralClient.Client.t() | nil ) :: {:ok, MistralClient.Models.ChatCompletion.t()} | {:error, Exception.t()}
Create an agent completion with tool/function calling.
Parameters
agent_id
- Agent ID to use for completion
messages
- List of message maps or structs
tools
- List of available tools/functions
options
- Optional parameters for the completion
client
- HTTP client (optional, uses default if not provided)
Tool Format
tools = [
%{
type: "function",
function: %{
name: "get_weather",
description: "Get current weather for a location",
parameters: %{
type: "object",
properties: %{
location: %{type: "string", description: "City name"}
},
required: ["location"]
}
}
}
]
Examples
{:ok, response} = MistralClient.API.Agents.with_tools(
"agent-123",
[%{role: "user", content: "What's the weather in Paris?"}],
tools
)
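When the agent chooses to call a tool, the assistant message in the completion carries the tool call rather than plain text. A sketch of checking for that, kept generic because the exact fields of the ChatCompletion struct are not shown here:
{:ok, response} = MistralClient.API.Agents.with_tools(
  "agent-123",
  [%{role: "user", content: "What's the weather in Paris?"}],
  tools
)

# Inspect the completion to see whether the agent answered directly or
# requested a get_weather tool call; the struct layout depends on
# MistralClient.Models.ChatCompletion.
IO.inspect(response)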