MistralClient.API.Beta.Conversations (mistralex_ai v0.1.0)
Beta Conversations API for managing persistent conversations with agents.
This module provides functionality to create, manage, and interact with conversations that can persist across multiple interactions and maintain context.
Features
- Start conversations with agents or models
- Append messages to existing conversations
- Retrieve conversation history and messages
- Restart conversations from specific points
- Stream conversation responses in real-time
Usage
config = MistralClient.Config.new()

# Start a conversation with an agent
{:ok, conversation} = start(config, %{
  agent_id: "agent_123",
  inputs: "Hello, I need help with my order."
})

# Append to the conversation
{:ok, response} = append(config, conversation.conversation_id, %{
  inputs: "Can you check order #12345?"
})

# Get conversation history
{:ok, history} = history(config, conversation.conversation_id)
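Every function in this module returns a tagged tuple, so callers should match on both outcomes. A minimal sketch of that pattern (the shape of the error term is an assumption, not a documented contract):

```elixir
# Handle both success and failure; inspect/1 is used because the
# error term's structure is an assumption here.
case MistralClient.API.Beta.Conversations.start(config, %{
       agent_id: "agent_123",
       inputs: "Hello, I need help with my order."
     }) do
  {:ok, conversation} ->
    IO.puts("Started conversation #{conversation.conversation_id}")

  {:error, reason} ->
    IO.puts("Failed to start conversation: #{inspect(reason)}")
end
```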
Summary
Functions
Append new entries to an existing conversation.
Append new entries to an existing conversation with streaming.
Retrieve a specific conversation by ID.
Retrieve all entries in a conversation.
List conversations with optional pagination.
Retrieve all messages in a conversation.
Restart a conversation from a specific entry.
Restart a conversation from a specific entry with streaming.
Start a new conversation.
Start a new conversation with streaming responses.
Functions
@spec append(MistralClient.Config.t(), String.t(), map()) :: {:ok, MistralClient.Models.Beta.ConversationResponse.t()} | {:error, term()}
Append new entries to an existing conversation.
Parameters

- config - Client configuration
- conversation_id - The conversation ID
- request - Append request with:
  - :inputs - New message(s) to append (required)
  - :stream - Whether to stream responses (optional, default: false)
  - :store - Whether to store results (optional, default: true)
  - :handoff_execution - Handoff execution mode (optional)
  - :completion_args - Completion arguments (optional)
Examples
{:ok, response} = append(config, conversation_id, %{
  inputs: "What's the weather like today?"
})
@spec append_stream(MistralClient.Config.t(), String.t(), map(), function()) :: {:ok, term()} | {:error, term()}
Append new entries to an existing conversation with streaming.
Parameters

- config - Client configuration
- conversation_id - The conversation ID
- request - Append request (same as append/3)
- callback - Function to handle streaming chunks
Examples
append_stream(config, conversation_id, %{
  inputs: "Continue the story"
}, fn chunk ->
  IO.write(chunk.content || "")
end)
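Streaming callbacks often need to accumulate the full response rather than print it. A sketch using an Agent as the accumulator, assuming each chunk exposes a content field as in the example above:

```elixir
# Collect streamed chunk content into a single binary via an Agent.
{:ok, acc} = Agent.start_link(fn -> [] end)

append_stream(config, conversation_id, %{inputs: "Continue the story"}, fn chunk ->
  # Skip chunks with no content; build iodata to avoid repeated copying.
  if chunk.content, do: Agent.update(acc, &[&1, chunk.content])
end)

full_text = Agent.get(acc, &IO.iodata_to_binary/1)
Agent.stop(acc)
```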
@spec get(MistralClient.Config.t(), String.t()) :: {:ok, MistralClient.Models.Beta.Conversation.t()} | {:error, term()}
Retrieve a specific conversation by ID.
Parameters

- config - Client configuration
- conversation_id - The conversation ID to retrieve
Examples
{:ok, conversation} = get(config, "conv_123")
@spec history(MistralClient.Config.t(), String.t()) :: {:ok, MistralClient.Models.Beta.ConversationHistory.t()} | {:error, term()}
Retrieve all entries in a conversation.
Parameters

- config - Client configuration
- conversation_id - The conversation ID
Examples
{:ok, history} = history(config, conversation_id)
@spec list(MistralClient.Config.t(), map()) :: {:ok, [MistralClient.Models.Beta.Conversation.t()]} | {:error, term()}
List conversations with optional pagination.
Parameters

- config - Client configuration
- options - Optional parameters:
  - :page - Page number (default: 0)
  - :page_size - Number of conversations per page (default: 100)
Examples
{:ok, conversations} = list(config)
{:ok, conversations} = list(config, %{page: 1, page_size: 50})
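To walk every page, the pagination options can drive a lazy Stream. A sketch assuming an empty page marks the end of the results (how the API signals exhaustion is an assumption):

```elixir
# Lazily fetch pages until an empty page is returned.
all_conversations =
  Stream.iterate(0, &(&1 + 1))
  |> Stream.map(fn page ->
    {:ok, conversations} = list(config, %{page: page, page_size: 100})
    conversations
  end)
  |> Stream.take_while(&(&1 != []))
  |> Enum.concat()
```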
@spec messages(MistralClient.Config.t(), String.t()) :: {:ok, MistralClient.Models.Beta.ConversationHistory.t()} | {:error, term()}
Retrieve all messages in a conversation.
Parameters

- config - Client configuration
- conversation_id - The conversation ID
Examples
{:ok, messages} = messages(config, conversation_id)
@spec restart(MistralClient.Config.t(), String.t(), map()) :: {:ok, MistralClient.Models.Beta.ConversationResponse.t()} | {:error, term()}
Restart a conversation from a specific entry.
Parameters

- config - Client configuration
- conversation_id - The conversation ID
- request - Restart request with:
  - :inputs - New message(s) (required)
  - :from_entry_id - Entry ID to restart from (required)
  - :stream - Whether to stream responses (optional, default: false)
  - :store - Whether to store results (optional, default: true)
  - :handoff_execution - Handoff execution mode (optional)
  - :completion_args - Completion arguments (optional)
Examples
{:ok, response} = restart(config, conversation_id, %{
  inputs: "Let's try a different approach",
  from_entry_id: "entry_456"
})
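A :from_entry_id usually comes from the conversation's own entries. A sketch that fetches the history and restarts from its first entry; the entries field name and its id key are assumptions about the history struct:

```elixir
# Fetch history, pick an entry to branch from, then restart there.
{:ok, history} = history(config, conversation_id)
[first_entry | _] = history.entries  # field name is an assumption

{:ok, response} =
  restart(config, conversation_id, %{
    inputs: "Let's take this in another direction.",
    from_entry_id: first_entry.id  # key name is an assumption
  })
```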
@spec restart_stream(MistralClient.Config.t(), String.t(), map(), function()) :: {:ok, term()} | {:error, term()}
Restart a conversation from a specific entry with streaming.
Parameters

- config - Client configuration
- conversation_id - The conversation ID
- request - Restart request (same as restart/3)
- callback - Function to handle streaming chunks
Examples
restart_stream(config, conversation_id, %{
  inputs: "Let's start over",
  from_entry_id: "entry_123"
}, fn chunk ->
  IO.write(chunk.content || "")
end)
@spec start(MistralClient.Config.t(), map()) :: {:ok, MistralClient.Models.Beta.ConversationResponse.t()} | {:error, term()}
Start a new conversation.
Parameters

- config - Client configuration
- request - Conversation start request with:
  - :inputs - Initial message(s) (required)
  - :agent_id - Agent ID to use (optional, mutually exclusive with :model)
  - :model - Model to use (optional, mutually exclusive with :agent_id)
  - :instructions - Custom instructions (optional)
  - :tools - Tools available in conversation (optional)
  - :completion_args - Completion arguments (optional)
  - :name - Conversation name (optional)
  - :description - Conversation description (optional)
  - :store - Whether to store the conversation (optional, default: true)
  - :handoff_execution - Handoff execution mode (optional)
Examples
# Start with an agent
{:ok, conversation} = start(config, %{
  agent_id: "agent_123",
  inputs: "Hello, how can you help me?"
})

# Start with a model
{:ok, conversation} = start(config, %{
  model: "mistral-large-latest",
  inputs: "Explain quantum computing",
  instructions: "You are a physics teacher."
})
@spec start_stream(MistralClient.Config.t(), map(), function()) :: {:ok, term()} | {:error, term()}
Start a new conversation with streaming responses.
Parameters

- config - Client configuration
- request - Conversation start request (same as start/2)
- callback - Function to handle streaming chunks
Examples
start_stream(config, %{
  agent_id: "agent_123",
  inputs: "Tell me a story"
}, fn chunk ->
  IO.write(chunk.content || "")
end)
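When the stream should feed another process rather than print directly, the callback can forward chunks as messages. A minimal sketch, assuming chunks carry a content field; how stream completion is signaled is an assumption, so a real consumer would loop until its own done condition:

```elixir
# Forward streamed chunks to the calling process as messages.
parent = self()

start_stream(config, %{agent_id: "agent_123", inputs: "Tell me a story"}, fn chunk ->
  send(parent, {:chunk, chunk.content})
end)

# Drain one pending chunk; a real consumer would recurse until done.
receive do
  {:chunk, content} -> IO.write(content || "")
after
  0 -> :ok
end
```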