ExLLM.Adapters.Mistral (ex_llm v0.5.0)
Mistral AI API adapter for ExLLM.
Mistral AI provides state-of-the-art language models including:
- Mistral Large: Most capable model for complex reasoning
- Mistral Medium: Balanced model for most use cases
- Mistral Small: Fast and cost-effective model
- Codestral: Specialized model for code generation
- Pixtral: Multimodal model with vision capabilities
Configuration
This adapter requires a Mistral AI API key and optionally a base URL.
Using Environment Variables
# Set environment variables
export MISTRAL_API_KEY="your-api-key"
export MISTRAL_MODEL="mistral/mistral-tiny" # optional
export MISTRAL_API_BASE="https://api.mistral.ai/v1" # optional
# Use with default environment provider
ExLLM.Adapters.Mistral.chat(messages, config_provider: ExLLM.ConfigProvider.Env)
Using Static Configuration
config = %{
  mistral: %{
    api_key: "your-api-key",
    model: "mistral/mistral-small-latest",
    base_url: "https://api.mistral.ai/v1" # optional
  }
}
{:ok, provider} = ExLLM.ConfigProvider.Static.start_link(config)
ExLLM.Adapters.Mistral.chat(messages, config_provider: provider)
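Since chat/2 returns a tagged tuple, callers typically pattern-match on both outcomes. A minimal sketch (the exact shape of the error term is an assumption here and may vary by failure mode):

```elixir
case ExLLM.Adapters.Mistral.chat(messages, config_provider: provider) do
  {:ok, response} ->
    IO.puts(response.content)

  {:error, reason} ->
    # `reason`'s structure is assumed; inspect it to distinguish
    # auth errors, rate limits, network failures, etc.
    IO.inspect(reason, label: "Mistral chat failed")
end
```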
Example Usage
messages = [
  %{role: "user", content: "Explain quantum computing"}
]
# Simple chat
{:ok, response} = ExLLM.Adapters.Mistral.chat(messages)
IO.puts(response.content)
# Streaming chat
{:ok, stream} = ExLLM.Adapters.Mistral.stream_chat(messages)
for chunk <- stream do
  if chunk.content, do: IO.write(chunk.content)
end
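If you want the full reply as a single string rather than printing chunks as they arrive, the stream can be reduced instead. A sketch assuming each chunk exposes a possibly-nil `content` field, as in the loop above:

```elixir
{:ok, stream} = ExLLM.Adapters.Mistral.stream_chat(messages)

full_text =
  stream
  |> Stream.map(& &1.content)
  |> Stream.reject(&is_nil/1)
  |> Enum.join()
```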
# Function calling
functions = [
  %{
    name: "get_weather",
    description: "Get weather information",
    parameters: %{
      type: "object",
      properties: %{
        location: %{type: "string", description: "City name"}
      }
    }
  }
]
{:ok, response} = ExLLM.Adapters.Mistral.chat(messages, tools: functions)
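When the model chooses to call a function, the response should carry the requested call rather than plain text. A minimal dispatch sketch, assuming the response exposes a `tool_calls` list whose entries have `name` and `arguments` fields (these field names are assumptions; check the actual response struct):

```elixir
case response do
  %{tool_calls: [call | _]} ->
    # `name`/`arguments` keys are assumed; decode and run the
    # matching function, then send the result back in a follow-up turn
    IO.inspect({call.name, call.arguments}, label: "model requested")

  _ ->
    IO.puts(response.content)
end
```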