Mistral (Mistral v0.1.0)

Client for the Mistral AI API.

This library provides a simple and convenient way to integrate with Mistral's API, allowing you to use their powerful language models in your Elixir applications.
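
A minimal end-to-end sketch using the functions documented below (init/2 and chat/2); the API key and model name are placeholders:

client = Mistral.init("your-api-key")

{:ok, response} =
  Mistral.chat(client,
    model: "mistral-small-latest",
    messages: [%{role: "user", content: "Write a haiku."}]
  )

# Extract the completion text from the first choice.
get_in(response, ["choices", Access.at(0), "message", "content"])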

Summary

Types

Client struct

Client response

Functions

Chat with a Mistral model. Send messages and get a completion from the model.

Generate embeddings for the given input using a specified model.

Creates a new Mistral API client using the API key set in your application's config.

Creates a new Mistral API client with the given API key.

Types

client()

@type client() :: %Mistral{req: Req.Request.t()}

Client struct

response()

@type response() :: {:ok, map() | Enumerable.t() | Task.t()} | {:error, term()}

Client response

Functions

chat(client, params)

@spec chat(
  client(),
  keyword()
) :: response()

Chat with a Mistral model. Send messages and get a completion from the model.

Options

  • :model - The model to use for generating the response (required)
  • :messages - List of messages in the conversation (required)
  • :temperature - Controls randomness (0.0 to 1.0)
  • :max_tokens - Maximum number of tokens to generate
  • :stream - When true, returns a stream of partial response chunks
  • :tools - List of tools available to the model. Each tool must be a map with:
    • :type set to "function"
    • :function containing function details
  • :tool_choice - Controls whether and how the model selects tools (e.g. "auto", as used in the tool example below)

Examples

iex> Mistral.chat(client, [
...>   model: "mistral-small-latest",
...>   messages: [
...>     %{role: "user", content: "Write a haiku."}
...>   ]
...> ])
{:ok, %{"choices" => [%{"message" => %{"content" => "Nature's whisper soft..."}}], ...}}

# Stream the response
iex> {:ok, stream} = Mistral.chat(client, [
...>   model: "mistral-small-latest",
...>   messages: [%{role: "user", content: "Write a haiku."}],
...>   stream: true
...> ])
iex> Enum.to_list(stream)
[%{"choices" => [%{"delta" => ...}]}, ...]

Tool Example

iex> Mistral.chat(client, [
...>   model: "mistral-large-latest",
...>   messages: [%{role: "user", content: "What is the weather?"}],
...>   tools: [
...>     %{
...>       type: "function",
...>       function: %{
...>         name: "get_weather",
...>         description: "Fetches current weather",
...>         parameters: %{type: "object", properties: %{}}
...>       }
...>     }
...>   ],
...>   tool_choice: "auto"
...> ])
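
When the model opts to call a tool, the assistant message carries "tool_calls" instead of plain content. A minimal sketch of dispatching on that, assuming the response follows Mistral's chat completion format; get_weather/1 is a hypothetical local function and Jason is assumed to be available for decoding the JSON-encoded arguments:

# Assuming {:ok, response} was returned by the chat/2 call above.
case get_in(response, ["choices", Access.at(0), "message", "tool_calls"]) do
  nil ->
    # No tool call was made; use the assistant's content directly.
    get_in(response, ["choices", Access.at(0), "message", "content"])

  [%{"function" => %{"name" => "get_weather", "arguments" => args}} | _] ->
    # Decode the JSON-encoded arguments and call the hypothetical local function.
    args |> Jason.decode!() |> get_weather()
end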

embed(client, params)

@spec embed(
  client(),
  keyword()
) :: response()

Generate embeddings for the given input using a specified model.

Options

  • :model - The model to use for generating embeddings (default: "mistral-embed")
  • :input - Text or list of texts to generate embeddings for

Examples

iex> Mistral.embed(client, input: "Hello, world!")
{:ok, %{"data" => [%{"embedding" => [...]}]}}

iex> Mistral.embed(client, input: ["First text", "Second text"])
{:ok, %{"data" => [%{"embedding" => [...]}, %{"embedding" => [...]}]}}

init()

@spec init() :: client()

Creates a new Mistral API client using the API key set in your application's config.

config :mistral, :api_key, "your-api-key"

If given, a keyword list of options will be passed to Req.new/1.
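
For example, with the API key configured as above (receive_timeout is just an illustrative Req.new/1 option; any Req option can be given):

client = Mistral.init()
client = Mistral.init(receive_timeout: 60_000)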

init(opts)

@spec init(keyword()) :: client()

init(api_key, opts \\ [])

@spec init(
  String.t(),
  keyword()
) :: client()

Creates a new Mistral API client with the given API key.

Optionally, a keyword list of options can be passed through to Req.new/1.
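
For example (receive_timeout is just an illustrative Req.new/1 option):

client = Mistral.init("your-api-key")
client = Mistral.init("your-api-key", receive_timeout: 60_000)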