MistralClient.API.Batch (mistralex_ai v0.1.0)
Batch API for processing multiple requests asynchronously.
The Batch API lets you submit multiple requests for processing in the background. This is useful for handling large volumes of data without blocking your application.
Supported Endpoints
/v1/chat/completions - Chat completions
/v1/embeddings - Text embeddings
/v1/fim/completions - Fill-in-the-middle completions
/v1/moderations - Content moderation
/v1/chat/moderations - Chat moderation
Example
# Create a batch job
{:ok, job} = MistralClient.API.Batch.create(client, %{
  input_files: ["file-abc123"],
  endpoint: "/v1/chat/completions",
  model: "mistral-large-latest",
  metadata: %{"description" => "Customer support batch"}
})
# Monitor progress
{:ok, updated_job} = MistralClient.API.Batch.get(client, job.id)
IO.puts("Status: #{updated_job.status}, Progress: #{updated_job.completed_requests}/#{updated_job.total_requests}")
# List all batch jobs
{:ok, jobs} = MistralClient.API.Batch.list(client, %{status: ["RUNNING", "QUEUED"]})
Summary
Functions
Cancel a running batch job.
Create a new batch job for processing multiple requests.
Get details of a specific batch job by ID.
List batch jobs with optional filtering and pagination.
Functions
cancel(client, job_id)
@spec cancel(MistralClient.Client.t(), String.t()) :: {:ok, MistralClient.Models.BatchJobOut.t()} | {:error, any()}
Cancel a running batch job.
Parameters
client - The MistralClient.Client instance
job_id - The batch job ID to cancel
Returns
{:ok, %BatchJobOut{}} - Updated batch job with cancellation status
{:error, reason} - Error details
Example
{:ok, job} = MistralClient.API.Batch.cancel(client, "batch_abc123")
# job.status will be "CANCELLATION_REQUESTED" or "CANCELLED"
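If you want to branch on the outcome instead of matching {:ok, job} directly, a minimal sketch using only cancel/2 and the status values shown above:

# Status names mirror the comment above; adjust if your API version reports others.
case MistralClient.API.Batch.cancel(client, "batch_abc123") do
  {:ok, %{status: "CANCELLED"}} ->
    IO.puts("Job cancelled immediately")

  {:ok, %{status: "CANCELLATION_REQUESTED"}} ->
    IO.puts("Cancellation requested; the job will stop shortly")

  {:error, reason} ->
    IO.inspect(reason, label: "Failed to cancel batch job")
end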
create(client, request)
@spec create(MistralClient.Client.t(), map() | MistralClient.Models.BatchJobIn.t()) :: {:ok, MistralClient.Models.BatchJobOut.t()} | {:error, any()}
Create a new batch job for processing multiple requests.
Parameters
client - The MistralClient.Client instance
request - BatchJobIn struct or map with:
  :input_files - List of file IDs to process (required)
  :endpoint - API endpoint to use (required)
  :model - Model to use for processing (required)
  :metadata - Optional metadata map
  :timeout_hours - Timeout in hours (default: 24)
Returns
{:ok, %BatchJobOut{}} - Created batch job
{:error, reason} - Error details
Example
{:ok, job} = MistralClient.API.Batch.create(client, %{
  input_files: ["file-abc123", "file-def456"],
  endpoint: "/v1/chat/completions",
  model: "mistral-large-latest",
  metadata: %{"project" => "customer-support"},
  timeout_hours: 48
})
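The spec also accepts a MistralClient.Models.BatchJobIn struct. A sketch of the struct form, assuming its fields mirror the documented map keys (:input_files, :endpoint, :model, :metadata, :timeout_hours):

# Assumption: BatchJobIn exposes the same fields as the map form above.
request = %MistralClient.Models.BatchJobIn{
  input_files: ["file-abc123"],
  endpoint: "/v1/embeddings",
  model: "mistral-embed",
  timeout_hours: 24
}

{:ok, job} = MistralClient.API.Batch.create(client, request)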
get(client, job_id)
@spec get(MistralClient.Client.t(), String.t()) :: {:ok, MistralClient.Models.BatchJobOut.t()} | {:error, any()}
Get details of a specific batch job by ID.
Parameters
client - The MistralClient.Client instance
job_id - The batch job ID
Returns
{:ok, %BatchJobOut{}} - Batch job details
{:error, reason} - Error details
Example
{:ok, job} = MistralClient.API.Batch.get(client, "batch_abc123")
IO.puts("Status: #{job.status}")
IO.puts("Progress: #{job.completed_requests}/#{job.total_requests}")
list(client, params)
@spec list(MistralClient.Client.t(), map()) :: {:ok, MistralClient.Models.BatchJobsOut.t()} | {:error, any()}
List batch jobs with optional filtering and pagination.
Parameters
client - The MistralClient.Client instance
params - Optional parameters map:
  :page - Page number (default: 0)
  :page_size - Number of jobs per page (default: 100)
  :model - Filter by model name
  :metadata - Filter by metadata
  :created_after - Filter by creation date (DateTime)
  :created_by_me - Filter by ownership (boolean, default: false)
  :status - Filter by status list (e.g., ["RUNNING", "QUEUED"])
Returns
{:ok, %BatchJobsOut{}} - List of batch jobs
{:error, reason} - Error details
Example
{:ok, jobs} = MistralClient.API.Batch.list(client, %{
  page: 0,
  page_size: 50,
  status: ["RUNNING", "QUEUED"],
  model: "mistral-large-latest"
})
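To collect jobs across every page, a sketch that increments :page until an empty page comes back; it assumes %BatchJobsOut{} exposes each page's jobs under a data field:

defmodule BatchPager do
  # Assumption: jobs.data is the list of jobs returned for each page.
  def all_jobs(client, page \\ 0, acc \\ []) do
    {:ok, jobs} = MistralClient.API.Batch.list(client, %{page: page, page_size: 100})

    case jobs.data do
      [] -> Enum.reverse(acc)
      batch -> all_jobs(client, page + 1, Enum.reverse(batch, acc))
    end
  end
end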