Top-level public API for the HuggingFace Elixir client.
All inference functions accept either a `HuggingfaceClient.Client` struct (for
shared defaults across calls) or a plain keyword list / map of per-call options.
## Quick start

```elixir
# Build a reusable client
client = HuggingfaceClient.client("hf_your_token")

# Chat completion
{:ok, resp} =
  HuggingfaceClient.chat_completion(client, %{
    model: "meta-llama/Llama-3-8B-Instruct",
    messages: [%{role: "user", content: "Tell me a joke."}]
  })

# Streaming
{:ok, stream} =
  HuggingfaceClient.chat_completion_stream(client, %{
    model: "meta-llama/Llama-3-8B-Instruct",
    messages: [%{role: "user", content: "Count to five."}]
  })

HuggingfaceClient.StreamHelpers.each_content(stream, &IO.write/1)

# Text-to-image
{:ok, png_bytes} =
  HuggingfaceClient.text_to_image(client, %{
    model: "stabilityai/stable-diffusion-xl-base-1.0",
    inputs: "A cat wearing a space suit"
  })
```
## Summary

### Functions

- `automatic_speech_recognition/2` - Transcribes audio. Pass raw audio bytes as `:inputs`.
- `chat_completion/2` - Sends a chat-completion request. Returns `{:ok, response_map}` or `{:error, exception}`.
- `chat_completion_stream/2` - Sends a streaming chat-completion request.
- `client/2` - Creates a reusable `Client` struct.
- `endpoint_client/3` - Creates a `Client` bound to a specific inference `endpoint_url`.
- `feature_extraction/2` - Extracts feature vectors / embeddings.
- `image_classification/2` - Classifies images.
- `sentence_similarity/2` - Sentence similarity / bi-encoder scoring.
- `summarization/2` - Summarises text.
- `text_generation/2` - Text generation (non-chat).
- `text_to_image/2` - Generates an image from a text prompt. Returns `{:ok, binary}` (PNG/JPEG bytes).
- `translation/2` - Translates text.
- `zero_shot_classification/2` - Classifies text into provided `candidate_labels`.
## Functions
### automatic_speech_recognition/2

```
@spec automatic_speech_recognition(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, map()} | {:error, Exception.t()}
```

Transcribes audio. Pass raw audio bytes as `:inputs`.
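A minimal sketch of transcribing a local file, reusing the `client` from the Quick start. The model name and the `"text"` key in the response are assumptions based on common Inference API conventions; check your provider's docs:

```elixir
# Read raw audio bytes and pass them as :inputs.
audio = File.read!("sample.flac")

# "openai/whisper-large-v3" is an illustrative model choice.
{:ok, %{"text" => transcript}} =
  HuggingfaceClient.automatic_speech_recognition(client, %{
    model: "openai/whisper-large-v3",
    inputs: audio
  })

IO.puts(transcript)
```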
### chat_completion/2

```
@spec chat_completion(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, map()} | {:error, Exception.t()}
```

Sends a chat-completion request. Returns `{:ok, response_map}` or `{:error, exception}`.
### chat_completion_stream/2

```
@spec chat_completion_stream(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, Enumerable.t()} | {:error, Exception.t()}
```

Sends a streaming chat-completion request.

Returns `{:ok, stream}` where `stream` is a lazy enumerable of decoded
`chat.completion.chunk` maps. Use `HuggingfaceClient.StreamHelpers` to consume it.
### client/2

```
@spec client(String.t() | nil, keyword()) :: HuggingfaceClient.Client.t()
```

Creates a reusable `Client` struct.

#### Options

- `:provider` - inference provider, e.g. `"groq"`, `"together"`. Defaults to auto-routing.
- `:bill_to` - org slug billed for router requests.
- `:endpoint_url` - custom inference endpoint URL.
- `:retry_on_503` - whether to retry once on 503 (default `true`).
- `:req_opts` - extra keyword opts forwarded to `Req`.

Raises `InputError` on unknown or mistyped options.
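For example, pinning a provider, disabling the 503 retry, and forwarding a longer receive timeout to `Req` (the option values here are illustrative, not recommendations):

```elixir
client =
  HuggingfaceClient.client("hf_your_token",
    provider: "together",
    retry_on_503: false,
    req_opts: [receive_timeout: 60_000]
  )
```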
### endpoint_client/3

```
@spec endpoint_client(String.t() | nil, String.t(), keyword()) ::
        HuggingfaceClient.Client.t()
```

Creates a `Client` bound to a specific inference `endpoint_url`.
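A sketch of binding a client to a dedicated Inference Endpoint. The URL is a placeholder, and the assumption that `:model` can then be omitted from requests follows from the endpoint being model-specific:

```elixir
client =
  HuggingfaceClient.endpoint_client(
    "hf_your_token",
    "https://my-endpoint.us-east-1.aws.endpoints.huggingface.cloud"
  )

# Requests go to the bound endpoint, which serves a fixed model.
{:ok, resp} =
  HuggingfaceClient.chat_completion(client, %{
    messages: [%{role: "user", content: "Hello!"}]
  })
```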
### feature_extraction/2

```
@spec feature_extraction(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, list()} | {:error, Exception.t()}
```

Extracts feature vectors / embeddings.
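A sketch assuming the same `model`/`:inputs` payload shape the Quick start uses; the model name is illustrative and the nesting of the returned list may vary by model:

```elixir
{:ok, embedding} =
  HuggingfaceClient.feature_extraction(client, %{
    model: "sentence-transformers/all-MiniLM-L6-v2",
    inputs: "Elixir is a dynamic, functional language."
  })

# Typically a (possibly nested) list of floats.
IO.inspect(length(List.flatten(embedding)))
```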
### image_classification/2

```
@spec image_classification(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, list()} | {:error, Exception.t()}
```

Classifies images.
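A sketch passing raw image bytes as `:inputs`, mirroring the audio tasks. The model name and the `"label"`/`"score"` response keys are assumptions based on common Inference API output:

```elixir
image = File.read!("cat.jpg")

{:ok, labels} =
  HuggingfaceClient.image_classification(client, %{
    model: "google/vit-base-patch16-224",
    inputs: image
  })

# Typically a list of %{"label" => ..., "score" => ...} maps.
Enum.each(labels, fn %{"label" => l, "score" => s} ->
  IO.puts("#{l}: #{s}")
end)
```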
### sentence_similarity/2

```
@spec sentence_similarity(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, list()} | {:error, Exception.t()}
```

Sentence similarity / bi-encoder scoring.
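A sketch assuming the Inference API's `source_sentence`/`sentences` input shape for this task (an assumption; verify against the task docs). The model name is illustrative:

```elixir
{:ok, scores} =
  HuggingfaceClient.sentence_similarity(client, %{
    model: "sentence-transformers/all-MiniLM-L6-v2",
    inputs: %{
      source_sentence: "The weather is lovely today.",
      sentences: ["It is sunny outside.", "The stock market crashed."]
    }
  })

# One similarity score per candidate sentence.
IO.inspect(scores)
```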
### summarization/2

```
@spec summarization(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, map()} | {:error, Exception.t()}
```

Summarises text.
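A sketch using the `model`/`:inputs` shape from the Quick start; the model name and the `"summary_text"` response key are assumptions based on the usual summarization task output:

```elixir
article = """
Elixir is a dynamic, functional language for building scalable applications.
It runs on the Erlang VM, known for low-latency, distributed, fault-tolerant
systems.
"""

{:ok, %{"summary_text" => summary}} =
  HuggingfaceClient.summarization(client, %{
    model: "facebook/bart-large-cnn",
    inputs: article
  })

IO.puts(summary)
```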
### text_generation/2

```
@spec text_generation(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, map()} | {:error, Exception.t()}
```

Text generation (non-chat).
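A sketch of raw (non-chat) completion. The model name is illustrative, and nesting generation options under `:parameters` is an assumption based on the usual text-generation payload shape:

```elixir
{:ok, resp} =
  HuggingfaceClient.text_generation(client, %{
    model: "HuggingFaceH4/zephyr-7b-beta",
    inputs: "Once upon a time",
    parameters: %{max_new_tokens: 50}
  })

IO.inspect(resp)
```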
### text_to_image/2

```
@spec text_to_image(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, binary()} | {:error, Exception.t()}
```

Generates an image from a text prompt. Returns `{:ok, binary}` (PNG/JPEG bytes).
### translation/2

```
@spec translation(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, map()} | {:error, Exception.t()}
```

Translates text.
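A sketch with a language-pair model. The model name and the `"translation_text"` response key are assumptions based on the usual translation task output:

```elixir
{:ok, %{"translation_text" => french}} =
  HuggingfaceClient.translation(client, %{
    model: "Helsinki-NLP/opus-mt-en-fr",
    inputs: "Hello, world!"
  })

IO.puts(french)
```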
### zero_shot_classification/2

```
@spec zero_shot_classification(HuggingfaceClient.Client.t() | map(), map()) ::
        {:ok, list()} | {:error, Exception.t()}
```

Classifies text into provided `candidate_labels`.
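A sketch passing `candidate_labels` under `:parameters`; both that placement and the model name are assumptions — consult the zero-shot task docs for the exact payload shape:

```elixir
{:ok, result} =
  HuggingfaceClient.zero_shot_classification(client, %{
    model: "facebook/bart-large-mnli",
    inputs: "I just bought a new laptop and it is amazing.",
    parameters: %{candidate_labels: ["technology", "sports", "politics"]}
  })

# Typically labels ranked with per-label scores.
IO.inspect(result)
```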