HuggingfaceClient.Hub.Mixins (huggingface_client v0.1.0)


HuggingFace Hub Integration Mixins.

Provides patterns for integrating any Elixir ML library with the Hub, mirroring the Python ModelHubMixin class.

See: https://huggingface.co/docs/huggingface_hub/guides/integrations

There are two main patterns:

  1. Helpers approach — call push_to_hub/3 and load_from_hub/2 directly
  2. Mixin approach — use HuggingfaceClient.Hub.Mixins in your struct module

Pattern 1: Direct helpers

# Save and upload model to Hub
HuggingfaceClient.push_to_hub(my_model_weights, "my-org/my-model",
  config: %{model_type: "custom", num_layers: 12},
  commit_message: "Add initial model weights",
  access_token: token
)

# Load model from Hub
{:ok, weights, config} = HuggingfaceClient.load_from_hub("my-org/my-model",
  access_token: token
)

Pattern 2: Mixin in your module

defmodule MyModel do
  use HuggingfaceClient.Hub.Mixins,
    weights_file: "model.safetensors",
    config_file: "config.json"

  defstruct [:weights, :config]
end

# Now your model has push_to_hub and from_pretrained
{:ok, model} = MyModel.from_pretrained("my-org/my-model")
MyModel.push_to_hub(model, "my-org/my-model")
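
Conceptually, the mixin is a thin wrapper over the direct helpers. The sketch below illustrates what `use HuggingfaceClient.Hub.Mixins` could generate; the injected function bodies are illustrative, not the library's actual implementation, and only use the documented helper options:

```elixir
defmodule HuggingfaceClient.Hub.Mixins do
  defmacro __using__(opts) do
    # Read the per-module file names given at `use` time, with the
    # same defaults documented for the direct helpers.
    weights_file = Keyword.get(opts, :weights_file, "model.safetensors")
    config_file = Keyword.get(opts, :config_file, "config.json")

    quote do
      # Delegates to the direct helper, wiring in the configured files.
      def from_pretrained(repo_id, opts \\ []) do
        HuggingfaceClient.load_from_hub(
          repo_id,
          Keyword.merge(
            [weights_filename: unquote(weights_file),
             config_filename: unquote(config_file)],
            opts
          )
        )
      end

      # Uploads this struct's weights and config via the direct helper.
      def push_to_hub(%__MODULE__{weights: weights, config: config}, repo_id, opts \\ []) do
        HuggingfaceClient.push_to_hub(
          weights,
          repo_id,
          Keyword.merge(
            [config: config, weights_filename: unquote(weights_file)],
            opts
          )
        )
      end
    end
  end
end
```

The design choice mirrors Python's ModelHubMixin: the module-level options bind the file layout once, so call sites only supply the repo id and per-call options.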

Summary

Functions

load_from_hub(repo_id, opts \\ []) — Downloads a model from the Hub.

push_model_card(repo_id, opts) — Pushes a model card (README.md) to the Hub.

push_to_hub(weights, repo_id, opts \\ []) — Uploads model files to a Hub repository.

Functions

load_from_hub(repo_id, opts \\ [])

@spec load_from_hub(
  String.t(),
  keyword()
) :: {:ok, map()} | {:error, Exception.t()}

Downloads a model from the Hub.

This is the Elixir equivalent of Python's from_pretrained(). Downloads weights and config, then calls your optional loader function.

Options

  • :weights_filename — filename to download (default: "model.safetensors")
  • :config_filename — config filename (default: "config.json")
  • :revision — branch/commit (default: "main")
  • :loader — function (weights_path, config) -> {:ok, model} for custom loading
  • :access_token — Hub token used to authenticate the request

Returns

{:ok, %{weights_path: path, config: config_map}}, or {:ok, model} when a :loader is given

Example

# Basic download
{:ok, %{weights_path: path, config: cfg}} =
  HuggingfaceClient.load_from_hub("my-org/my-model", access_token: token)

# With custom loader
{:ok, model} = HuggingfaceClient.load_from_hub("my-org/my-model",
  loader: fn weights_path, config ->
    {:ok, MyModel.load(weights_path, config)}
  end,
  access_token: token
)

push_model_card(repo_id, opts)

@spec push_model_card(
  String.t(),
  keyword()
) :: :ok | {:error, Exception.t()}

Pushes a model card (README.md) to the Hub.

Options

  • :content — full README content (required)
  • :commit_message — commit message
  • :access_token — Hub token used to authenticate the request

Example

:ok = HuggingfaceClient.push_model_card("my-org/my-model",
  content: """
  ---
  language: en
  license: apache-2.0
  tags:
    - text-classification
  ---
  # My Model
  A great text classification model.
  """,
  access_token: token
)

push_to_hub(weights, repo_id, opts \\ [])

@spec push_to_hub(binary() | nil, String.t(), keyword()) ::
  :ok | {:error, Exception.t()}

Uploads model files to a Hub repository.

This is the Elixir equivalent of Python's push_to_hub(). Creates the repo if it doesn't exist.

Options

  • :config — map of model config to save as config.json
  • :weights_filename — filename for weights (default: "model.safetensors")
  • :weights — raw binary weights (if not using :files)
  • :files — list of {dest_path, content} tuples to upload
  • :commit_message — git commit message
  • :revision — target branch (default: "main")
  • :private — create as private repo (default: false)
  • :exist_ok — don't fail if repo exists (default: true)
  • :access_token — Hub token used to authenticate the request

Example

# Upload weights + config
:ok = HuggingfaceClient.push_to_hub(weights_binary, "my-org/my-bert",
  config: %{
    model_type: "bert",
    num_hidden_layers: 12,
    hidden_size: 768
  },
  commit_message: "Add BERT-base weights",
  access_token: token
)

# Upload multiple files
:ok = HuggingfaceClient.push_to_hub(nil, "my-org/my-model",
  files: [
    {"model.safetensors", File.read!("model.safetensors")},
    {"config.json", Jason.encode!(config)},
    {"tokenizer.json", File.read!("tokenizer.json")},
    {"README.md", "# My Model\n\nA great model."},
  ],
  access_token: token
)