Fetches metadata for a model from the Hugging Face Hub and displays it, including all available inference providers.
$ mix hf.model_info meta-llama/Llama-3.1-8B-Instruct
Model: meta-llama/Llama-3.1-8B-Instruct
Task: text-generation
Library: transformers
Downloads: 1_234_567
Likes: 8_901
Gated: false
Private: false
Available providers (status: live)
─────────────────────────────────
groq       conversational   llama-3.1-8b-instant
together   conversational   meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo
nebius     conversational   meta-llama/Meta-Llama-3.1-8B-Instruct
...
Options
  --token TOKEN   Hugging Face access token (or set the HF_TOKEN env var)
  --json          Output as JSON
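Under the hood, data like the above comes from the Hub's public model API. The following is a minimal Python sketch, not the task's actual Elixir implementation; it assumes the `https://huggingface.co/api/models/{repo_id}` endpoint with the `expand[]=inferenceProviderMapping` query parameter, and that the mapping is a dict of provider name to `{status, task, providerId}` entries:

```python
import json
import os
import urllib.request

# Assumption: public Hub API base URL.
HUB_API = "https://huggingface.co/api/models"


def fetch_model_info(repo_id, token=None):
    """Fetch raw model metadata, including the inference provider mapping.

    A token (or the HF_TOKEN env var) is only needed for gated/private repos.
    """
    url = f"{HUB_API}/{repo_id}?expand[]=inferenceProviderMapping"
    req = urllib.request.Request(url)
    token = token or os.environ.get("HF_TOKEN")
    if token:
        req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def live_providers(payload):
    """Return (provider, task, provider_model_id) triples with status "live".

    Assumes `inferenceProviderMapping` maps provider names to entries with
    "status", "task", and "providerId" keys.
    """
    mapping = payload.get("inferenceProviderMapping") or {}
    return [
        (name, entry.get("task"), entry.get("providerId"))
        for name, entry in mapping.items()
        if entry.get("status") == "live"
    ]
```

With a fetched payload, `live_providers(payload)` yields the rows shown in the "Available providers" table above; providers whose status is not "live" (e.g. "staging") are filtered out.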