HuggingFace MCP Client — Model Context Protocol integration.
Enables LLMs to interact with external tools via MCP (Model Context Protocol).
This mirrors the huggingface_hub.MCPClient from Python (new in v1.7).
MCP clients can connect to:
- Remote HTTP/SSE servers — web APIs exposed as MCP tools
- Local stdio scripts — local programs that expose MCP tools
See: https://huggingface.co/docs/huggingface_hub/package_reference/mcp
See: https://modelcontextprotocol.io
Example
# Create an MCP client with Groq as the LLM backend
mcp = HuggingfaceClient.mcp_client(
  model: "qwen/qwen3-32b",
  provider: "groq",
  api_key: "hf_..."
)

# Add MCP tool servers
mcp = HuggingfaceClient.MCP.add_server(mcp,
  name: "brave_search",
  url: "https://brave-search-mcp.example.com/sse",
  type: :sse
)

# Run an agentic loop with the tools
{:ok, result} = HuggingfaceClient.MCP.run(mcp,
  messages: [%{"role" => "user", "content" => "Search for recent AI news"}],
  max_steps: 5
)

IO.puts(result.content)
Summary
Functions
Adds an MCP tool server to the client.
Lists all connected MCP servers.
Lists all tools available from connected MCP servers.
Creates a new MCP client.
Removes an MCP server by name.
Runs an agentic loop: the LLM decides which tools to call, executes them via MCP,
and continues until it has a final answer or hits max_steps.
Returns the MCP server URL for a HuggingFace Space.
Types
Functions
Adds an MCP tool server to the client.
Options
:name — server name (required)
:url — server URL for HTTP/SSE servers (required unless :command is set)
:type — connection type: :sse, :http, or :stdio (default: :sse)
:command — command list for stdio servers (e.g. ["python", "my_tools.py"])
:headers — additional HTTP headers
Examples
# SSE server (remote)
mcp = HuggingfaceClient.MCP.add_server(mcp,
  name: "brave_search",
  url: "https://brave-search-mcp.example.com/sse",
  type: :sse
)

# HTTP server
mcp = HuggingfaceClient.MCP.add_server(mcp,
  name: "my_api",
  url: "https://my-mcp-server.com/mcp",
  type: :http
)

# Local stdio server
mcp = HuggingfaceClient.MCP.add_server(mcp,
  name: "local_tools",
  type: :stdio,
  command: ["python", "tools/mcp_server.py"]
)
Lists all connected MCP servers.
@spec list_tools(t()) :: {:ok, map()} | {:error, Exception.t()}
Lists all tools available from connected MCP servers.
Returns a map of server_name => [tool_info].
Example
{:ok, tools} = HuggingfaceClient.MCP.list_tools(mcp)

Enum.each(tools, fn {server, tool_list} ->
  IO.puts("#{server}: #{length(tool_list)} tools")
  Enum.each(tool_list, fn t -> IO.puts("  - #{t["name"]}") end)
end)
Creates a new MCP client.
Options
:model — LLM model ID (e.g. "Qwen/Qwen3-32B")
:provider — inference provider (e.g. "groq", "together")
:api_key — API key / HF token
:base_url — custom inference endpoint URL
:timeout — request timeout in ms (default: 120_000)
Example
mcp = HuggingfaceClient.MCP.new(
  model: "Qwen/Qwen3-32B",
  provider: "groq",
  api_key: "hf_..."
)
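The documented options can also be combined to target a self-hosted inference endpoint. A sketch (the endpoint URL is illustrative; :base_url and :timeout are the options listed above):

```elixir
# Sketch: a client pointed at a custom endpoint with a longer timeout.
# "https://my-endpoint.example.com/v1" is a placeholder, not a real service.
mcp = HuggingfaceClient.MCP.new(
  model: "Qwen/Qwen3-32B",
  base_url: "https://my-endpoint.example.com/v1",
  api_key: "hf_...",
  timeout: 300_000
)
```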
Removes an MCP server by name.
@spec run(t(), keyword()) :: {:ok, map()} | {:error, Exception.t()}
Runs an agentic loop: the LLM decides which tools to call, executes them via MCP,
and continues until it has a final answer or hits max_steps.
Options
:messages — list of chat messages (required)
:max_steps — maximum agentic steps (default: 10)
:system_prompt — system prompt for the agent
Returns
{:ok, %{content: final_answer, steps: step_count, tool_calls: [...]}} or {:error, ...}.
Example
{:ok, result} = HuggingfaceClient.MCP.run(mcp,
  messages: [%{"role" => "user", "content" => "What is 2+2?"}],
  max_steps: 5
)

IO.puts(result.content)
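The loop described above (model decides, tools execute, results feed back in) can be sketched as follows. This is a hypothetical shape, not the library's internals: `complete` stands in for one chat-completion call, and `exec_tool` for dispatching a tool call to the MCP server that owns it.

```elixir
defmodule AgentLoopSketch do
  # Minimal agentic loop: call the model, run any requested tools,
  # append their results as tool messages, and repeat until the model
  # answers without tool calls or max_steps is reached.
  def run(complete, exec_tool, messages, max_steps) do
    loop(complete, exec_tool, messages, 0, max_steps)
  end

  defp loop(_complete, _exec_tool, _messages, step, max_steps)
       when step >= max_steps do
    {:error, :max_steps_reached}
  end

  defp loop(complete, exec_tool, messages, step, max_steps) do
    case complete.(messages) do
      # The model requested tools: execute each call and continue.
      %{"tool_calls" => calls} when calls != [] ->
        results =
          Enum.map(calls, fn call ->
            %{"role" => "tool", "content" => exec_tool.(call)}
          end)

        loop(complete, exec_tool, messages ++ results, step + 1, max_steps)

      # No tool calls: the content is the final answer.
      %{"content" => answer} ->
        {:ok, %{content: answer, steps: step}}
    end
  end
end
```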
Returns the MCP server URL for a HuggingFace Space.
Spaces can expose tools via the MCP protocol when they have an MCP endpoint.
Example
url = HuggingfaceClient.MCP.space_url("my-org/my-mcp-space")
# "https://my-org-my-mcp-space.hf.space/sse"

mcp = HuggingfaceClient.MCP.add_server(mcp,
  name: "my_space_tools",
  url: url,
  type: :sse
)
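The Space-id-to-URL mapping shown above can be approximated as follows. This is a sketch under an assumption: that the hf.space subdomain is the "owner/name" id lowercased with "/", "." and "_" replaced by "-"; edge cases in the real naming rules may differ.

```elixir
# Derive a Space's MCP SSE URL from its "owner/name" id.
# Assumption: subdomain = id, lowercased, with "/", "." and "_" -> "-".
space_sse_url = fn space_id ->
  subdomain =
    space_id
    |> String.downcase()
    |> String.replace(~r{[/._]}, "-")

  "https://#{subdomain}.hf.space/sse"
end

space_sse_url.("my-org/my-mcp-space")
# => "https://my-org-my-mcp-space.hf.space/sse"
```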