ExLLM.Adapters.Perplexity (ex_llm v0.5.0)

Perplexity AI API adapter for ExLLM.

Perplexity AI provides search-augmented language models that combine LLM capabilities with real-time web search. Their models include:

Search Models

  • Sonar: Lightweight, cost-effective search model
  • Sonar Pro: Advanced search with grounding for complex queries

Research Models

  • Sonar Deep Research: Expert-level research model that conducts exhaustive searches

Reasoning Models

  • Sonar Reasoning: Chain-of-thought reasoning with web search
  • Sonar Reasoning Pro: Premier reasoning model powered by DeepSeek R1

Standard Models

  • Various Llama, CodeLlama, and Mistral models without search capabilities

Configuration

This adapter requires a Perplexity API key; a default model and a base URL can optionally be configured.

Using Environment Variables

# Set environment variables
export PERPLEXITY_API_KEY="pplx-your-api-key"
export PERPLEXITY_MODEL="perplexity/sonar-pro"  # optional
export PERPLEXITY_API_BASE="https://api.perplexity.ai"  # optional

# Use with default environment provider
ExLLM.Adapters.Perplexity.chat(messages, config_provider: ExLLM.ConfigProvider.Env)

Using Static Configuration

config = %{
  perplexity: %{
    api_key: "pplx-your-api-key",
    model: "perplexity/sonar-pro",
    base_url: "https://api.perplexity.ai"  # optional
  }
}
{:ok, provider} = ExLLM.ConfigProvider.Static.start_link(config)
ExLLM.Adapters.Perplexity.chat(messages, config_provider: provider)

Example Usage

messages = [
  %{role: "user", content: "What's the latest news in AI research?"}
]

# Simple search-augmented chat
{:ok, response} = ExLLM.Adapters.Perplexity.chat(messages, model: "perplexity/sonar-pro")
IO.puts(response.content)

# Academic search mode
{:ok, response} = ExLLM.Adapters.Perplexity.chat(messages, 
  model: "perplexity/sonar-pro",
  search_mode: "academic",
  web_search_options: %{search_context_size: "medium"}
)

# Deep research with high reasoning effort
{:ok, response} = ExLLM.Adapters.Perplexity.chat(messages,
  model: "perplexity/sonar-deep-research", 
  reasoning_effort: "high"
)

# Streaming search results
{:ok, stream} = ExLLM.Adapters.Perplexity.stream_chat(messages,
  model: "perplexity/sonar",
  search_mode: "news"
)
for chunk <- stream do
  if chunk.content, do: IO.write(chunk.content)
end
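
# The streamed chunks can also be collected into a single string for
# non-interactive use. A minimal sketch, assuming each chunk exposes the
# same `content` field used above (nil for chunks without text):
{:ok, stream} = ExLLM.Adapters.Perplexity.stream_chat(messages, model: "perplexity/sonar")

full_text =
  stream
  |> Stream.map(& &1.content)
  |> Enum.reject(&is_nil/1)
  |> Enum.join()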

Summary

Functions

  • supports_reasoning?(model_id): Checks if a model supports reasoning capabilities.
  • supports_web_search?(model_id): Checks if a model supports web search capabilities.
  • validate_image_filters(filters): Validates image filter parameters (domain or format filters).
  • validate_reasoning_effort(effort): Validates the reasoning_effort parameter.
  • validate_search_mode(mode): Validates the search_mode parameter.

Functions

supports_reasoning?(model_id)

@spec supports_reasoning?(String.t()) :: boolean()

Checks if a model supports reasoning capabilities.
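
A hedged usage sketch; the model IDs below are illustrative assumptions patterned on the model list above, and the results are expectations rather than verified output:

ExLLM.Adapters.Perplexity.supports_reasoning?("perplexity/sonar-reasoning-pro")
#=> expected to return true for the reasoning models listed above

ExLLM.Adapters.Perplexity.supports_reasoning?("perplexity/sonar")
#=> expected to return false for the plain search models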

supports_web_search?(model_id)

@spec supports_web_search?(String.t()) :: boolean()

Checks if a model supports web search capabilities.
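
A hedged usage sketch; the model ID is taken from the configuration examples above, and the result is an expectation rather than verified output:

ExLLM.Adapters.Perplexity.supports_web_search?("perplexity/sonar-pro")
#=> expected to return true for the search models listed above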

validate_image_filters(filters)

@spec validate_image_filters([String.t()]) :: :ok | {:error, String.t()}

Validates image filter parameters (domain or format filters).
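
A hedged usage sketch; the filter values shown are illustrative assumptions, since the accepted domain/format syntax is not documented here:

case ExLLM.Adapters.Perplexity.validate_image_filters(["example.com", "gif"]) do
  :ok -> :proceed
  {:error, reason} -> IO.puts("Invalid image filters: " <> reason)
end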

validate_reasoning_effort(effort)

@spec validate_reasoning_effort(String.t()) :: :ok | {:error, String.t()}

Validates the reasoning_effort parameter.
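
A hedged usage sketch; "high" is taken from the Example Usage section above, while the invalid value is an arbitrary illustration:

ExLLM.Adapters.Perplexity.validate_reasoning_effort("high")
#=> :ok expected, matching the reasoning_effort: "high" example above

ExLLM.Adapters.Perplexity.validate_reasoning_effort("extreme")
#=> {:error, reason} expected, per the @spec above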

validate_search_mode(mode)

@spec validate_search_mode(String.t()) :: :ok | {:error, String.t()}

Validates the search_mode parameter.
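
A hedged usage sketch; "academic" and "news" are the modes shown in the Example Usage section above, and the invalid value is an arbitrary illustration:

ExLLM.Adapters.Perplexity.validate_search_mode("academic")
#=> :ok expected

ExLLM.Adapters.Perplexity.validate_search_mode("not-a-mode")
#=> {:error, reason} expected, per the @spec above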