ExLLM.Providers.Perplexity (ex_llm v0.8.1)
Perplexity AI API adapter for ExLLM.
Perplexity AI provides search-augmented language models that combine LLM capabilities with real-time web search. Their models include:
Search Models
- Sonar: Lightweight, cost-effective search model
- Sonar Pro: Advanced search with grounding for complex queries
Research Models
- Sonar Deep Research: Expert-level research model that conducts exhaustive searches
Reasoning Models
- Sonar Reasoning: Chain of thought reasoning with web search
- Sonar Reasoning Pro: Premier reasoning powered by DeepSeek R1
Standard Models
- Various Llama, CodeLlama, and Mistral models without search capabilities
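The model families above can be thought of as a task-to-model mapping. The sketch below illustrates this; the model ID strings follow the `"perplexity/<model>"` convention used in this page's examples, but the helper itself and the exact set of IDs are illustrative assumptions, not part of the adapter's API.

```elixir
# Hypothetical helper mapping a task to a model ID string; the IDs mirror
# the families listed above and are assumptions, not an exhaustive list.
model_for_task = fn
  :quick_search   -> "perplexity/sonar"
  :complex_search -> "perplexity/sonar-pro"
  :deep_research  -> "perplexity/sonar-deep-research"
  :reasoning      -> "perplexity/sonar-reasoning-pro"
end

model_for_task.(:complex_search)
# => "perplexity/sonar-pro"
```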
Configuration
This adapter requires a Perplexity API key and optionally a base URL.
Using Environment Variables
# Set environment variables
export PERPLEXITY_API_KEY="pplx-your-api-key"
export PERPLEXITY_MODEL="perplexity/sonar-pro" # optional
export PERPLEXITY_API_BASE="https://api.perplexity.ai" # optional
# Use with default environment provider
ExLLM.Providers.Perplexity.chat(messages, config_provider: ExLLM.Infrastructure.ConfigProvider.Env)
Using Static Configuration
config = %{
  perplexity: %{
    api_key: "pplx-your-api-key",
    model: "perplexity/sonar-pro",
    base_url: "https://api.perplexity.ai"  # optional
  }
}
{:ok, provider} = ExLLM.Infrastructure.ConfigProvider.Static.start_link(config)
ExLLM.Providers.Perplexity.chat(messages, config_provider: provider)
Example Usage
messages = [
  %{role: "user", content: "What's the latest news in AI research?"}
]
# Simple search-augmented chat
{:ok, response} = ExLLM.Providers.Perplexity.chat(messages, model: "perplexity/sonar-pro")
IO.puts(response.content)
# Academic search mode
{:ok, response} = ExLLM.Providers.Perplexity.chat(messages,
  model: "perplexity/sonar-pro",
  search_mode: "academic",
  web_search_options: %{search_context_size: "medium"}
)
# Deep research with high reasoning effort
{:ok, response} = ExLLM.Providers.Perplexity.chat(messages,
  model: "perplexity/sonar-deep-research",
  reasoning_effort: "high"
)
# Streaming search results
{:ok, stream} = ExLLM.Providers.Perplexity.stream_chat(messages,
  model: "perplexity/sonar",
  search_mode: "news"
)
for chunk <- stream do
  if chunk.content, do: IO.write(chunk.content)
end
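The loop above prints chunks as they arrive. To accumulate the streamed chunks into the full response text instead, you can reduce over the stream. A minimal sketch, assuming each chunk exposes a `:content` field (as in the loop above) that may be `nil` for non-text frames; a list of maps stands in for the real stream here:

```elixir
# Simulated chunks stand in for the real stream, which has the same shape.
chunks = [%{content: "Hello, "}, %{content: nil}, %{content: "world"}]

# Skip nil-content chunks and concatenate the rest.
full_text =
  Enum.reduce(chunks, "", fn chunk, acc ->
    if chunk.content, do: acc <> chunk.content, else: acc
  end)
# => "Hello, world"
```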
Summary
Functions
Checks if a model supports reasoning capabilities.
Checks if a model supports web search capabilities.
Validates image filter parameters (domain or format filters).
Validates reasoning_effort parameter.
Validates search_mode parameter.
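To make the validation helpers above concrete, here is a hypothetical sketch of what such parameter validation might look like. The module name, function heads, and accepted values are assumptions inferred from the summaries and the examples on this page (`"academic"`, `"news"`, `"high"`), not the adapter's actual signatures.

```elixir
# Hypothetical sketch; names and accepted values are assumptions,
# not the adapter's real API.
defmodule PerplexityParamsSketch do
  @search_modes ~w(web academic news)
  @reasoning_efforts ~w(low medium high)

  # Returns :ok for a known search mode, {:error, reason} otherwise.
  def validate_search_mode(mode) when mode in @search_modes, do: :ok
  def validate_search_mode(mode), do: {:error, "invalid search_mode: #{inspect(mode)}"}

  # Returns :ok for a known reasoning effort, {:error, reason} otherwise.
  def validate_reasoning_effort(effort) when effort in @reasoning_efforts, do: :ok
  def validate_reasoning_effort(effort), do: {:error, "invalid reasoning_effort: #{inspect(effort)}"}
end

PerplexityParamsSketch.validate_search_mode("academic")
# => :ok
```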