ExLLM.Adapters.Groq (ex_llm v0.5.0)
Groq adapter for ExLLM.
Groq provides extremely fast inference for open-source models on its custom LPU (Language Processing Unit) hardware.
Configuration
config :ex_llm,
  groq: [
    api_key: System.get_env("GROQ_API_KEY"),
    base_url: "https://api.groq.com/openai/v1" # optional
  ]
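For releases, the API key is typically read at runtime rather than at compile time. Below is a minimal sketch for config/runtime.exs, assuming the GROQ_API_KEY environment variable is set in the deployment environment:

# config/runtime.exs
import Config

config :ex_llm,
  groq: [
    # Raises at boot if GROQ_API_KEY is missing (assumes the variable
    # is provided by the deployment environment).
    api_key: System.fetch_env!("GROQ_API_KEY")
  ]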
Supported Models
- llama-3.3-70b-versatile
- llama-3.1-70b-versatile
- llama-3.1-8b-instant
- llama3-70b-8192
- llama3-8b-8192
- mixtral-8x7b-32768
- gemma2-9b-it
- gemma-7b-it
Example Usage
messages = [
  %{role: "user", content: "Hello, how are you?"}
]

{:ok, response} = ExLLM.Adapters.Groq.chat(messages, model: "llama3-70b-8192")
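To handle both success and failure, you can pattern match on the return value. The sketch below assumes the response struct exposes a content field; check the ExLLM response type for the exact shape.

messages = [
  %{role: "user", content: "Summarize the Elixir pipe operator in one sentence."}
]

case ExLLM.Adapters.Groq.chat(messages, model: "llama-3.1-8b-instant") do
  {:ok, response} ->
    # Assumes the response struct has a `content` field with the
    # generated text; adjust to the actual ExLLM response type if it differs.
    IO.puts(response.content)

  {:error, reason} ->
    IO.inspect(reason, label: "Groq request failed")
end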