API Reference hyper_llm v0.3.0
Modules
HyperLLM.Chat provides a single interface for interacting with LLM providers. The interface follows the OpenAI chat completion API (https://platform.openai.com/docs/api-reference/chat).
HyperLLM.Conversation handles the lifecycle of a conversation, including starting it, appending messages, and running it.
Determines the provider and model for a given model name.
Defines the behaviour that all provider modules must implement.
Provider implementation for Anthropic.
Provider implementation for Cloudflare.
Provider implementation for Groq.
Provider implementation for OpenAI.
Defines the behaviour that all workflow tool modules must implement.