# LLMAgent Architecture
This guide provides an overview of the LLMAgent architecture, explaining how it builds on AgentForge's signal-driven design while providing LLM-specific abstractions.
## Core Principles
LLMAgent is designed with the following principles in mind:
- LLM-Specific Abstractions: Create patterns optimized for LLM interactions
- Separation of Concerns: Clearly delineate LLM logic from infrastructure
- Elixir Ecosystem Integration: Leverage the strengths of Elixir/OTP
- Lightweight Implementation: Maintain a clean, minimal codebase
- Testability: Ensure components can be tested in isolation
## System Architecture
LLMAgent extends AgentForge's signal-driven architecture with components specifically designed for LLM interactions:
```mermaid
graph TD
    User[User Input] --> Signals
    Signals --> Flow
    Flow --> Handlers
    Handlers --> Store
    Store --> LLM[LLM Provider]
    LLM --> Store
    Store --> Tools
    Tools --> Store
    Store --> Response[Response]
```
### Core Components
- Signals: Represent events in the agent lifecycle
- Handlers: Process signals and update state
- Store: Manages conversation state and history
- Flows: Combine handlers into coherent sequences
- Providers: Interface with LLM backends
- Tools: External capabilities the agent can use
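The components above can be sketched in code. The following is an illustrative model only, assuming AgentForge-style signals represented as plain maps with `:type` and `:data` fields; the actual `LLMAgent.Signals` module may use a richer representation.

```elixir
# A minimal sketch of the core signal constructors as plain maps.
# The :type/:data shape is an assumption for illustration.
defmodule SignalSketch do
  def user_message(content), do: %{type: :user_message, data: content}
  def thinking(content), do: %{type: :thinking, data: content}
  def tool_call(name, args), do: %{type: :tool_call, data: %{name: name, args: args}}
  def tool_result(name, result), do: %{type: :tool_result, data: %{name: name, result: result}}
  def response(content), do: %{type: :response, data: content}
end

# Handlers can then dispatch on the :type field with pattern matching.
signal = SignalSketch.user_message("What is the weather in Paris?")
%{type: :user_message} = signal
```

Tagged maps like these keep every stage of the lifecycle inspectable, which is what makes the flow "explicit and traceable" as described under Design Decisions.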
## Signal Flow

The typical flow of a conversation follows these steps:

1. User input is converted to a `:user_message` signal
2. The message handler processes the user message and generates a `:thinking` signal
3. The thinking handler calls the LLM and decides whether to use a tool or generate a response
4. If using a tool, it generates a `:tool_call` signal
5. The tool handler executes the tool and generates a `:tool_result` signal
6. The tool result handler incorporates the result and generates a new `:thinking` signal
7. Eventually, the LLM generates a response, creating a `:response` signal
8. The response handler formats and returns the final response
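The steps above can be sketched as pattern-matched handler clauses. This is illustrative only: the real `LLMAgent.Handlers` call an LLM provider, whereas here the "LLM" is stubbed, the tool branch is omitted, and the `:type`/`:data` signal shape is an assumption.

```elixir
defmodule HandlerSketch do
  # Step 2: a user message is recorded in history and becomes a :thinking signal.
  def handle(%{type: :user_message, data: content}, state) do
    state = Map.update(state, :history, [{:user, content}], &[{:user, content} | &1])
    {%{type: :thinking, data: "Considering: #{content}"}, state}
  end

  # Step 7: thinking resolves directly into a :response signal (stubbed LLM).
  def handle(%{type: :thinking, data: _thought}, state) do
    {%{type: :response, data: "stubbed answer"}, state}
  end
end

{signal, state} = HandlerSketch.handle(%{type: :user_message, data: "hi"}, %{})
{%{type: :response, data: _answer}, _state} = HandlerSketch.handle(signal, state)
```

Each handler takes a signal and state and returns a new signal and state, so the whole conversation is a pure transformation over immutable data.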
## Component Diagram
Here is a detailed component diagram showing the relationships between the main modules:
```mermaid
classDiagram
    class LLMAgent.Signals {
        +user_message(content)
        +thinking(content)
        +tool_call(name, args)
        +tool_result(name, result)
        +response(content)
        +error(message, source)
    }
    class LLMAgent.Handlers {
        +message_handler(signal, state)
        +thinking_handler(signal, state)
        +tool_handler(signal, state)
        +tool_result_handler(signal, state)
        +response_handler(signal, state)
        +error_handler(signal, state)
    }
    class LLMAgent.Store {
        +new(initial_state)
        +add_message(state, role, content)
        +add_thought(state, content)
        +add_tool_call(state, name, args, result)
        +prune_thoughts(state, max_thoughts)
    }
    class LLMAgent.Flows {
        +conversation(system_prompt, tools, options)
        +task_flow(task_definition, options)
        +batch_processing(items, batch_handler, options)
        +qa_agent(system_prompt, options)
    }
    LLMAgent.Signals -- LLMAgent.Handlers
    LLMAgent.Handlers -- LLMAgent.Store
    LLMAgent.Flows -- LLMAgent.Handlers
    LLMAgent.Flows -- LLMAgent.Store
```
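A Flows-style module essentially chains handlers until a `:response` signal emerges. The sketch below is a hedged approximation of that loop, not the real `LLMAgent.Flows.conversation/3` (which builds on AgentForge flows); the signal shape and handler contract are assumptions carried over from the examples above.

```elixir
# Run a signal through a handler function repeatedly until it becomes a
# :response, with a step limit as a simple safety valve.
defmodule FlowSketch do
  def run(signal, state, handler, max_steps \\ 10)
  def run(%{type: :response} = signal, state, _handler, _max), do: {signal, state}
  def run(_signal, _state, _handler, 0), do: {:error, :too_many_steps}

  def run(signal, state, handler, max) do
    {next_signal, next_state} = handler.(signal, state)
    run(next_signal, next_state, handler, max - 1)
  end
end

# A toy two-step handler: user message -> thinking -> response.
handler = fn
  %{type: :user_message, data: c}, state -> {%{type: :thinking, data: c}, state}
  %{type: :thinking, data: _}, state -> {%{type: :response, data: "done"}, state}
end

{%{type: :response, data: "done"}, _state} =
  FlowSketch.run(%{type: :user_message, data: "hello"}, %{}, handler)
```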
## Extension Points
LLMAgent is designed to be extended in several ways:
- Custom Handlers: Create specialized handlers for domain-specific signals
- Custom Flows: Combine handlers in new ways for different interaction patterns
- Custom Tools: Add new capabilities to your agent
- Provider Plugins: Integrate with different LLM backends
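As a sketch of the "Custom Handlers" extension point: a domain-specific handler can add a clause for its own signal type and pass everything else through untouched, so it composes with the built-in handlers. The `:weather_request` signal and the `:type`/`:data` shape are hypothetical, for illustration only.

```elixir
defmodule WeatherHandler do
  # Handle a domain-specific signal. In a real agent this clause would
  # invoke a weather tool or external API rather than return a canned string.
  def handle(%{type: :weather_request, data: city}, state) do
    {%{type: :response, data: "Forecast for #{city}: sunny"}, state}
  end

  # Fall through: leave all other signals untouched for downstream handlers.
  def handle(signal, state), do: {signal, state}
end
```

Because the catch-all clause forwards unknown signals, this handler can be inserted into an existing flow without changing any other code, which is the extensibility property claimed below.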
## Design Decisions
### Why Signal-Driven Architecture?
The signal-driven architecture provides several benefits:
- Composability: Handlers can be combined in flexible ways
- Testability: Each component can be tested in isolation
- Extensibility: New signals and handlers can be added without changing existing code
- Visibility: The flow of information is explicit and traceable
### Why Elixir?
Elixir's functional nature, pattern matching, and supervision trees make it ideal for building reliable, maintainable agent systems:
- Immutable State: Ensures predictable state transitions
- Pattern Matching: Makes signal handling elegant and explicit
- Concurrency: Allows handling multiple conversations efficiently
- Fault Tolerance: Supervisors can restart failed components
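The concurrency point can be made concrete with standard OTP tooling: each conversation can run in its own lightweight process. Here `answer` is a stand-in for a full handler flow, not part of the LLMAgent API.

```elixir
# Process several independent conversations concurrently, one BEAM
# process per conversation, collecting replies in input order.
answer = fn user_message -> "echo: " <> user_message end

replies =
  ["hi", "weather?", "bye"]
  |> Task.async_stream(answer, max_concurrency: 3)
  |> Enum.map(fn {:ok, reply} -> reply end)
```

In a production agent, the same supervision machinery that runs these tasks also provides the fault tolerance noted above: a crashed conversation process can be restarted without affecting the others.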
## Next Steps
- Explore tool integration
- Learn how to create custom agents
- Understand LLM provider integration