LlmCore (llm_core v0.3.0)


Public facade for LlmCore capabilities.

This is the primary entry point for most LlmCore usage. It delegates to the routing pipeline, the inference pipeline, and the Hindsight memory client.

Sending Prompts

# Route by task type (configured in TOML [routing.tasks])
{:ok, response} = LlmCore.send("Explain pattern matching", :reasoning)

# Stream a response
{:ok, stream} = LlmCore.stream("Write a GenServer example", :coding)
Enum.each(stream, &IO.write/1)

# Extract structured output
{:ok, response} = LlmCore.send(prompt, :extraction,
  response_format: {:json_schema, %{type: "object", properties: %{...}}}
)
response.structured #=> %{"name" => "value"}

Semantic Memory

# Store (async, buffered)
:ok = LlmCore.retain("Key architectural decision", %{context: "architecture"})

# Recall by meaning
{:ok, results} = LlmCore.recall("multi-tenancy patterns", bank_id: "my-bank")

# Synthesize an insight
{:ok, insight} = LlmCore.reflect("What patterns work best?", bank_id: "my-bank")

Routing Introspection

# View current routing table
table = LlmCore.routing_table()

# Force reload from config
:ok = LlmCore.reload_routing()

# Check Hindsight availability
LlmCore.hindsight_available?()

Summary

Functions

hindsight_available?()
Checks if the Hindsight API is available and enabled.

recall(query, opts \\ [])
Semantic search for similar content with caching.

reflect(query, opts \\ [])
Natural language reflection/synthesis over a bank's memories.

reload_routing()
Forces the router to reload configuration from disk/store.

retain(content, metadata \\ %{})
Stores content in Hindsight for semantic indexing (async, buffered).

retain_sync(content, metadata \\ %{})
Stores content synchronously, bypassing the write buffer.

route(task_type)
Resolves a task type to a configured agent.

routing_table()
Fetches the current routing table.

send(prompt, task_type, opts \\ [])
Sends a prompt through the router using the matching provider.

stream(prompt, task_type, opts \\ [])
Streams a prompt, returning the provider stream enumerable.

Functions

hindsight_available?()

@spec hindsight_available?() :: boolean()

Checks if the Hindsight API is available and enabled.

recall(query, opts \\ [])

@spec recall(
  String.t(),
  keyword()
) :: {:ok, [map()]} | {:error, term()}

Semantic search for similar content with caching.
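A sketch of handling both result shapes of the spec above. This is illustrative only: it assumes a running Hindsight backend and reuses the `"my-bank"` bank id from the usage examples; the exact keys inside each result map depend on the Hindsight API and are not documented here.

```elixir
# Illustrative only; assumes a configured Hindsight backend.
case LlmCore.recall("supervision tree design", bank_id: "my-bank") do
  {:ok, results} ->
    # Each result is a map; its exact keys depend on the Hindsight API.
    Enum.each(results, &IO.inspect/1)

  {:error, reason} ->
    IO.puts("recall failed: #{inspect(reason)}")
end
```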

reflect(query, opts \\ [])

@spec reflect(
  String.t() | atom(),
  keyword()
) :: {:ok, term()} | {:error, term()}

Natural language reflection/synthesis over a bank's memories.

reload_routing()

@spec reload_routing() :: :ok

Forces the router to reload configuration from disk/store.

retain(content, metadata \\ %{})

@spec retain(String.t(), map()) :: :ok

Stores content in Hindsight for semantic indexing (async, buffered).

retain_sync(content, metadata \\ %{})

@spec retain_sync(String.t(), map()) :: {:ok, map()} | {:error, term()}

Stores content synchronously, bypassing the write buffer.
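The contrast with `retain/2` can be sketched as follows. This is illustrative only: `retain/2` returns `:ok` immediately and buffers the write, while `retain_sync/2` blocks until the write completes; the shape of the returned record is assumed from the `{:ok, map()}` spec, not documented here.

```elixir
# Fire-and-forget: returns :ok immediately, write is buffered.
:ok = LlmCore.retain("Buffered note", %{context: "scratch"})

# Synchronous: blocks until Hindsight confirms the write.
case LlmCore.retain_sync("Must-not-lose decision", %{context: "adr"}) do
  {:ok, record} -> IO.inspect(record, label: "stored")
  {:error, reason} -> IO.puts("write failed: #{inspect(reason)}")
end
```

Prefer `retain/2` on hot paths; `retain_sync/2` suits writes whose success the caller must confirm before proceeding.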

route(task_type)

@spec route(String.t() | atom()) ::
  {:ok, LlmCore.Router.ResolvedRoute.t()} | {:error, term()}

Resolves a task type to a configured agent.
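A minimal sketch of matching on the result, useful for checking how a task type will be routed before sending. The `ResolvedRoute` struct's fields are not listed on this page, so the example only matches the struct itself rather than naming fields.

```elixir
# Illustrative only; :coding is one of the task types shown above.
case LlmCore.route(:coding) do
  {:ok, %LlmCore.Router.ResolvedRoute{} = route} ->
    IO.inspect(route, label: "resolved")

  {:error, reason} ->
    IO.puts("no route for :coding: #{inspect(reason)}")
end
```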

routing_table()

@spec routing_table() :: LlmCore.Router.RoutingTable.t() | nil

Fetches the current routing table.

send(prompt, task_type, opts \\ [])

@spec send(String.t(), String.t() | atom(), keyword()) ::
  {:ok, LlmCore.LLM.Response.t()} | {:error, term()}

Sends a prompt through the router using the matching provider.

stream(prompt, task_type, opts \\ [])

@spec stream(String.t(), String.t() | atom(), keyword()) ::
  {:ok, Enumerable.t()} | {:error, term()}

Streams a prompt, returning the provider stream enumerable.
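Beyond the chunk-by-chunk `Enum.each(stream, &IO.write/1)` shown in the usage section, the stream can be collected into one string. A sketch, assuming the stream yields binary chunks (as the `IO.write/1` usage above suggests):

```elixir
# Illustrative only; assumes the provider stream yields binaries.
with {:ok, stream} <- LlmCore.stream("Write a GenServer example", :coding) do
  stream
  |> Enum.into([])
  |> IO.iodata_to_binary()
end
```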