LLM-augmented retrieval that combines hybrid search with LLM reasoning.
Uses memory search results as context to generate answers with citations.
Requires an `llm_fn` callback — Recollect never calls LLMs directly for completion.
## Usage

```elixir
Recollect.complete("What was decided about auth?",
  owner_id: user_id,
  llm_fn: fn messages -> MyApp.LLM.chat(messages) end
)
```
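The `llm_fn` callback must accept a list of chat messages and return `{:ok, answer_string}` or `{:error, reason}`. A minimal sketch of such a callback, assuming messages are maps with `:role` and `:content` keys (the exact message shape is not specified here) and using an echo stub in place of a real LLM client such as `MyApp.LLM.chat/1`:

```elixir
# Hypothetical stand-in for a real LLM client: a production llm_fn would
# forward `messages` to an LLM API and return its completion. This stub
# just echoes the last message's content to illustrate the contract.
llm_fn = fn messages ->
  case List.last(messages) do
    %{content: content} -> {:ok, "Echo: " <> content}
    _ -> {:error, :empty_messages}
  end
end

{:ok, answer} = llm_fn.([%{role: "user", content: "What was decided about auth?"}])
IO.puts(answer)
```

Any callback with this `{:ok, answer} | {:error, reason}` shape can be passed as `:llm_fn`, which keeps Recollect decoupled from any particular LLM provider.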
## Functions
Answer a question using the memory system as context.
Returns `{:ok, %{answer: answer, context: context_pack}}` or `{:error, reason}`.
## Options

- `:llm_fn` (required) — `fn messages -> {:ok, answer_string} | {:error, reason} end`
- `:system_prompt` — override the default system prompt
- `:owner_id` — owner UUID for scoping search
- `:scope_id` — scope UUID for scoping search
- `:limit` — max chunks to retrieve (default: 10)
- `:hops` — graph expansion depth (default: 2)
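The documented return shape can be pattern-matched for both success and failure. A sketch, using a hypothetical stub `complete` function that simulates the return value described above (so the handling pattern can run without the library; a real caller would use `Recollect.complete/2` as shown in Usage):

```elixir
# Hypothetical stub simulating the documented return shape of
# Recollect.complete/2: {:ok, %{answer: ..., context: ...}} | {:error, reason}.
complete = fn _question, _opts ->
  {:ok, %{answer: "OAuth2 with PKCE was chosen.", context: %{chunks: []}}}
end

case complete.("What was decided about auth?", limit: 5) do
  {:ok, %{answer: answer, context: _context_pack}} ->
    IO.puts(answer)

  {:error, reason} ->
    IO.puts("retrieval failed: " <> inspect(reason))
end
```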