Agentic.LLM.Transport behaviour
(agentic v0.2.2)
Behaviour describing one wire-protocol family used to talk to LLM
providers. A transport is pure: it knows how to translate a
canonical request shape into an HTTP request and how to parse the
HTTP response back into the shared Agentic.LLM.Response /
Agentic.LLM.Error structs. It does not perform any network I/O,
does not look up credentials, and does not implement any
provider-specific business logic.
Canonical chat params
Every transport accepts the same canonical chat params map. The
per-provider shim is responsible for translating its own input shape
into this canonical form before calling
build_chat_request/2.
```elixir
%{
  model: String.t(),
  messages: [%{role: String.t() | atom(), content: term()}],
  system: nil | String.t() | [map()],
  tools: [%{name: ..., description: ..., input_schema: ...}],
  max_tokens: pos_integer() | nil,
  temperature: float() | nil,
  tool_choice: nil | :auto | :none | :any | %{name: String.t()},
  cache_control: nil | %{
    stable_hash: String.t(),
    prefix_changed: boolean()
  }
}
```

Transports MUST tolerate missing optional keys (`tools`, `system`,
`tool_choice`, `temperature`, `cache_control`) by treating them
as absent. Transports that don't implement provider-side prompt
caching ignore `cache_control` entirely; transports that do (e.g.
`Agentic.LLM.Transport.AnthropicMessages`) read `prefix_changed`
to decide whether to mark cache breakpoints in the request body.
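As a minimal sketch (the map values here are hypothetical placeholders, not library code), a transport can treat the optional keys as absent by reading them with `Map.get/3`:

```elixir
# A canonical params map carrying only the required keys.
params = %{
  # hypothetical model id
  model: "example-model",
  messages: [%{role: :user, content: "Hello"}]
}

# Optional keys fall back to "absent" values instead of raising.
tools = Map.get(params, :tools, [])
temperature = Map.get(params, :temperature)
cache_control = Map.get(params, :cache_control)
```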
Opts
build_chat_request/2 receives an opts keyword list whose keys
are intentionally narrow:
- `:base_url` (required): fully-qualified provider base URL; no trailing slash needed
- `:api_key` (required): raw bearer / API key value
- `:extra_headers` (optional): list of extra `{name, value}` tuples for provider-specific headers (e.g. `HTTP-Referer`, `anthropic-version`)
Credential lookup is handled by Agentic.LLM.Credentials and
Agentic.LLM.Provider. The :api_key opt is resolved before
the transport is called.
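The opts might be assembled as in the sketch below. The key names come from this page; the values are hypothetical placeholders, and the access pattern via `Keyword` is one common convention, not the library's prescribed one:

```elixir
# Hypothetical opts for build_chat_request/2 (placeholder values).
opts = [
  base_url: "https://api.anthropic.com",
  api_key: "sk-example-key",
  extra_headers: [{"anthropic-version", "2023-06-01"}]
]

# Required keys can be read with Keyword.fetch!/2,
# optional ones with Keyword.get/3.
base_url = Keyword.fetch!(opts, :base_url)
extra = Keyword.get(opts, :extra_headers, [])
```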
Summary
Callbacks
Optional embedding callbacks. A transport that does not implement these will not be usable for embedding requests.
Types
```elixir
@type canonical_params() :: %{
        :model => String.t(),
        :messages => list(),
        optional(:system) => String.t() | list() | nil,
        optional(:tools) => list(),
        optional(:max_tokens) => pos_integer() | nil,
        optional(:temperature) => float() | nil,
        optional(:tool_choice) => term(),
        optional(:cache_control) => map() | nil
      }
```
Callbacks
```elixir
@callback build_chat_request(canonical_params(), keyword()) :: request()
```

```elixir
@callback build_embedding_request(
            text_or_list :: String.t() | [String.t()],
            opts :: keyword()
          ) :: request() | :not_supported
```
Optional embedding callbacks. A transport that does not implement these will not be usable for embedding requests.
Opts (build_embedding_request/2)
- `:base_url` (required): fully-qualified provider base URL
- `:api_key` (required): raw bearer / API key value
- `:model` (required): embedding model id
- `:extra_headers` (optional): list of extra `{name, value}` tuples
Response shape
parse_embedding_response/3 always returns a list of vectors,
even when the original input was a single string. The caller is
responsible for indexing into the list when it knows it submitted
a single text.
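For example (the vector values below are hypothetical), a caller that submitted a single string still receives a one-element list and indexes into it itself:

```elixir
# Hypothetical parsed result for a single input text:
# still a list of vectors, never a bare vector.
result = {:ok, [[0.1, -0.2, 0.3]]}

{:ok, vectors} = result
# The caller destructures the one-element list.
[embedding] = vectors
```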
```elixir
@callback id() :: atom()
```

```elixir
@callback parse_chat_response(non_neg_integer(), term(), term()) ::
            {:ok, Agentic.LLM.Response.t()} | {:error, Agentic.LLM.Error.t()}
```

```elixir
@callback parse_embedding_response(
            status :: non_neg_integer(),
            body :: term(),
            headers :: term()
          ) :: {:ok, [[float()]]} | {:error, Agentic.LLM.Error.t()} | :not_supported
```

```elixir
@callback parse_rate_limit(term()) :: Agentic.LLM.RateLimit.t() | nil
```
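Putting the callbacks together, a chat-only transport might look like the sketch below. `MyApp.ExampleTransport`, the `/v1/chat` path, and the plain-map request shape are all hypothetical stand-ins (the real `request()` type is defined by the library), and the `@behaviour` attribute is omitted so the snippet compiles standalone:

```elixir
defmodule MyApp.ExampleTransport do
  # Hypothetical sketch of a chat-only transport. A plain map stands in
  # for the library's request() type; the URL path is invented.
  def id, do: :example

  def build_chat_request(params, opts) do
    headers =
      [{"authorization", "Bearer " <> Keyword.fetch!(opts, :api_key)}] ++
        Keyword.get(opts, :extra_headers, [])

    %{
      method: :post,
      url: Keyword.fetch!(opts, :base_url) <> "/v1/chat",
      headers: headers,
      body: %{model: params.model, messages: params.messages}
    }
  end

  # The embedding callbacks are optional; a chat-only transport
  # can signal non-support explicitly.
  def build_embedding_request(_text_or_list, _opts), do: :not_supported
  def parse_embedding_response(_status, _body, _headers), do: :not_supported
end

req =
  MyApp.ExampleTransport.build_chat_request(
    %{model: "example-model", messages: []},
    base_url: "https://example.test",
    api_key: "k"
  )
```

No network I/O happens anywhere in the module: the transport only builds and parses data, matching the purity contract described at the top of this page.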