LlmCore.LLM.CLIProvider (llm_core v0.3.0)


Universal CLI-based LLM provider.

One module, any CLI. Configuration is pure data — loaded from TOML or constructed in code. New CLI clients are added without writing Elixir code.

Default Providers

The following CLI providers ship as type = "cli" entries in priv/config/llm_core.toml. Override or remove them via project or global TOML overrides — no Elixir changes needed.

| Name | Binary | Notes |
| --- | --- | --- |
| `:claude_code` | `claude` | Wraps with `/bin/sh` for stdin redirect |
| `:droid` | `droid` | Subcommand `exec`, rich flag set |
| `:pi_cli` | `pi` | Pi CLI non-interactive dispatch (`--print`) |
| `:kimi_cli` | `kimi-cli` | Kimi CLI with agent-file support |
| `:codex_cli` | `codex` | OpenAI Codex CLI |
| `:gemini_cli` | `gemini` | Google Gemini CLI |
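
A sketch of what one of these `type = "cli"` entries might look like in a TOML override. The table name and key names here are assumptions inferred from the `CLIProvider.Config` fields shown later in this page, not a verified schema:

```toml
# Illustrative only; section and key names are assumptions.
[providers.my_tool]
type = "cli"
binary = "my-tool"
default_timeout = 60_000
default_model = "v2"

[providers.my_tool.flags]
model = "--model"
temperature = "--temp"
```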

Usage

provider = CLIProvider.from_config(:claude_code)

# Check availability
CLIProvider.available?(provider)
#=> true

# Send a prompt
{:ok, response} = CLIProvider.send(provider, "Explain this code")

# Stream
{:ok, stream} = CLIProvider.stream(provider, "Write a story")
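
Assuming the returned stream yields output chunks as binaries (a sketch, not verified against the library), it can be consumed lazily:

```elixir
{:ok, stream} = CLIProvider.stream(provider, "Write a story")

# Print chunks as they arrive; assumes each element is a binary chunk.
stream
|> Stream.each(&IO.write/1)
|> Stream.run()
```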

Custom Provider (no code needed)

config = %CLIProvider.Config{
  name: :my_tool,
  binary: "my-tool",
  default_timeout: 60_000,
  default_model: "v2",
  flags: %{model: "--model", temperature: "--temp"}
}
provider = CLIProvider.from_config(config)
{:ok, response} = CLIProvider.send(provider, "hello", model: "v2")

Summary

Functions

- `build_args/3`: Builds the CLI argument list from prompt and opts.
- `build_error/3`: Builds an Error struct for common failure modes.
- `build_invocation/3`: Returns `{executable, args}` for the CLI invocation.
- `build_provider/1`: Builds a ready-to-use `%CLIProvider{}` struct from an id or alias.
- `build_response/3`: Builds a Response struct from CLI output, applying any configured normalization.
- `builtins/0`: Returns the legacy built-in map (empty since all defaults moved to TOML). Use `list_all_configs/0` to get all known CLI provider configs.
- `config/1`: Returns the config for a provider name.
- `fetch_config/1`: Fetches a CLI config by id or alias. Returns a ready-to-use `%CLIProvider.Config{}`.
- `from_config/1`: Creates a CLIProvider from a built-in name, a string id/alias, or a Config struct.
- `invocation_plan/3`: Returns a normalized invocation plan describing how the provider will run.
- `list_all_configs/0`: Returns all known CLI provider configs: runtime (TOML) merged with builtins. Runtime configs override builtins with the same name.
- `preflight/1`: Runs declarative checks proving the CLI surface matches the configured contract.
- `render_prompt/3`: Renders a prompt after applying any declared inline system-prompt fallback.
- `resolve_id/1`: Resolves an alias or id string to the canonical provider name atom. Checks runtime definitions first (via Provider.Registry aliases), then builtins.
- `supports?/2`: Returns whether the provider supports a semantic capability.

Types

t()

@type t() :: %LlmCore.LLM.CLIProvider{config: LlmCore.LLM.CLIProvider.Config.t()}

Functions

available?(cli_provider)

@spec available?(t()) :: boolean()

build_args(provider, prompt, opts)

@spec build_args(t(), LlmCore.LLM.Provider.prompt(), keyword()) :: [String.t()]

Builds the CLI argument list from prompt and opts.

build_error(cli_provider, arg2, opts)

@spec build_error(t(), atom() | {:exit_code, non_neg_integer()}, keyword()) ::
  LlmCore.LLM.Error.t()

Builds an Error struct for common failure modes.

build_invocation(provider, prompt, opts)

@spec build_invocation(t(), LlmCore.LLM.Provider.prompt(), keyword()) ::
  {String.t(), [String.t()]}

Returns {executable, args} for the CLI invocation.
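
Because the return value is a plain `{executable, args}` tuple, one possible use is handing it straight to `System.cmd/3`. A sketch, assuming the executable is on `PATH`:

```elixir
{exe, args} = CLIProvider.build_invocation(provider, "Explain this code", [])

# Run the CLI directly; merge stderr into stdout for simpler capture.
{output, exit_code} = System.cmd(exe, args, stderr_to_stdout: true)
```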

build_provider(name)

@spec build_provider(atom() | String.t()) :: {:ok, t()} | {:error, :not_found}

Builds a ready-to-use %CLIProvider{} struct from an id or alias.

build_response(cli_provider, output, opts)

@spec build_response(t(), String.t(), keyword()) :: LlmCore.LLM.Response.t()

Builds a Response struct from CLI output, applying any configured normalization.

builtins()

@spec builtins() :: %{required(atom()) => LlmCore.LLM.CLIProvider.Config.t()}

Returns the legacy built-in map (empty since all defaults moved to TOML). Use list_all_configs/0 to get all known CLI provider configs.

capabilities(cli_provider)

@spec capabilities(t()) :: map()

config(name)

@spec config(atom()) ::
  {:ok, LlmCore.LLM.CLIProvider.Config.t()} | {:error, String.t()}

Returns the config for a provider name.

Resolution order:

  1. Runtime store (TOML-loaded CLI configs)
  2. Built-in @builtins
  3. Error
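
The resolution order means a TOML-defined config with the same name shadows a builtin. A minimal sketch of handling both outcomes of `config/1`:

```elixir
case CLIProvider.config(:claude_code) do
  {:ok, %CLIProvider.Config{binary: binary}} ->
    IO.puts("resolved to #{binary}")

  {:error, reason} ->
    # reason is a String.t() per the spec above.
    IO.puts("no config: #{reason}")
end
```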

fetch_config(name)

@spec fetch_config(atom() | String.t()) ::
  {:ok, LlmCore.LLM.CLIProvider.Config.t()} | {:error, :not_found}

Fetches a CLI config by id or alias. Returns a ready-to-use %CLIProvider.Config{}.

from_config(name)

@spec from_config(atom() | String.t() | LlmCore.LLM.CLIProvider.Config.t()) :: t()

Creates a CLIProvider from a built-in name, a string id/alias, or a Config struct.

invocation_plan(provider, prompt, opts \\ [])

@spec invocation_plan(t(), LlmCore.LLM.Provider.prompt(), keyword()) :: map()

Returns a normalized invocation plan describing how the provider will run.

list_all_configs()

@spec list_all_configs() :: %{required(atom()) => LlmCore.LLM.CLIProvider.Config.t()}

Returns all known CLI provider configs — runtime (TOML) merged with builtins. Runtime configs override builtins with the same name.
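
One way to discover which configured CLIs are actually installed is to combine `list_all_configs/0` with `available?/1`. A sketch:

```elixir
installed =
  CLIProvider.list_all_configs()
  |> Enum.filter(fn {_name, config} ->
    config |> CLIProvider.from_config() |> CLIProvider.available?()
  end)
  |> Enum.map(fn {name, _config} -> name end)
```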

preflight(provider)

@spec preflight(t()) :: {:ok, map()} | {:error, map()}

Runs declarative checks proving the CLI surface matches the configured contract.

provider_type(cli_provider)

@spec provider_type(t()) :: :cli

render_prompt(provider, prompt, opts \\ [])

@spec render_prompt(t(), LlmCore.LLM.Provider.prompt(), keyword()) :: String.t()

Renders a prompt after applying any declared inline system-prompt fallback.

resolve_id(name)

@spec resolve_id(atom() | String.t()) :: {:ok, atom()} | {:error, :not_found}

Resolves an alias or id string to the canonical provider name atom. Checks runtime definitions first (via Provider.Registry aliases), then builtins.

send(provider, prompt, opts \\ [])

@spec send(t(), LlmCore.LLM.Provider.prompt(), keyword()) ::
  {:ok, LlmCore.LLM.Response.t()} | {:error, LlmCore.LLM.Error.t()}

stream(provider, prompt, opts \\ [])

@spec stream(t(), LlmCore.LLM.Provider.prompt(), keyword()) ::
  {:ok, Enumerable.t()} | {:error, LlmCore.LLM.Error.t()}

supports?(provider, capability)

@spec supports?(t(), atom()) :: boolean()

Returns whether the provider supports a semantic capability.