LangChain.Chains.RoutingChain (LangChain v0.1.7)
Run a router based on a user's initial prompt to determine what category best matches from the given options. If there is no good match, the value "DEFAULT" is returned.
Here's an example:
routes = [
  PromptRoute.new!(%{
    name: "marketing_email",
    description: "Create a marketing focused email",
    chain: marketing_email_chain
  }),
  PromptRoute.new!(%{
    name: "blog_post",
    description: "Create a blog post that will be linked from the company's landing page",
    chain: blog_post_chain
  })
]
selected_chain =
  RoutingChain.new!(%{
    llm: ChatOpenAI.new!(%{model: "gpt-3.5-turbo", stream: false}),
    input_text: "Let's create a marketing blog post about our new product 'Fuzzy Furries'",
    routes: routes,
    default_chain: fallback_chain
  })
  |> RoutingChain.evaluate()

# The `blog_post_chain` should be returned as the `selected_chain`.
The llm is the model used to determine which route is the best match. A smaller, faster LLM may be a great choice for the routing decision, while a more capable LLM can be used by the selected route's chain.

The default_chain is required and is used as a fallback when the user's prompt doesn't match any of the specified routes.
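For illustration, the routing model and the fallback chain might be set up as in the following sketch. This is not part of the library's documentation: the model names and the fallback system message are assumptions, and LLMChain refers to LangChain.Chains.LLMChain.

alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Chains.LLMChain
alias LangChain.Message

# A small, fast model for the routing decision (model name is illustrative).
router_llm = ChatOpenAI.new!(%{model: "gpt-3.5-turbo", stream: false})

# Fallback chain used when no route matches. The larger model and the generic
# system message are assumptions for this sketch.
fallback_chain =
  %{llm: ChatOpenAI.new!(%{model: "gpt-4"})}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_system!("You are a helpful, general-purpose assistant."))

The router_llm and fallback_chain built this way can then be passed as the llm and default_chain values shown in the example above.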
Summary
Functions
Runs the RoutingChain and evaluates the result to return the selected chain.
Start a new RoutingChain.
Start a new RoutingChain and return it or raise an error if invalid.
Run the RoutingChain against the input_text to determine which route is the best match. Uses the provided model. Recommend faster, simpler LLMs without streaming.
Types
t()
Functions
Runs the RoutingChain and evaluates the result to return the selected chain.
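As a usage sketch, the chain returned by RoutingChain.evaluate can be run directly with LLMChain.run. This assumes routing_chain was built with new!/1 as in the module example and that each route's chain already contains the messages it needs; both assumptions are noted in the comments.

alias LangChain.Chains.{RoutingChain, LLMChain}

# Pick the chain attached to the best-matching route (or the default_chain).
selected_chain = RoutingChain.evaluate(routing_chain)

# Run the chain chosen by the router. This sketch assumes the route's chain
# is ready to run as-is.
{:ok, _updated_chain, response} = LLMChain.run(selected_chain)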
@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
Start a new RoutingChain.
Start a new RoutingChain and return it or raise an error if invalid.
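When the attributes come from user input, the non-raising new/1 may be preferable so an invalid changeset can be handled instead of raising. A minimal sketch, assuming user_input, routes, and fallback_chain are already defined as in the module example:

case RoutingChain.new(%{
       llm: ChatOpenAI.new!(%{model: "gpt-3.5-turbo", stream: false}),
       input_text: user_input,
       routes: routes,
       default_chain: fallback_chain
     }) do
  {:ok, routing_chain} ->
    # Valid attributes: go on to evaluate the routes.
    RoutingChain.evaluate(routing_chain)

  {:error, changeset} ->
    # Invalid attributes: changeset.errors explains what was wrong.
    # This sketch simply falls back to the default chain.
    fallback_chain
end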
@spec run(t(), Keyword.t()) :: {:ok, LangChain.Chains.LLMChain.t(), LangChain.Message.t() | [LangChain.Message.t()]} | {:error, String.t()}
Run the RoutingChain against the input_text to determine which route is the best match, returning the updated internal LLMChain and the response message naming the selected route. Uses the provided model. Recommend faster, simpler LLMs without streaming.
If there is no good match, the value "DEFAULT" is returned.
{:ok, _updated_chain, response} =
  RoutingChain.new!(%{
    llm: ChatOpenAI.new!(%{model: "gpt-3.5-turbo", stream: false}),
    input_text: "Let's create a marketing blog post about our new product 'Fuzzy Furries'",
    routes: routes,
    default_chain: fallback_chain
  })
  |> RoutingChain.run()