Instructor (Instructor v0.0.1)
Instructor is a library for structured prompting of LLMs.
See Instructor.chat_completion/1 for more information on how to use it.
By default we use the OpenAI adapter, but you can configure your own adapter by setting the :adapter
config value.
Configuration
Instructor can be configured in your config.exs file:
config :instructor, adapter: Instructor.Adapters.OpenAI
Another available adapter is Instructor.Adapters.Llamacpp.
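For example, to switch to the Llamacpp adapter, point the :adapter config at it. This is a minimal sketch; any adapter-specific options are not shown here and depend on the adapter you choose:

config :instructor, adapter: Instructor.Adapters.Llamacpp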
Functions
@spec chat_completion(Keyword.t()) :: {:ok, Ecto.Schema.t()} | {:error, Ecto.Changeset.t()} | {:error, String.t()}
Create a new chat completion for the provided messages and parameters.
The parameters are passed directly to the LLM adapter. By default they shadow the OpenAI API parameters. For more information on the parameters, see the OpenAI API docs.
Additionally, the following parameters are supported:
:response_model - The Ecto schema to validate the response against (an example schema is sketched below).
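A response_model is an ordinary Ecto schema. As a minimal sketch, a schema along the lines of Instructor.Demos.SpamPrediction could be defined as follows; the module name, field names, and types here are assumptions for illustration, not the library's actual demo module:

defmodule MyApp.SpamPrediction do
  use Ecto.Schema

  # An embedded schema is convenient here because the struct is built from the
  # LLM response and never persisted to a database.
  @primary_key false
  embedded_schema do
    field(:class, Ecto.Enum, values: [:spam, :not_spam])
    field(:score, :float)
  end
end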
Examples
iex> Instructor.chat_completion(%{
...>   model: "gpt-3.5-turbo",
...>   response_model: Instructor.Demos.SpamPrediction,
...>   messages: [
...>     %{
...>       role: "user",
...>       content: "Classify the following text: Hello, I am a Nigerian prince and I would like to give you $1,000,000."
...>     }
...>   ]
...> })
{:ok,
  %Instructor.Demos.SpamPrediction{
    class: :spam,
    score: 0.999
  }}
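Since the return value follows the @spec above, callers typically pattern match on the three possible result shapes. A minimal sketch, assuming the same demo schema as in the example; the MyApp.Classifier module and classify/1 function are illustrative, not part of the library:

defmodule MyApp.Classifier do
  # Wraps Instructor.chat_completion/1 and handles the three result shapes
  # listed in the @spec above.
  def classify(text) do
    case Instructor.chat_completion(%{
           model: "gpt-3.5-turbo",
           response_model: Instructor.Demos.SpamPrediction,
           messages: [%{role: "user", content: "Classify the following text: " <> text}]
         }) do
      {:ok, %Instructor.Demos.SpamPrediction{class: class, score: score}} ->
        {class, score}

      {:error, %Ecto.Changeset{} = changeset} ->
        # Validation against the response_model failed.
        {:error, changeset.errors}

      {:error, reason} when is_binary(reason) ->
        # Adapter-level failure, e.g. an HTTP error message.
        {:error, reason}
    end
  end
end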