API Reference — Instructor v0.0.2

Modules

Instructor.ex is a spiritual port of the great Instructor Python library by @jxnlco. This library brings structured prompting to LLMs. Instead of receiving free-form text as output, Instructor coaxes the LLM into returning valid JSON that maps directly to the Ecto schema you provide. If the LLM fails to do so, or returns values that do not pass your validations, Instructor provides utilities to automatically retry with the LLM and correct the errors. By default it is designed to be used with the OpenAI API, but it also provides an extendable adapter behavior to work with ggerganov/llama.cpp and Bumblebee (coming soon!).
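For orientation, here is a minimal sketch of that flow, assuming the `Instructor.chat_completion/1` options shown (`model`, `response_model`, `messages`); the `SpamPrediction` schema and its fields are purely illustrative:

```elixir
defmodule SpamPrediction do
  use Ecto.Schema

  # Illustrative schema; Instructor maps the LLM's JSON output onto these fields.
  @primary_key false
  embedded_schema do
    field(:class, Ecto.Enum, values: [:spam, :not_spam])
    field(:score, :float)
  end
end

# Ask the LLM for a response that conforms to the schema above.
{:ok, %SpamPrediction{class: class, score: score}} =
  Instructor.chat_completion(
    model: "gpt-3.5-turbo",
    response_model: SpamPrediction,
    messages: [
      %{role: "user", content: "Classify this email: 'You have won a free cruise!'"}
    ]
  )
```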

Runs against the llama.cpp server. To be clear, this calls the llama.cpp-specific endpoints, not the OpenAI-compatible ones.
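A minimal sketch of switching to this adapter, assuming adapter selection happens via application config; the `:llamacpp` key and `api_url` option are assumptions and may differ in this version:

```elixir
# config/config.exs — assumed configuration keys, shown for illustration
import Config

# Point Instructor at the llama.cpp adapter instead of the default OpenAI one.
config :instructor, adapter: Instructor.Adapters.Llamacpp

# Assumed option name for the llama.cpp server URL.
config :instructor, :llamacpp, api_url: "http://localhost:8080"
```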

By default you'll get whatever OpenAI returns. This behavior provides a hook for you to critique the response using standard Ecto changeset validations. It can be used in conjunction with the :max_retries parameter to Instructor.chat_completion/1 to retry the completion until it passes your validation.
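As a sketch of how such a hook might look, assuming a `validate_changeset/1` callback and ordinary `Ecto.Changeset` validations (the `QuestionAnswer` schema and its fields are illustrative):

```elixir
defmodule QuestionAnswer do
  use Ecto.Schema

  @primary_key false
  embedded_schema do
    field(:question, :string)
    field(:answer, :string)
  end

  # Critique the LLM's response with standard Ecto changeset validations.
  # Assumed callback name; errors here trigger a retry when :max_retries is set.
  def validate_changeset(changeset) do
    Ecto.Changeset.validate_length(changeset, :answer, max: 1000)
  end
end

# Retry up to twice if the response fails validation.
Instructor.chat_completion(
  model: "gpt-3.5-turbo",
  response_model: QuestionAnswer,
  max_retries: 2,
  messages: [
    %{role: "user", content: "What is the capital of France? Answer briefly."}
  ]
)
```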