InstructorLite


Structured prompting for LLMs. InstructorLite is a fork and spiritual successor to the instructor_ex library, which is the Elixir member of the great Instructor family.

Instructor is useful for coaxing an LLM to return JSON that maps to an Ecto schema you provide, rather than the default unstructured text output. If you define your own validation logic, InstructorLite can automatically retry prompts when validation fails, returning natural-language error messages to the LLM to guide its corrections.
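As a sketch of what that looks like (assuming the optional validate_changeset/2 callback and the max_retries option; check the docs for the exact names in your version), a custom check might be:

# Inside a module that calls `use InstructorLite.Instruction` (see Usage below)
@impl true
def validate_changeset(changeset, _opts) do
  # When this fails, the errors can be sent back to the LLM and the
  # prompt retried, up to the configured number of retries.
  Ecto.Changeset.validate_number(changeset, :age, greater_than: 0, less_than: 130)
end

You would then pass something like max_retries: 1 to InstructorLite.instruct/2.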

Why Lite

InstructorLite is designed to be:

  1. Lean. It does so little it makes you question if you should just write your own version!
  2. Composable. Almost everything it does can be overridden or extended.
  3. Magic-free. It doesn't hide complexity behind one-line function calls, but does its best to provide you with enough information to understand what's going on.

InstructorLite comes with 3 adapters: OpenAI, Anthropic and Llamacpp.

Features

InstructorLite can be boiled down to these features:

  1. It provides a very simple function for generating a JSON schema from an Ecto schema (see the sketch after this list).
  2. It facilitates generating prompts, calling LLMs, and casting and validating responses, including retrying prompts when validation fails.
  3. It encapsulates knowledge of major LLM providers' API interfaces in adapters.

Any of the features above can be used independently.
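For example, the schema-generation piece works on its own. A minimal sketch, assuming the InstructorLite.JSONSchema.from_ecto_schema/1 helper (check the docs for the exact module and function) and the UserInfo schema defined in the Usage section below:

# Build a plain JSON schema map from an Ecto schema, with no LLM involved
schema = InstructorLite.JSONSchema.from_ecto_schema(UserInfo)

The resulting map is a starting point you can adjust before sending it along with your prompt.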

Usage

Define an instruction, which is a normal Ecto schema with an extra use InstructorLite.Instruction call.

defmodule UserInfo do
  use Ecto.Schema
  use InstructorLite.Instruction
  
  @primary_key false
  embedded_schema do
    field(:name, :string)
    field(:age, :integer)
  end
end

Now let's use InstructorLite.instruct/2 to fill the schema from unstructured text:

OpenAI

iex> InstructorLite.instruct(%{
    messages: [
      %{role: "user", content: "John Doe is fourty two years old"}
    ]
  },
  response_model: UserInfo,
  adapter_context: [api_key: Application.fetch_env!(:instructor_lite, :openai_key)]
)
{:ok, %UserInfo{name: "John Doe", age: 42}}

Anthropic

iex> InstructorLite.instruct(%{
    messages: [
      %{role: "user", content: "John Doe is fourty two years old"}
    ]
  },
  response_model: UserInfo,
  adapter: InstructorLite.Adapters.Anthropic,
  adapter_context: [api_key: Application.fetch_env!(:instructor_lite, :anthropic_key)]
)
{:ok, %UserInfo{name: "John Doe", age: 42}}

Llamacpp

iex> InstructorLite.instruct(%{
    prompt: "John Doe is fourty two years old"
  },
  response_model: UserInfo,
  adapter: InstructorLite.Adapters.Llamacpp,
  adapter_context: [url: Application.fetch_env!(:instructor_lite, :llamacpp_url)]
)
{:ok, %UserInfo{name: "John Doe", age: 42}}

Configuration

InstructorLite does not read configuration options, such as the adapter or API key, from the application environment. Instead, all options are passed explicitly with every call. Note that different adapters may require different options, so make sure to check their documentation.
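If you prefer to keep configuration in one place, a common pattern is to wrap the call in your own helper. A sketch (MyApp.LLM and the :my_app config key are hypothetical, not part of InstructorLite):

defmodule MyApp.LLM do
  # Hypothetical wrapper that reads the API key from your own app's config
  # and supplies defaults to InstructorLite.instruct/2.
  def instruct(params, opts \\ []) do
    defaults = [
      adapter: InstructorLite.Adapters.OpenAI,
      adapter_context: [api_key: Application.fetch_env!(:my_app, :openai_key)]
    ]

    InstructorLite.instruct(params, Keyword.merge(defaults, opts))
  end
end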

Installation

In your mix.exs, add :instructor_lite to your list of dependencies:

def deps do
  [
    {:instructor_lite, "~> 0.2.0"}
  ]
end

Optionally, include the Req HTTP client, as it's used by default:

def deps do
  [
    {:req, "~> 0.5 or ~> 1.0"}
  ]
end