instructor 1.11.3


pip install instructor

Latest version: 1.11.3, released Sep 09, 2025


Meta
Author: Jason Liu, Ivan Leo
Requires Python: >=3.9, <4.0


Instructor: Structured Outputs for LLMs

Get reliable JSON from any LLM. Built on Pydantic for validation, type safety, and IDE support.

import instructor
from pydantic import BaseModel


# Define what you want
class User(BaseModel):
    name: str
    age: int


# Extract it from natural language
client = instructor.from_provider("openai/gpt-4o-mini")
user = client.chat.completions.create(
    response_model=User,
    messages=[{"role": "user", "content": "John is 25 years old"}],
)

print(user)  # User(name='John', age=25)

That's it. No JSON parsing, no error handling, no retries. Just define a model and get structured data.


Why Instructor?

Getting structured data from LLMs is hard. You need to:

  1. Write complex JSON schemas
  2. Handle validation errors
  3. Retry failed extractions
  4. Parse unstructured responses
  5. Deal with different provider APIs

Instructor handles all of this with one simple interface:

Without Instructor:

import json

import openai

response = openai.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "..."}],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "extract_user",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "name": {"type": "string"},
                        "age": {"type": "integer"},
                    },
                },
            },
        }
    ],
)

# Parse the tool call by hand
tool_call = response.choices[0].message.tool_calls[0]
user_data = json.loads(tool_call.function.arguments)

# Validate manually
if "name" not in user_data:
    # Handle error...
    pass

With Instructor:

client = instructor.from_provider("openai/gpt-4")

user = client.chat.completions.create(
    response_model=User,
    messages=[{"role": "user", "content": "..."}],
)

# That's it! user is validated and typed

Install in seconds

pip install instructor

Or with your package manager:

uv add instructor
poetry add instructor

Works with every major provider

Use the same code with any LLM provider:

# OpenAI
client = instructor.from_provider("openai/gpt-4o")

# Anthropic
client = instructor.from_provider("anthropic/claude-3-5-sonnet")

# Google
client = instructor.from_provider("google/gemini-pro")

# Ollama (local)
client = instructor.from_provider("ollama/llama3.2")

# With API keys directly (no environment variables needed)
client = instructor.from_provider("openai/gpt-4o", api_key="sk-...")
client = instructor.from_provider("anthropic/claude-3-5-sonnet", api_key="sk-ant-...")
client = instructor.from_provider("groq/llama-3.1-8b-instant", api_key="gsk_...")

# All use the same API!
user = client.chat.completions.create(
    response_model=User,
    messages=[{"role": "user", "content": "..."}],
)
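
The same pattern carries over to async code. Below is a minimal sketch assuming from_provider accepts an async_client=True flag (as in recent instructor releases); with it, only the await changes at the call site:

import asyncio

import instructor
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


async def main() -> None:
    # Assumption: async_client=True returns an async client whose
    # create() call is awaitable; check your installed version.
    client = instructor.from_provider("openai/gpt-4o-mini", async_client=True)
    user = await client.chat.completions.create(
        response_model=User,
        messages=[{"role": "user", "content": "John is 25 years old"}],
    )
    print(user)


asyncio.run(main())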

Production-ready features

Automatic retries

When validation fails, Instructor automatically retries the request and feeds the validation error back to the model so it can correct its output:

from pydantic import BaseModel, field_validator


class User(BaseModel):
    name: str
    age: int

    @field_validator('age')
    def validate_age(cls, v):
        if v < 0:
            raise ValueError('Age must be positive')
        return v


# Instructor automatically retries when validation fails
user = client.chat.completions.create(
    response_model=User,
    messages=[{"role": "user", "content": "..."}],
    max_retries=3,
)
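
Retries are driven by tenacity (one of instructor's dependencies), and max_retries can also take a tenacity Retrying object for finer control. The sketch below is hedged: the InstructorRetryException import path is an assumption that may vary across instructor versions:

from tenacity import Retrying, stop_after_attempt, wait_fixed

# Assumption: this exception path matches your installed instructor version.
from instructor.exceptions import InstructorRetryException

try:
    user = client.chat.completions.create(
        response_model=User,
        messages=[{"role": "user", "content": "..."}],
        # A tenacity Retrying object instead of a plain int: stop after
        # 3 attempts and wait one second between them.
        max_retries=Retrying(stop=stop_after_attempt(3), wait=wait_fixed(1)),
    )
except InstructorRetryException as e:
    # All attempts failed validation; inspect the last error and give up.
    print(f"Extraction failed after retries: {e}")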

Streaming support

Stream partial objects as they're generated:

from instructor import Partial

for partial_user in client.chat.completions.create(
    response_model=Partial[User],
    messages=[{"role": "user", "content": "..."}],
    stream=True,
):
    print(partial_user)
    # User(name=None, age=None)
    # User(name="John", age=None)
    # User(name="John", age=25)
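
Streaming also works for sequences of objects. As a hedged sketch assuming the create_iterable helper that instructor exposes alongside create, each fully validated User is yielded as soon as it is complete:

# Extract several users from one prompt, yielding each as it completes.
users = client.chat.completions.create_iterable(
    response_model=User,
    messages=[
        {
            "role": "user",
            "content": "John is 25, Mary is 30, and Bob is 42.",
        }
    ],
)

for user in users:
    print(user)
    # User(name='John', age=25)
    # User(name='Mary', age=30)
    # User(name='Bob', age=42)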

Nested objects

Extract complex, nested data structures:

from typing import List


class Address(BaseModel):
    street: str
    city: str
    country: str


class User(BaseModel):
    name: str
    age: int
    addresses: List[Address]


# Instructor handles nested objects automatically
user = client.chat.completions.create(
    response_model=User,
    messages=[{"role": "user", "content": "..."}],
)
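
A concrete run with the models above, as an illustrative sketch (the exact values depend on what the model returns):

user = client.chat.completions.create(
    response_model=User,
    messages=[
        {
            "role": "user",
            "content": (
                "Jane Doe, 31, lives at 12 Main St, Springfield, USA "
                "and 4 Rue de Rivoli, Paris, France."
            ),
        }
    ],
)

# Nested models are regular Pydantic objects, so attribute access just works.
print(user.name)                  # 'Jane Doe'
print(user.addresses[0].city)     # 'Springfield'
print(user.addresses[1].country)  # 'France'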

Used in production by

Trusted by over 100,000 developers and companies building AI applications:

  • 3M+ monthly downloads
  • 10K+ GitHub stars
  • 1000+ community contributors

Companies using Instructor include teams at OpenAI, Google, Microsoft, AWS, and many YC startups.

Get started

Basic extraction

Extract structured data from any text:

from pydantic import BaseModel
import instructor

client = instructor.from_provider("openai/gpt-4o-mini")


class Product(BaseModel):
    name: str
    price: float
    in_stock: bool


product = client.chat.completions.create(
    response_model=Product,
    messages=[{"role": "user", "content": "iPhone 15 Pro, $999, available now"}],
)

print(product)
# Product(name='iPhone 15 Pro', price=999.0, in_stock=True)
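
Because the response model is plain Pydantic, you can steer extraction with standard Field descriptions and constraints; the descriptions become part of the schema the model sees. This sketch assumes nothing beyond Pydantic and the create call shown above:

from pydantic import BaseModel, Field


class Product(BaseModel):
    name: str = Field(description="Product name without marketing copy")
    price: float = Field(ge=0, description="Price in USD")
    in_stock: bool = Field(description="Whether the item is currently available")


product = client.chat.completions.create(
    response_model=Product,
    messages=[{"role": "user", "content": "iPhone 15 Pro, $999, available now"}],
)

print(product)
# Product(name='iPhone 15 Pro', price=999.0, in_stock=True)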

Multiple languages

Instructor's simple API is available in many languages:

  • Python - The original
  • TypeScript - Full TypeScript support
  • Ruby - Ruby implementation
  • Go - Go implementation
  • Elixir - Elixir implementation
  • Rust - Rust implementation

Learn more

Why use Instructor over alternatives?

vs Raw JSON mode: Instructor provides automatic validation, retries, streaming, and nested object support. No manual schema writing.

vs LangChain/LlamaIndex: Instructor focuses on one thing: structured extraction. It's lighter, faster, and easier to debug.

vs Custom solutions: Battle-tested by thousands of developers. Handles edge cases you haven't thought of yet.

Contributing

We welcome contributions! Check out our good first issues to get started.

License

MIT License - see LICENSE for details.


Built by the Instructor community. Special thanks to Jason Liu and all contributors.

Release history

1.11.3 Sep 09, 2025
1.11.2 Aug 27, 2025
1.11.0 Aug 27, 2025
1.10.0 Jul 18, 2025
1.9.2 Jul 07, 2025
1.9.1 Jul 07, 2025
1.9.0 Jun 21, 2025
1.8.3 May 22, 2025
1.8.2 May 15, 2025
1.8.1 May 09, 2025
1.8.0 May 07, 2025
1.7.9 Apr 03, 2025
1.7.8 Mar 29, 2025
1.7.7 Mar 17, 2025
1.7.6 Mar 17, 2025
1.7.5 Mar 16, 2025
1.7.4 Mar 12, 2025
1.7.3 Mar 06, 2025
1.7.2 Dec 26, 2024
1.7.1 Dec 25, 2024
1.7.0 Nov 27, 2024
1.6.4 Nov 14, 2024
1.6.3 Oct 21, 2024
1.6.2 Oct 17, 2024
1.6.1 Oct 17, 2024
1.6.0 Oct 17, 2024
1.5.2 Oct 08, 2024
1.5.1 Oct 04, 2024
1.5.0 Sep 30, 2024
1.4.3 Sep 19, 2024
1.4.2 Sep 14, 2024
1.4.1 Sep 06, 2024
1.4.0 Aug 22, 2024
1.3.7 Jul 24, 2024
1.3.6 Jul 23, 2024
1.3.5 Jul 17, 2024
1.3.4 Jun 25, 2024
1.3.3 Jun 11, 2024
1.3.2 May 27, 2024
1.3.1 May 23, 2024
1.3.0 May 23, 2024
1.2.6 May 09, 2024
1.2.5 May 01, 2024
1.2.4 Apr 29, 2024
1.2.3 Apr 27, 2024
1.2.2 Apr 20, 2024
1.2.1 Apr 18, 2024
1.2.0 Apr 14, 2024
1.1.0 Apr 11, 2024
1.0.3 Apr 05, 2024
1.0.2 Apr 05, 2024
1.0.0 Apr 01, 2024
0.6.8 Mar 29, 2024
0.6.7 Mar 21, 2024
0.6.6 Mar 21, 2024
0.6.5 Mar 20, 2024
0.6.4 Mar 08, 2024
0.6.3 Mar 06, 2024
0.6.2 Mar 01, 2024
0.6.1 Feb 20, 2024
0.6.0 Feb 18, 2024
0.5.2 Feb 07, 2024
0.5.0 Feb 04, 2024
0.4.8 Jan 23, 2024
0.4.7 Jan 14, 2024
0.4.6 Jan 05, 2024
0.4.5 Dec 19, 2023
0.4.4 Dec 17, 2023
0.4.3 Dec 17, 2023
0.4.2 Dec 06, 2023
0.4.0 Nov 27, 2023
0.3.5 Nov 19, 2023
0.3.4 Nov 13, 2023
0.3.3 Nov 13, 2023
0.3.2 Nov 11, 2023
0.3.1 Nov 09, 2023
0.3.0 Nov 08, 2023
0.2.11 Nov 06, 2023
0.2.9 Oct 22, 2023
0.2.8 Sep 19, 2023
0.2.7 Sep 08, 2023
0.2.6 Sep 06, 2023
0.2.5 Aug 24, 2023
0.2.4 Aug 17, 2023
0.2.1 Jul 28, 2023

Wheel compatibility matrix

Platform: any (Python 3)

Dependencies:
aiohttp (<4.0.0,>=3.9.1)
diskcache (>=5.6.3)
docstring-parser (<1.0,>=0.16)
jinja2 (<4.0.0,>=3.1.4)
jiter (<0.11,>=0.6.1)
openai (<2.0.0,>=1.70.0)
pydantic-core (<3.0.0,>=2.18.0)
pydantic (<3.0.0,>=2.8.0)
requests (<3.0.0,>=2.32.3)
rich (<15.0.0,>=13.7.0)
tenacity (<10.0.0,>=8.2.3)
typer (<1.0.0,>=0.9.0)