An event-driven, async-first, step-based way to control the execution flow of AI applications like Agents.
Requires Python: >=3.10
LlamaIndex Workflows
LlamaIndex Workflows is a framework for orchestrating and chaining together complex systems of steps and events.
What can you build with Workflows?
Workflows shine when you need to orchestrate complex, multi-step processes that involve AI models, APIs, and decision-making. Here are some examples of what you can build:
- AI Agents - Create intelligent systems that can reason, make decisions, and take actions across multiple steps
- Document Processing Pipelines - Build systems that ingest, analyze, summarize, and route documents through various processing stages
- Multi-Model AI Applications - Coordinate between different AI models (LLMs, vision models, etc.) to solve complex tasks
- Research Assistants - Develop workflows that can search, analyze, synthesize information, and provide comprehensive answers
- Content Generation Systems - Create pipelines that generate, review, edit, and publish content with human-in-the-loop approval
- Customer Support Automation - Build intelligent routing systems that can understand, categorize, and respond to customer inquiries
The async-first, event-driven architecture makes it easy to build workflows that can route between different capabilities, implement parallel processing patterns, loop over complex sequences, and maintain state across multiple steps - all the features you need to make your AI applications production-ready.
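The parallel-processing pattern mentioned above can be sketched in plain asyncio (this is an illustration of the pattern, not the Workflows API; `branch` and `fan_out_fan_in` are hypothetical names):

```python
# Fan-out / fan-in: one step dispatches work to several concurrent
# branches, and a collector joins the results in order.
import asyncio

async def branch(item: int) -> int:
    await asyncio.sleep(0)  # stand-in for an API or model call
    return item * item

async def fan_out_fan_in(items: list[int]) -> list[int]:
    # Run all branches concurrently, then collect the results in order.
    return list(await asyncio.gather(*(branch(i) for i in items)))

print(asyncio.run(fan_out_fan_in([1, 2, 3])))  # [1, 4, 9]
```

In the real framework, the same shape is expressed by having one step emit several events and a later step collect them.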
Key Features
- async-first - workflows are built around Python's async functionality: steps are async functions that process incoming events from an asyncio queue and emit new events to other queues. This also means workflows work best in async apps like FastAPI and Jupyter notebooks.
- event-driven - workflows consist of steps and events. Organizing your code around events and steps makes it easier to reason about and test.
- state management - each run of a workflow is self-contained, meaning you can launch a workflow, save information within it, serialize its state, and resume it later.
- observability - workflows are automatically instrumented for observability, meaning you can use tools like Arize Phoenix and OpenTelemetry right out of the box.
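The queue-based step model described above can be sketched in plain asyncio (a minimal illustration, not the Workflows API; `upper_step` and `run_pipeline` are hypothetical names):

```python
# Each "step" is an async worker that consumes events from an inbox
# queue and emits new events to the next step's queue.
import asyncio

async def upper_step(inbox: asyncio.Queue, outbox: asyncio.Queue) -> None:
    # Process incoming string events until a None sentinel arrives.
    while (event := await inbox.get()) is not None:
        await outbox.put(event.upper())
    await outbox.put(None)  # propagate shutdown downstream

async def run_pipeline() -> list[str]:
    q_in: asyncio.Queue = asyncio.Queue()
    q_out: asyncio.Queue = asyncio.Queue()
    worker = asyncio.create_task(upper_step(q_in, q_out))

    for msg in ["hello", "world", None]:
        await q_in.put(msg)

    results = []
    while (event := await q_out.get()) is not None:
        results.append(event)
    await worker
    return results

print(asyncio.run(run_pipeline()))  # ['HELLO', 'WORLD']
```

The framework hides this plumbing: you only write the step functions, and the event routing between queues is derived from their type annotations.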
Quick Start
Install the package:
```
pip install llama-index-workflows
```
And create your first workflow:
```python
import asyncio

from pydantic import BaseModel, Field

from workflows import Context, Workflow, step
from workflows.events import Event, StartEvent, StopEvent


class MyEvent(Event):
    msg: list[str]


class RunState(BaseModel):
    num_runs: int = Field(default=0)


class MyWorkflow(Workflow):
    @step
    async def start(self, ctx: Context[RunState], ev: StartEvent) -> MyEvent:
        async with ctx.store.edit_state() as state:
            state.num_runs += 1

        return MyEvent(msg=[ev.input_msg] * state.num_runs)

    @step
    async def process(self, ctx: Context[RunState], ev: MyEvent) -> StopEvent:
        data_length = len("".join(ev.msg))
        new_msg = f"Processed {len(ev.msg)} times, data length: {data_length}"
        return StopEvent(result=new_msg)


async def main():
    workflow = MyWorkflow()

    # [optional] provide a context object to the workflow
    ctx = Context(workflow)
    result = await workflow.run(input_msg="Hello, world!", ctx=ctx)
    print("Workflow result:", result)

    # re-running with the same context will retain the state
    result = await workflow.run(input_msg="Hello, world!", ctx=ctx)
    print("Workflow result:", result)


if __name__ == "__main__":
    asyncio.run(main())
```
In the example above:
- Steps that accept a `StartEvent` will be run first.
- Steps that return a `StopEvent` will end the workflow.
- Intermediate events are user-defined and can be used to pass information between steps.
- The `Context` object is also used to share information between steps.
Visit the complete documentation for more examples using llama-index!
Dependencies:
- llama-index-instrumentation (>=0.4.3)
- pydantic (>=2.11.5)
- typing-extensions (>=4.6.0)