HTTP client for connecting to and interacting with LlamaIndex workflow servers
Requires Python: >=3.10
LlamaAgents Client
Async HTTP client for interacting with deployed llama-agents-server instances.
Installation
```
pip install llama-agents-client
```
Quick Start
```python
import asyncio

from llama_agents.client import WorkflowClient


async def main():
    client = WorkflowClient(base_url="http://localhost:8080")

    # Run a workflow asynchronously
    handler = await client.run_workflow_nowait("my_workflow")

    # Stream events as they are produced
    async for event in client.get_workflow_events(handler.handler_id):
        print(f"Event: {event.type} -> {event.value}")

    # Get the final result
    result = await client.get_handler(handler.handler_id)
    print(f"Result: {result.result} (status: {result.status})")


asyncio.run(main())
```
Features
- Run workflows synchronously or asynchronously
- Stream events in real-time as a workflow executes
- Human-in-the-loop support via `send_event` for injecting events into running workflows
- Bring your own `httpx.AsyncClient` for custom auth, headers, or transport
Documentation
See the full deployment guide for detailed usage and API reference.
Release History
- 0.3.1 (Mar 20, 2026)
- 0.3.0 (Mar 17, 2026)
- 0.2.3 (Mar 12, 2026)
- 0.2.2 (Mar 11, 2026)
- 0.2.1 (Mar 07, 2026)
- 0.2.0 (Feb 28, 2026)
- 0.2.0rc1 (Feb 24, 2026)
- 0.2.0rc0 (Feb 12, 2026)
- 0.1.3 (Feb 13, 2026)
- 0.1.2 (Feb 06, 2026)
- 0.1.1 (Feb 05, 2026)