llama-parse 0.6.94

pip install llama-parse

Released: Feb 13, 2026
Author: Logan Markewich
Requires Python: >=3.9,<4.0
LlamaParse

⚠️ DEPRECATION NOTICE

This repository and its packages are deprecated and will be maintained until May 1, 2026.

Please migrate to the new packages:

  • Python: pip install llama-cloud>=1.0 (GitHub)
  • TypeScript: npm install @llamaindex/llama-cloud (GitHub)

The new packages provide the same functionality with improved performance, better support, and active development.


LlamaParse is a GenAI-native document parser that can parse complex document data for any downstream LLM use case (RAG, agents).

It is really good at the following:

  • Broad file type support: Parsing a variety of unstructured file types (.pdf, .pptx, .docx, .xlsx, .html) with text, tables, visual elements, weird layouts, and more.
  • Table recognition: Parsing embedded tables accurately into text and semi-structured representations.
  • Multimodal parsing and chunking: Extracting visual elements (images/diagrams) into structured formats and returning image chunks using the latest multimodal models.
  • Custom parsing: Provide custom prompt instructions to tailor the output to your needs.

LlamaParse directly integrates with LlamaIndex.

The free plan allows up to 1,000 pages per day. The paid plan includes 7,000 free pages per week by default, plus 0.3¢ ($0.003) per additional page. A sandbox is available for testing the API at https://cloud.llamaindex.ai/parse ↗.
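As a quick sanity check on the default paid-plan pricing above, here is a minimal cost estimator (the rates come from this README; `weekly_parse_cost` is an illustrative helper, not part of the package):

```python
def weekly_parse_cost(pages: int, free_pages: int = 7000, cents_per_page: float = 0.3) -> float:
    """Estimated weekly cost in USD on the default paid plan."""
    billable = max(0, pages - free_pages)
    return billable * cents_per_page / 100.0

print(weekly_parse_cost(7000))   # 0.0 -- within the free weekly quota
print(weekly_parse_cost(10000))  # 9.0 -- 3,000 extra pages at 0.3 cents each
```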

Read below for some quickstart information, or see the full documentation.

If you're a company interested in enterprise RAG solutions, and/or high volume/on-prem usage of LlamaParse, come talk to us.

Getting Started

First, log in and get an API key from https://cloud.llamaindex.ai/api-key ↗.

Then, make sure you have the latest LlamaIndex version installed.

NOTE: If you are upgrading from v0.9.X, we recommend following our migration guide, as well as uninstalling your previous version first.

pip uninstall llama-index  # run this if upgrading from v0.9.x or older
pip install -U llama-index --no-cache-dir --force-reinstall

Lastly, install the package:

pip install llama-parse

Now you can parse your first PDF file using the command line interface. Use the command llama-parse [file_paths]. See the help text with llama-parse --help.

export LLAMA_CLOUD_API_KEY='llx-...'

# output as text
llama-parse my_file.pdf --result-type text --output-file output.txt

# output as markdown
llama-parse my_file.pdf --result-type markdown --output-file output.md

# output as raw json
llama-parse my_file.pdf --output-raw-json --output-file output.json

You can also create simple scripts:

import nest_asyncio

nest_asyncio.apply()

from llama_parse import LlamaParse

parser = LlamaParse(
    api_key="llx-...",  # can also be set in your env as LLAMA_CLOUD_API_KEY
    result_type="markdown",  # "markdown" and "text" are available
    num_workers=4,  # if multiple files are passed, split into `num_workers` API calls
    verbose=True,
    language="en",  # optionally set a language; default is "en"
)

# sync
documents = parser.load_data("./my_file.pdf")

# sync batch
documents = parser.load_data(["./my_file1.pdf", "./my_file2.pdf"])

# async
documents = await parser.aload_data("./my_file.pdf")

# async batch
documents = await parser.aload_data(["./my_file1.pdf", "./my_file2.pdf"])
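Under the hood, `num_workers` controls how many concurrent API calls a batch is split into. That pattern can be sketched generically with a semaphore (no network here; `parse_one` is a hypothetical stand-in for a single parse call, not a package API):

```python
import asyncio


async def parse_one(path: str) -> str:
    # hypothetical stand-in for one parse API call
    await asyncio.sleep(0)
    return f"parsed:{path}"


async def parse_many(paths: list[str], num_workers: int = 4) -> list[str]:
    # cap concurrency at `num_workers`, preserving input order in the results
    sem = asyncio.Semaphore(num_workers)

    async def bounded(path: str) -> str:
        async with sem:
            return await parse_one(path)

    return await asyncio.gather(*(bounded(p) for p in paths))


results = asyncio.run(parse_many(["a.pdf", "b.pdf", "c.pdf"], num_workers=2))
print(results)  # ['parsed:a.pdf', 'parsed:b.pdf', 'parsed:c.pdf']
```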

Using with a file object

You can parse a file object directly:

import nest_asyncio

nest_asyncio.apply()

from llama_parse import LlamaParse

parser = LlamaParse(
    api_key="llx-...",  # can also be set in your env as LLAMA_CLOUD_API_KEY
    result_type="markdown",  # "markdown" and "text" are available
    num_workers=4,  # if multiple files are passed, split into `num_workers` API calls
    verbose=True,
    language="en",  # optionally set a language; default is "en"
)

file_name = "my_file1.pdf"
extra_info = {"file_name": file_name}

with open(f"./{file_name}", "rb") as f:
    # must provide extra_info with a file_name key when passing a file object
    documents = parser.load_data(f, extra_info=extra_info)

# you can also pass file bytes directly
with open(f"./{file_name}", "rb") as f:
    file_bytes = f.read()
    # must provide extra_info with a file_name key when passing file bytes
    documents = parser.load_data(file_bytes, extra_info=extra_info)
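Since the service can't infer a filename from a file object or raw bytes, `extra_info` must carry it. A small helper for building it from a path (illustrative only, not part of the package):

```python
from pathlib import Path


def make_extra_info(path: str) -> dict:
    # the file_name key lets the service infer the file type when only
    # a file object or raw bytes are passed instead of a path
    return {"file_name": Path(path).name}


print(make_extra_info("./docs/my_file1.pdf"))  # {'file_name': 'my_file1.pdf'}
```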

Using with SimpleDirectoryReader

You can also integrate the parser as the default PDF loader in SimpleDirectoryReader:

import nest_asyncio

nest_asyncio.apply()

from llama_parse import LlamaParse
from llama_index.core import SimpleDirectoryReader

parser = LlamaParse(
    api_key="llx-...",  # can also be set in your env as LLAMA_CLOUD_API_KEY
    result_type="markdown",  # "markdown" and "text" are available
    verbose=True,
)

file_extractor = {".pdf": parser}
documents = SimpleDirectoryReader(
    "./data", file_extractor=file_extractor
).load_data()

Full documentation for SimpleDirectoryReader can be found on the LlamaIndex Documentation.
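`file_extractor` maps file extensions to readers, so one parser instance can cover every file type LlamaParse supports. A sketch of that mapping (a dummy object stands in for the `LlamaParse` instance so the snippet runs offline):

```python
class DummyParser:
    """Stand-in for a LlamaParse instance; illustration only."""


parser = DummyParser()

# route every supported extension to the same parser instance
extensions = [".pdf", ".pptx", ".docx", ".xlsx", ".html"]
file_extractor = {ext: parser for ext in extensions}

print(sorted(file_extractor))  # ['.docx', '.html', '.pdf', '.pptx', '.xlsx']
```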

Examples

Several end-to-end indexing examples can be found in the examples folder.

Documentation

https://docs.cloud.llamaindex.ai/

Terms of Service

See the Terms of Service Here.

Get in Touch (LlamaCloud)

LlamaParse is part of LlamaCloud, our e2e enterprise RAG platform that provides out-of-the-box, production-ready connectors, indexing, and retrieval over your complex data sources. We offer SaaS and VPC options.

LlamaCloud is currently available via waitlist (join by creating an account). If you're interested in state-of-the-art quality and in centralizing your RAG efforts, come get in touch with us.


Dependencies: llama-cloud-services (>=0.6.94)