openinference-instrumentation-llama-index

Name: openinference-instrumentation-llama-index
Version: 4.3.4
Summary: OpenInference LlamaIndex Instrumentation
Author email: OpenInference Authors <oss@arize.com>
Homepage: https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-llama-index
Upload time: 2025-08-01 22:33:33
Requires Python: >=3.9, <3.14
Requirements: openai, python-dotenv, grpcio, opentelemetry-api, opentelemetry-sdk, opentelemetry-instrumentation-openai, opentelemetry-instrumentation-langchain, opentelemetry-exporter-otlp-proto-grpc, langchain, langgraph, typing-extensions, phoenix-client, openinference-instrumentation-openllmetry
# OpenInference LlamaIndex Instrumentation
Python auto-instrumentation library for LlamaIndex.

These traces are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector for viewing, such as [`arize-phoenix`](https://github.com/Arize-ai/phoenix).

[![pypi](https://badge.fury.io/py/openinference-instrumentation-llama-index.svg)](https://pypi.org/project/openinference-instrumentation-llama-index/)

## Installation

```shell
pip install openinference-instrumentation-llama-index
```

## Compatibility

| llama-index version | openinference-instrumentation-llama-index version |
|---------------------|---------------------------------------------------|
| \>=0.12.3           | \>=4.0                                            |
| \>=0.11.0           | \>=3.0                                            |
| \>=0.10.43          | \>=2.0, <3.0                                      |
| \>=0.10.0, <0.10.43 | \>=1.0, <2.0                                      |
| \>=0.9.14, <0.10.0  | 0.1.3                                             |

## Quickstart

Install packages needed for this demonstration.

```shell
python -m pip install --upgrade \
    openinference-instrumentation-llama-index \
    opentelemetry-sdk \
    opentelemetry-exporter-otlp \
    "opentelemetry-proto>=1.12.0" \
    arize-phoenix
```

Start the Phoenix app in the background as a collector. By default, it listens on `http://localhost:6006`. You can visit the app via a browser at the same address.

The Phoenix app does not send data over the internet. It only operates locally on your machine.

```shell
python -m phoenix.server.main serve
```

The following Python code sets up the `LlamaIndexInstrumentor` to trace `llama-index` and send the traces to Phoenix at the endpoint shown below.

```python
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)
```

To demonstrate tracing, we'll use LlamaIndex to query a document.

First, download a text file.

```python
import tempfile
from urllib.request import urlretrieve
from llama_index.core import SimpleDirectoryReader

url = "https://raw.githubusercontent.com/Arize-ai/phoenix-assets/main/data/paul_graham/paul_graham_essay.txt"
with tempfile.NamedTemporaryFile() as tf:
    urlretrieve(url, tf.name)
    documents = SimpleDirectoryReader(input_files=[tf.name]).load_data()
```

Next, we'll query using OpenAI. To do that, set your OpenAI API key in an environment variable.

```python
import os

os.environ["OPENAI_API_KEY"] = "<your openai key>"
```
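Hard-coding the key works for a demo, but you can also keep it out of the source. As a small stdlib-only sketch (the `ensure_openai_key` helper is hypothetical, not part of this package):

```python
import os
from getpass import getpass

def ensure_openai_key() -> str:
    # Reuse an existing OPENAI_API_KEY if set; otherwise prompt without
    # echoing the key to the terminal, then export it for llama-index.
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        key = getpass("OpenAI API key: ")
        os.environ["OPENAI_API_KEY"] = key
    return key
```

Since `python-dotenv` is already among the requirements, loading the key from a local `.env` file is another common option.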

Now we can query the indexed documents.

```python
from llama_index.core import VectorStoreIndex

query_engine = VectorStoreIndex.from_documents(documents).as_query_engine()
print(query_engine.query("What did the author do growing up?"))
```

Visit the Phoenix app at `http://localhost:6006` to see the traces.

## More Info

* [More info on OpenInference and Phoenix](https://docs.arize.com/phoenix)
* [How to customize spans to track sessions, metadata, etc.](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#customizing-spans)
* [How to account for private information and span payload customization](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#tracing-configuration)

            
