openinference-instrumentation-llama-index

Name: openinference-instrumentation-llama-index
Version: 3.3.1
Summary: OpenInference LlamaIndex Instrumentation
Author email: OpenInference Authors <oss@arize.com>
Homepage: https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-llama-index
Requires Python: <3.14,>=3.9
Upload time: 2025-02-19 21:34:47
# OpenInference LlamaIndex Instrumentation
Python auto-instrumentation library for LlamaIndex.

These traces are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector for viewing, such as [`arize-phoenix`](https://github.com/Arize-ai/phoenix).

[![pypi](https://badge.fury.io/py/openinference-instrumentation-llama-index.svg)](https://pypi.org/project/openinference-instrumentation-llama-index/)

## Installation

```shell
pip install openinference-instrumentation-llama-index
```

## Compatibility

| llama-index version | openinference-instrumentation-llama-index version |
|---------------------|---------------------------------------------------|
| \>=0.11.0           | \>=3.0                                            |
| \>=0.10.43          | \>=2.0, <3.0                                      |
| \>=0.10.0, <0.10.43 | \>=1.0, <2.0                                      |
| \>=0.9.14, <0.10.0  | 0.1.3                                             |
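
If you need to stay on an older `llama-index` release, pin both packages to a matching row of the table. For example, for the `0.10.43` line:

```shell
pip install "llama-index>=0.10.43,<0.11.0" \
    "openinference-instrumentation-llama-index>=2.0,<3.0"
```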

## Quickstart

Install packages needed for this demonstration.

```shell
python -m pip install --upgrade \
    openinference-instrumentation-llama-index \
    opentelemetry-sdk \
    opentelemetry-exporter-otlp \
    "opentelemetry-proto>=1.12.0" \
    arize-phoenix
```

Start the Phoenix app in the background as a collector. By default, it listens on `http://localhost:6006`, and you can open the app in a browser at the same address.

Phoenix does not send data over the internet; it runs entirely on your local machine.

```shell
python -m phoenix.server.main serve
```
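
Alternatively, if you are working in a notebook, Phoenix can be launched in-process. A minimal sketch using the `arize-phoenix` Python API (the exact API may vary across Phoenix versions):

```python
import phoenix as px

# Start a local Phoenix session in the current process; the session
# object exposes the URL where the app UI is served.
session = px.launch_app()
print(session.url)
```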

The following Python code sets up the `LlamaIndexInstrumentor` to trace `llama-index` and send the traces to Phoenix at the endpoint shown below.

```python
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)
```
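
`SimpleSpanProcessor` exports each span synchronously, which is convenient for demos but adds latency on every span. For longer-running applications, a batching processor is the usual choice; a minimal sketch:

```python
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Buffer spans and export them in the background on a schedule,
# trading a small delivery delay for lower per-span overhead.
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint)))
```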

To demonstrate tracing, we'll use LlamaIndex to query a document.

First, download a text file.

```python
import tempfile
from urllib.request import urlretrieve
from llama_index.core import SimpleDirectoryReader

url = "https://raw.githubusercontent.com/Arize-ai/phoenix-assets/main/data/paul_graham/paul_graham_essay.txt"
with tempfile.NamedTemporaryFile() as tf:
    urlretrieve(url, tf.name)
    documents = SimpleDirectoryReader(input_files=[tf.name]).load_data()
```
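
Note that on Windows a `NamedTemporaryFile` cannot be reopened by name while it is still open, so the snippet above fails there. If that applies to you, a temporary directory works the same way (a sketch; the filename is arbitrary):

```python
import os
import tempfile
from urllib.request import urlretrieve
from llama_index.core import SimpleDirectoryReader

with tempfile.TemporaryDirectory() as tmp_dir:
    # Write the essay to a regular file inside the temp directory,
    # then load it exactly as above.
    path = os.path.join(tmp_dir, "paul_graham_essay.txt")
    urlretrieve(url, path)
    documents = SimpleDirectoryReader(input_files=[path]).load_data()
```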

Next, we'll query using OpenAI. To do that, set your OpenAI API key in an environment variable.

```python
import os

os.environ["OPENAI_API_KEY"] = "<your openai key>"
```
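
To avoid hardcoding the key in source files, you can prompt for it instead. A sketch for interactive sessions:

```python
import os
from getpass import getpass

# Prompt only if the key is not already set in the environment.
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")
```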

Now we can query the indexed documents.

```python
from llama_index.core import VectorStoreIndex

query_engine = VectorStoreIndex.from_documents(documents).as_query_engine()
print(query_engine.query("What did the author do growing up?"))
```

Visit the Phoenix app at `http://localhost:6006` to see the traces.
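
When you are done, you can optionally flush buffered spans and detach the instrumentation. A sketch (with `SimpleSpanProcessor` spans are exported immediately, so the flush mainly matters if you switched to a batching processor):

```python
# Export any spans still held by the span processors, then remove the
# LlamaIndex instrumentation hooks.
tracer_provider.force_flush()
LlamaIndexInstrumentor().uninstrument()
```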

## More Info

* [More info on OpenInference and Phoenix](https://docs.arize.com/phoenix)
* [How to customize spans to track sessions, metadata, etc.](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#customizing-spans)
* [How to account for private information and span payload customization](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#tracing-configuration)

            
