| Field | Value |
|-------|-------|
| Name | openinference-instrumentation-llama-index |
| Version | 3.1.1 |
| Summary | OpenInference LlamaIndex Instrumentation |
| Author email | OpenInference Authors <oss@arize.com> |
| Upload time | 2024-12-17 21:16:14 |
| Requires Python | <3.13, >=3.8 |
| Homepage | https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-llama-index |
# OpenInference LlamaIndex Instrumentation
Python auto-instrumentation library for LlamaIndex.
The traces it emits are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector for viewing, such as [`arize-phoenix`](https://github.com/Arize-ai/phoenix).
[![pypi](https://badge.fury.io/py/openinference-instrumentation-llama-index.svg)](https://pypi.org/project/openinference-instrumentation-llama-index/)
## Installation
```shell
pip install openinference-instrumentation-llama-index
```
## Compatibility
| llama-index version | openinference-instrumentation-llama-index version |
|---------------------|---------------------------------------------------|
| \>=0.11.0 | \>=3.0 |
| \>=0.10.43 | \>=2.0, <3.0 |
| \>=0.10.0, <0.10.43 | \>=1.0, <2.0 |
| \>=0.9.14, <0.10.0 | 0.1.3 |
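If you're unsure which versions you have installed, you can check with the standard library's `importlib.metadata` (available on Python 3.8+). This snippet is illustrative only and not part of the package:
```python
# Print the installed versions to compare against the table above.
from importlib.metadata import version

print("llama-index:", version("llama-index"))
print(
    "openinference-instrumentation-llama-index:",
    version("openinference-instrumentation-llama-index"),
)
```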
## Quickstart
Install the packages needed for this demonstration.
```shell
python -m pip install --upgrade \
  openinference-instrumentation-llama-index \
  opentelemetry-sdk \
  opentelemetry-exporter-otlp \
  "opentelemetry-proto>=1.12.0" \
  arize-phoenix
```
Start the Phoenix app in the background as a collector. By default, it listens on `http://localhost:6006`. You can visit the app via a browser at the same address.
The Phoenix app does not send data over the internet; it operates entirely locally on your machine.
```shell
python -m phoenix.server.main serve
```
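If you prefer to stay in Python, `arize-phoenix` can also be launched programmatically; the sketch below assumes the `launch_app` helper exposed by the version of Phoenix installed above:
```python
# Launch the local Phoenix collector and UI from Python instead of the shell.
import phoenix as px

session = px.launch_app()  # defaults to http://localhost:6006
print(session.url)
```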
The following Python code sets up the `LlamaIndexInstrumentor` to trace `llama-index` and send the traces to Phoenix at the endpoint shown below.
```python
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)
```
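The `SimpleSpanProcessor` above exports each span synchronously as it finishes, which is ideal for a demo. For longer-running applications you could instead use the OpenTelemetry SDK's `BatchSpanProcessor`, which queues spans and exports them in the background (a sketch; use it in place of, not in addition to, the processor above):
```python
# Alternative: export spans in background batches to reduce per-span overhead.
from opentelemetry.sdk.trace.export import BatchSpanProcessor

tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint)))
```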
To demonstrate tracing, we'll use LlamaIndex below to query a document.
First, download a text file.
```python
import tempfile
from urllib.request import urlretrieve
from llama_index.core import SimpleDirectoryReader
url = "https://raw.githubusercontent.com/Arize-ai/phoenix-assets/main/data/paul_graham/paul_graham_essay.txt"
with tempfile.NamedTemporaryFile() as tf:
    urlretrieve(url, tf.name)
    documents = SimpleDirectoryReader(input_files=[tf.name]).load_data()
```
Next, we'll query using OpenAI. To do that, you need to set your OpenAI API key as an environment variable.
```python
import os
os.environ["OPENAI_API_KEY"] = "<your openai key>"
```
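If you'd rather not hardcode the key in source, one option is to prompt for it with the standard library's `getpass` when it isn't already set (a minimal sketch, not required for the demo):
```python
import os
from getpass import getpass

# Only prompt if the environment doesn't already provide a key.
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")
```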
Now we can query the indexed documents.
```python
from llama_index.core import VectorStoreIndex
query_engine = VectorStoreIndex.from_documents(documents).as_query_engine()
print(query_engine.query("What did the author do growing up?"))
```
Visit the Phoenix app at `http://localhost:6006` to see the traces.
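If a short-lived script exits before its spans are exported, you can flush the tracer provider first; and because the instrumentor follows the standard OpenTelemetry `BaseInstrumentor` interface, instrumentation can be removed again with `uninstrument` (a sketch, assuming that standard interface):
```python
# Flush any pending spans to Phoenix, then remove the instrumentation hooks.
tracer_provider.force_flush()
LlamaIndexInstrumentor().uninstrument()
```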
## More Info
* [More info on OpenInference and Phoenix](https://docs.arize.com/phoenix)
* [How to customize spans to track sessions, metadata, etc.](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#customizing-spans)
* [How to account for private information and span payload customization](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#tracing-configuration)