llama-index-packs-arize-phoenix-query-engine

- Name: llama-index-packs-arize-phoenix-query-engine
- Version: 0.3.0
- Summary: llama-index packs arize_phoenix_query_engine integration
- Upload time: 2024-11-18 02:31:54
- Maintainer: axiomofjoy
- Author: Your Name
- Requires Python: <3.12,>=3.9
- License: MIT
- Keywords: arize, engine, index, phoenix, query
- Requirements: No requirements were recorded.
            # Arize-Phoenix LlamaPack

This LlamaPack instruments your LlamaIndex app for LLM tracing with [Phoenix](https://github.com/Arize-ai/phoenix), an open-source LLM observability library from [Arize AI](https://phoenix.arize.com/).

## CLI Usage

You can download LlamaPacks directly using `llamaindex-cli`, which is installed with the `llama-index` Python package:

```bash
llamaindex-cli download-llamapack ArizePhoenixQueryEnginePack --download-dir ./arize_pack
```

You can then inspect the files at `./arize_pack` and use them as a template for your own project!

## Code Usage

You can download the pack to the `./arize_pack` directory:

```python
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
ArizePhoenixQueryEnginePack = download_llama_pack(
    "ArizePhoenixQueryEnginePack", "./arize_pack"
)
```

You can then inspect the files at `./arize_pack` or continue on to use the module.

```python
import os

from llama_index.core.node_parser import SentenceSplitter
from llama_index.readers.web import SimpleWebPageReader
from tqdm.auto import tqdm
```

Configure your OpenAI API key.

```python
os.environ["OPENAI_API_KEY"] = "copy-your-openai-api-key-here"
```
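Rather than hardcoding the key into your script, you may prefer to read it from the environment and fail early if it is missing. A minimal sketch (the `require_api_key` helper is hypothetical, not part of the pack):

```python
import os


def require_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Return the API key from the environment, raising if it is unset."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set the {var} environment variable before running.")
    return key
```

Calling `require_api_key()` before constructing the pack surfaces a clear error instead of a failed API call later on.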

Parse your documents into a list of nodes and pass them to your LlamaPack. This example uses nodes from a Paul Graham essay as input.

```python
documents = SimpleWebPageReader().load_data(
    [
        "https://raw.githubusercontent.com/jerryjliu/llama_index/adb054429f642cc7bbfcb66d4c232e072325eeab/examples/paul_graham_essay/data/paul_graham_essay.txt"
    ]
)
parser = SentenceSplitter()
nodes = parser.get_nodes_from_documents(documents)
phoenix_pack = ArizePhoenixQueryEnginePack(nodes=nodes)
```

Run a set of queries via the pack's `run` method, which delegates to the underlying query engine.

```python
queries = [
    "What did Paul Graham do growing up?",
    "When and how did Paul Graham's mother die?",
    "What, in Paul Graham's opinion, is the most distinctive thing about YC?",
    "When and how did Paul Graham meet Jessica Livingston?",
    "What is Bel, and when and where was it written?",
]
for query in tqdm(queries):
    print("Query")
    print("=====")
    print(query)
    print()
    response = phoenix_pack.run(query)
    print("Response")
    print("========")
    print(response)
    print()
```
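The underlined "Query"/"Response" headings printed in the loop above can be factored into a small formatting helper, which keeps the loop body short. A sketch (the `banner` function is hypothetical, not part of the pack):

```python
def banner(title: str, body: str) -> str:
    """Format a titled section with an '=' underline matching the title width."""
    return f"{title}\n{'=' * len(title)}\n{body}\n"


# In the query loop, the prints then collapse to:
#     print(banner("Query", query))
#     print(banner("Response", str(response)))
```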

View your trace data in the Phoenix UI.

```python
phoenix_session_url = phoenix_pack.get_modules()["session_url"]
print(f"Open the Phoenix UI to view your trace data: {phoenix_session_url}")
```

You can access the internals of the LlamaPack, including your Phoenix session and your query engine, via the `get_modules` method.

```python
phoenix_pack.get_modules()
```

Check out the [Phoenix documentation](https://docs.arize.com/phoenix/) for more information!
