llama-index-packs-trulens-eval-packs


Name: llama-index-packs-trulens-eval-packs
Version: 0.3.0
Home page: None
Summary: llama-index packs trulens_eval_packs integration
Upload time: 2024-11-17 22:42:06
Maintainer: joshreini1
Docs URL: None
Author: Your Name
Requires Python: !=2.7.*,!=3.0.*,!=3.1.*,!=3.12.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,!=3.7.*,!=3.8.*,>=3.9
License: MIT
Keywords: eval, harmless, helpful, rag, triad, trulens
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# TruLens-Eval Llama-Pack

![TruLens](https://www.trulens.org/assets/images/Neural_Network_Explainability.png)

The best way to support TruLens is to give us a ⭐ on [GitHub](https://www.github.com/truera/trulens) and join our [Slack community](https://communityinviter.com/apps/aiqualityforum/josh)!

TruLens provides three Llama Packs for LLM app observability:

- The first is the **TruLensRAGTriadPack** (context relevance, groundedness, answer relevance). This triad holds the key to detecting hallucination.

- The second is the **TruLensHarmlessPack**, which includes moderation and safety evaluations such as criminality, violence, and more.

- The last is the **TruLensHelpfulPack**, which includes evaluations such as conciseness and language match.

No matter which TruLens LlamaPack you choose, all three provide evaluation and tracking for your LlamaIndex app with [TruLens](https://github.com/truera/trulens), an open-source LLM observability library from [TruEra](https://www.truera.com/).

## CLI Usage

You can download Llama Packs directly using `llamaindex-cli`, which is installed with the `llama-index` Python package:

```bash
llamaindex-cli download-llamapack TruLensRAGTriadPack --download-dir ./trulens_pack
```

You can then inspect the files at `./trulens_pack` and use them as a template for your own project.

## Code Usage

You can download each pack to a `./trulens_pack` directory:

```python
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
TruLensRAGTriadPack = download_llama_pack(
    "TruLensRAGTriadPack", "./trulens_pack"
)
```

From here, you can use the pack, or inspect and modify the pack in `./trulens_pack`.

Then, you can set up the pack like so:

```python
import os

os.environ["OPENAI_API_KEY"] = "sk-..."

from tqdm.auto import tqdm
from llama_index.core.node_parser import SentenceSplitter
from llama_index.readers.web import SimpleWebPageReader

documents = SimpleWebPageReader(html_to_text=True).load_data(
    ["http://paulgraham.com/worked.html"]
)

splitter = SentenceSplitter()
nodes = splitter.get_nodes_from_documents(documents)

trulens_ragtriad_pack = TruLensRAGTriadPack(
    nodes=nodes, app_id="Query Engine v1: RAG Triad Evals"
)
```

Then run your queries and evaluate!

```python
queries = [
    "What did Paul Graham do growing up?",
    "When and how did Paul Graham's mother die?",
    "What, in Paul Graham's opinion, is the most distinctive thing about YC?",
    "When and how did Paul Graham meet Jessica Livingston?",
    "What is Bel, and when and where was it written?",
]
for query in tqdm(queries):
    print("Query")
    print("=====")
    print(query)
    print()
    response = trulens_ragtriad_pack.run(query)
    print("Response")
    print("========")
    print(response)
```
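
The Harmless and Helpful packs follow the same pattern. The sketch below reuses the `nodes` built above and assumes that `TruLensHarmlessPack` and `TruLensHelpfulPack` accept the same `nodes` and `app_id` constructor arguments as the RAG Triad pack, and that the Harmless evaluations may additionally need a HuggingFace key for their moderation feedback functions; check the downloaded pack source to confirm.

```python
import os

from llama_index.core.llama_pack import download_llama_pack

# Assumption: the Harmless pack's moderation feedbacks may require a
# HuggingFace key in addition to the OpenAI key set earlier.
os.environ["HUGGINGFACE_API_KEY"] = "hf_..."

# Sketch only: assumes both packs mirror the RAG Triad pack's constructor;
# verify against the code downloaded into each directory.
TruLensHarmlessPack = download_llama_pack(
    "TruLensHarmlessPack", "./trulens_harmless_pack"
)
TruLensHelpfulPack = download_llama_pack(
    "TruLensHelpfulPack", "./trulens_helpful_pack"
)

trulens_harmless_pack = TruLensHarmlessPack(
    nodes=nodes, app_id="Query Engine v1: Harmless Evals"
)
trulens_helpful_pack = TruLensHelpfulPack(
    nodes=nodes, app_id="Query Engine v1: Helpful Evals"
)

# Each pack exposes the same run() interface used above.
harmless_response = trulens_harmless_pack.run("What is Bel?")
```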

You can access the internals of the LlamaPack, including your TruLens session and your query engine, via the `get_modules` method.

```python
modules = trulens_ragtriad_pack.get_modules()
tru = modules["session"]
index = modules["index"]
query_engine = modules["query_engine"]
tru_query_engine = modules["tru_query_engine"]
```
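
For example, the `query_engine` entry is a regular LlamaIndex query engine, so you can call it directly; note that queries issued this way are not wrapped by the TruLens recorder, so use `tru_query_engine` or the pack's `run` method when you want them evaluated.

```python
# The pack wraps a standard LlamaIndex query engine; querying it directly
# bypasses the TruLens-instrumented wrapper, so these calls are not recorded.
response = query_engine.query("What is Bel?")
print(response)
```

Aggregate feedback scores for the app are available from the TruLens session's leaderboard: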

```python
tru.get_leaderboard(app_ids=["Query Engine v1: RAG Triad Evals"])
```
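
If you want record-level detail rather than aggregates, the TruLens session also exposes `get_records_and_feedback`; the sketch below assumes the trulens_eval API where it returns a pandas DataFrame of records plus the list of feedback column names (this may vary across TruLens versions).

```python
# Sketch: inspect individual records and their feedback scores.
# Assumes get_records_and_feedback returns (DataFrame, feedback column names).
records_df, feedback_cols = tru.get_records_and_feedback(
    app_ids=["Query Engine v1: RAG Triad Evals"]
)
print(records_df[["input", "output"] + feedback_cols].head())
```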

## Resources

There is a more complete notebook demo [available in the llama-hub repo](https://github.com/run-llama/llama-hub/blob/main/llama_hub/llama_packs/trulens_eval_packs/trulens_eval_llama_packs.ipynb).

Check out the [TruLens documentation](https://www.trulens.org/trulens_eval/install/) for more information!

            
