llama-index-packs-tables

- Name: llama-index-packs-tables
- Version: 0.3.0
- Summary: llama-index packs tables integration
- Upload time: 2024-11-17 22:43:52
- Maintainer: Disiok
- Author: Your Name
- Requires Python: <4.0,>=3.9
- License: MIT
- Keywords: chain, dataframe, pandas, table, tables
# Tables Packs

## Chain-of-table Pack

This LlamaPack implements the [Chain-of-Table paper by Wang et al.](https://arxiv.org/pdf/2401.04398v1.pdf).

Chain-of-Table proposes the following: given a user query over tabular data, plan out a sequence of tabular operations over the table to retrieve the right information in order to satisfy the user query. The updated table is explicitly used/modified throughout the intermediate chain (unlike chain-of-thought/ReAct which uses generic thoughts).

There is a fixed set of tabular operations that are defined in the paper:

- `f_add_column`
- `f_select_row`
- `f_select_column`
- `f_group_by`
- `f_sort_by`
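The five operations above map naturally onto pandas. Below is a minimal sketch, using toy data and pandas idioms of our own choosing (not code from the pack), of what each operation does to a table:

```python
import pandas as pd

# Toy table of Academy Award winners (illustrative data only).
df = pd.DataFrame(
    {
        "year": [1972, 1972, 1973],
        "category": ["Best Director", "Best Picture", "Best Director"],
        "winner": ["William Friedkin", "The French Connection", "Bob Fosse"],
    }
)

# f_select_row: keep only the rows relevant to the query
rows = df[df["year"] == 1972]

# f_select_column: keep only the columns needed for the answer
cols = rows[["category", "winner"]]

# f_add_column: derive a helper column from existing ones
flagged = cols.assign(is_director=cols["category"] == "Best Director")

# f_group_by + f_sort_by: aggregate, then order the result
counts = df.groupby("year").size().sort_values(ascending=False)
```

In Chain-of-Table, the LLM plans such operations as a chain, feeding each intermediate table into the next step rather than reasoning over the original table in one shot.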

Our implementation follows the prompts described in the paper, adapted as needed to get everything working end to end. That said, this pack is marked as beta, so there may still be kinks to work through. Do you have suggestions or contributions on how to improve the robustness? Let us know!

A full notebook guide can be found [here](https://github.com/run-llama/llama-hub/blob/main/llama_hub/llama_packs/tables/chain_of_table/chain_of_table.ipynb).

### CLI Usage

You can download LlamaPacks directly using `llamaindex-cli`, which is installed with the `llama-index` Python package:

```bash
llamaindex-cli download-llamapack ChainOfTablePack --download-dir ./chain_of_table_pack
```

You can then inspect the files at `./chain_of_table_pack` and use them as a template for your own project!

### Code Usage

We will show you how to import and use the pack from these files!

```python
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
ChainOfTablePack = download_llama_pack(
    "ChainOfTablePack", "./chain_of_table_pack"
)
```

From here, you can use the pack. You can import the relevant modules from the download folder (the example below assumes either a relative import or that the directory has been added to your system path).

```python
from chain_of_table_pack.base import ChainOfTableQueryEngine, serialize_table

query_engine = ChainOfTableQueryEngine(df, llm=llm, verbose=True)
response = query_engine.query(
    "Who won best Director in the 1972 Academy Awards?"
)
```

You can also use/initialize the pack directly.

```python
from chain_of_table_pack.base import ChainOfTablePack

pack = ChainOfTablePack(df, llm=llm, verbose=True)
```

The `run()` function is a light wrapper around `query_engine.query()`.

```python
response = pack.run("Who won best Director in the 1972 Academy Awards?")
```

## Mix-Self-Consistency Pack

This LlamaPack implements the mix self-consistency method proposed in ["Rethinking Tabular Data Understanding with Large Language Models"](https://arxiv.org/pdf/2312.16702v1.pdf) paper by Liu et al.

LLMs can reason over tabular data in two main ways:

1. textual reasoning via direct prompting
2. symbolic reasoning via program synthesis (e.g., Python or SQL)
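To make the two pathways concrete, here is a minimal sketch with a toy table and hand-written stand-ins for the model's output (none of this is the pack's code):

```python
import pandas as pd

df = pd.DataFrame({"city": ["Oslo", "Rome"], "temp_c": [4, 18]})

# 1. Textual reasoning: serialize the table into the prompt and ask the
#    LLM to answer directly (shown here as the prompt we would send).
prompt = f"Given this table:\n{df.to_string(index=False)}\nWhich city is warmer?"

# 2. Symbolic reasoning: the LLM instead emits a program over the table;
#    here we execute a hand-written stand-in for what it might generate.
generated_code = "df.loc[df['temp_c'].idxmax(), 'city']"
answer = eval(generated_code)
```

The textual pathway is robust to messy tables but weak at arithmetic over many rows; the symbolic pathway is exact but fails when the generated program is wrong, which motivates combining them.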

The key insight of the paper is that different reasoning pathways work well on different tasks. By aggregating results from both pathways with a self-consistency mechanism (i.e., majority voting), the method achieves state-of-the-art (SoTA) performance.
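The voting step itself is simple. Here is a minimal sketch (the answer strings and path counts are made up for illustration):

```python
from collections import Counter

# Hypothetical answers from several sampled reasoning paths: three from
# the textual pathway and two from the symbolic pathway.
textual_answers = ["Rome", "Rome", "Oslo"]
symbolic_answers = ["Rome", "Rome"]

# Mix self-consistency: pool the candidates from both pathways, then
# take the majority vote as the final answer.
pooled = textual_answers + symbolic_answers
final_answer, votes = Counter(pooled).most_common(1)[0]
```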

Our implementation follows the prompts described in the paper, adapted as needed to get everything working end to end. That said, this pack is marked as beta, so there may still be kinks to work through. Do you have suggestions or contributions on how to improve the robustness? Let us know!

A full notebook guide can be found [here](https://github.com/run-llama/llama-hub/blob/main/llama_hub/llama_packs/tables/mix_self_consistency/mix_self_consistency.ipynb).

### CLI Usage

You can download LlamaPacks directly using `llamaindex-cli`, which is installed with the `llama-index` Python package:

```bash
llamaindex-cli download-llamapack MixSelfConsistencyPack --download-dir ./mix_self_consistency_pack
```

You can then inspect the files at `./mix_self_consistency_pack` and use them as a template for your own project!

### Code Usage

We will show you how to import the module from these files!

```python
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
MixSelfConsistencyPack = download_llama_pack(
    "MixSelfConsistencyPack", "./mix_self_consistency_pack"
)
```

From here, you can use the pack. You can import the relevant modules from the download folder (the example below assumes either a relative import or that the directory has been added to your system path).

```python
from mix_self_consistency_pack.base import MixSelfConsistencyQueryEngine

query_engine = MixSelfConsistencyQueryEngine(df=df, llm=llm, verbose=True)
response = query_engine.query(
    "Who won best Director in the 1972 Academy Awards?"
)
```

You can also use/initialize the pack directly.

```python
from mix_self_consistency_pack.base import MixSelfConsistencyPack

pack = MixSelfConsistencyPack(df=df, llm=llm, verbose=True)
```

The `run()` function is a light wrapper around `query_engine.query()`.

```python
response = pack.run("Who won best Director in the 1972 Academy Awards?")
```

            
