llama-index-packs-agents-coa


Name: llama-index-packs-agents-coa
Version: 0.3.0
Summary: llama-index packs for chain-of-abstraction
Author: Logan Markewich
Maintainer: jerryjliu
License: MIT
Requires Python: <4.0,>=3.9
Keywords: abstraction, agent, chain, coa
Upload time: 2024-11-17 22:43:43
Requirements: No requirements were recorded.
# Chain-of-Abstraction Agent Pack

`pip install llama-index-packs-agents-coa`

The chain-of-abstraction (CoA) LlamaPack implements a generalized version of the strategy described in the [original CoA paper](https://arxiv.org/abs/2401.17464).

By prompting the LLM to write function calls in a chain-of-thought format, we can execute both simple and complex combinations of the function calls needed to complete a task.

The LLM is prompted to write a response containing function call placeholders. For example, a CoA plan might look like:

```
After buying the apples, Sally has [FUNC add(3, 2) = y1] apples.
Then, the wizard casts a spell to multiply the number of apples by 3,
resulting in [FUNC multiply(y1, 3) = y2] apples.
```

From there, the function calls are parsed into a dependency graph and executed.

The placeholders in the CoA are then replaced with their actual results.
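
To make the mechanics concrete, here is a minimal, self-contained sketch of how a plan in this format could be parsed and resolved. It is not the pack's actual parser: the regex, the toy function registry, and the simple in-order resolution (which stands in for a full dependency graph, and works because earlier calls appear before the calls that use them) are all illustrative.

```python
import re
from typing import Callable, Dict

# Toy registry standing in for the agent's tools.
FUNCTIONS: Dict[str, Callable] = {
    "add": lambda a, b: a + b,
    "multiply": lambda a, b: a * b,
}

# Matches placeholders of the form [FUNC name(arg1, arg2) = label]
FUNC_PATTERN = re.compile(r"\[FUNC (\w+)\((.*?)\) = (\w+)\]")


def execute_coa_plan(plan: str) -> str:
    """Execute each [FUNC ...] placeholder and substitute its result."""
    results: Dict[str, float] = {}
    for name, raw_args, label in FUNC_PATTERN.findall(plan):
        # Arguments are either literals or labels from earlier calls, so
        # resolving in order of appearance respects the dependencies.
        args = [
            results[a.strip()] if a.strip() in results else float(a.strip())
            for a in raw_args.split(",")
        ]
        results[label] = FUNCTIONS[name](*args)
    # Replace every placeholder with the value its call produced.
    return FUNC_PATTERN.sub(lambda m: str(results[m.group(3)]), plan)


plan = (
    "After buying the apples, Sally has [FUNC add(3, 2) = y1] apples. "
    "The wizard then triples them, giving [FUNC multiply(y1, 3) = y2] apples."
)
print(execute_coa_plan(plan))
# "... Sally has 5.0 apples. ... giving 15.0 apples."
```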

As an extension to the original paper, we also run the LLM a final time to rewrite the response in a more readable, user-friendly way.
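
As a rough illustration, that final refinement step boils down to one more LLM call over the filled-in plan. The prompt wording below is an assumption for the sketch, not the pack's actual prompt.

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-4")

# A CoA plan after its placeholders have been replaced with real values.
filled_plan = (
    "After buying the apples, Sally has 5 apples. Then, the wizard casts a "
    "spell to multiply the number of apples by 3, resulting in 15 apples."
)

# One more pass over the filled-in plan to produce a user-friendly answer.
response = llm.complete(
    "Rewrite the following reasoning as a concise, user-friendly answer:\n\n"
    + filled_plan
)
print(response.text)
```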

**NOTE:** In the original paper, the authors fine-tuned an LLM specifically for this strategy, as well as for specific functions and datasets. As such, only capable LLMs (OpenAI, Anthropic, etc.) are likely to be reliable for this without fine-tuning.

A full example notebook is [also provided](https://github.com/run-llama/llama_index/blob/main/docs/docs/examples/agent/coa_agent.ipynb).

## Code Usage

`pip install llama-index-packs-agents-coa`

First, set up some tools (these could be function tools, query engines, etc.):

```python
from llama_index.core.tools import QueryEngineTool, FunctionTool


def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b


# `index` is assumed to be an existing LlamaIndex index (see the sketch below)
query_engine = index.as_query_engine(...)

function_tool = FunctionTool.from_defaults(fn=add)
query_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine, name="...", description="..."
)
```
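
The `index` referenced above is assumed to already exist. For completeness, a minimal (and purely illustrative) way to build one looks like this:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Build a simple vector index over local documents (the path is illustrative).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
```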

Next, create the pack with the tools, and run it!

```python
from llama_index.packs.agents_coa import CoAAgentPack
from llama_index.llms.openai import OpenAI

pack = CoAAgentPack(
    tools=[function_tool, query_tool], llm=OpenAI(model="gpt-4")
)

print(pack.run("What is 1245 + 4321?"))
```
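
Like other LlamaPacks, the pack exposes its internal components through `get_modules()`, so you can pull out and reuse the underlying agent or LLM directly. The exact keys vary by pack, so this sketch only inspects them rather than assuming names:

```python
# Inspect the pack's internal modules; keys differ between packs, so list
# them before relying on any particular one.
modules = pack.get_modules()
print(list(modules.keys()))
```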

See the example notebook for [more thorough details](https://github.com/run-llama/llama_index/blob/main/docs/docs/examples/agent/coa_agent.ipynb).

            
