llama-index-agent-llm-compiler


Name: llama-index-agent-llm-compiler
Version: 0.2.0
home_page: None
Summary: llama-index agent llm compiler integration
upload_time: 2024-08-22 13:33:46
maintainer: jerryjliu
docs_url: None
author: Your Name
requires_python: <4.0,>=3.8.1
license: MIT
keywords: none
requirements: No requirements were recorded.
# LlamaIndex Agent Integration: LLM Compiler

This Agent integration implements the [LLMCompiler agent paper](https://github.com/SqueezeAILab/LLMCompiler).

Much of the code comes from the source repo, repurposed with LlamaIndex abstractions. All credit
to the original authors for their great work!

A full notebook guide can be found [here](https://github.com/run-llama/llama_index/blob/main/docs/docs/examples/agent/llm_compiler.ipynb).

## Usage

First install the package:

```bash
pip install llama-index-agent-llm-compiler
```

```python
# set up the LLMCompiler agent worker and agent runner
# (tools, llm, and callback_manager are defined by you)

from llama_index.core.agent import AgentRunner
from llama_index.agent.llm_compiler.step import LLMCompilerAgentWorker

agent_worker = LLMCompilerAgentWorker.from_tools(
    tools, llm=llm, verbose=True, callback_manager=callback_manager
)
agent = AgentRunner(agent_worker, callback_manager=callback_manager)

# start using the agent
response = agent.chat("What is (121 * 3) + 42?")
```
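
The snippet above assumes `tools`, `llm`, and `callback_manager` are already defined. As a rough sketch of what those might look like (assuming the OpenAI LLM integration from `llama-index-llms-openai` and a couple of hypothetical arithmetic tools; any LlamaIndex-compatible LLM and tools work the same way):

```python
from llama_index.core.callbacks import CallbackManager
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI  # requires: pip install llama-index-llms-openai


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b


def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b


# Wrap plain Python functions as tools; LLMCompiler can schedule
# independent tool calls in parallel.
tools = [
    FunctionTool.from_defaults(fn=multiply),
    FunctionTool.from_defaults(fn=add),
]

# Any LlamaIndex-compatible LLM works; the model name here is illustrative.
llm = OpenAI(model="gpt-4o")

# Empty callback manager; add handlers (e.g. LlamaDebugHandler) for tracing.
callback_manager = CallbackManager([])
```

Given a query like the one above, the LLMCompiler planner decomposes it into tool calls, runs independent calls in parallel, and then composes the final answer, as described in the linked paper.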

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "llama-index-agent-llm-compiler",
    "maintainer": "jerryjliu",
    "docs_url": null,
    "requires_python": "<4.0,>=3.8.1",
    "maintainer_email": null,
    "keywords": null,
    "author": "Your Name",
    "author_email": "you@example.com",
    "download_url": "https://files.pythonhosted.org/packages/a5/d8/00dd9c27059465a5787e1deb1ac865ef66c2d6466c85bcb49373453d1d42/llama_index_agent_llm_compiler-0.2.0.tar.gz",
    "platform": null,
    "description": "# LlamaIndex Agent Integration: LLM Compiler\n\nThis Agent integration implements the [LLMCompiler agent paper](https://github.com/SqueezeAILab/LLMCompiler).\n\nA lot of code came from the source repo, we repurposed with LlamaIndex abstractions. All credits\nto the original authors for a great work!\n\nA full notebook guide can be found [here](https://github.com/run-llama/llama_index/blob/main/docs/docs/examples/agent/llm_compiler.ipynb).\n\n## Usage\n\nFirst install the package:\n\n```bash\npip install llama-index-agent-llm-compiler\n```\n\n```python\n# setup pack arguments\n\nfrom llama_index.core.agent import AgentRunner\nfrom llama_index.agent.llm_compiler.step import LLMCompilerAgentWorker\n\nagent_worker = LLMCompilerAgentWorker.from_tools(\n    tools, llm=llm, verbose=True, callback_manager=callback_manager\n)\nagent = AgentRunner(agent_worker, callback_manager=callback_manager)\n\n# start using the agent\nresponse = agent.chat(\"What is (121 * 3) + 42?\")\n```\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "llama-index agent llm compiler integration",
    "version": "0.2.0",
    "project_urls": null,
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "8529b1d5af31a79aed7d2bc7f121aea9c91937f184c66bdf289745cd69020423",
                "md5": "1e9ab424a8be83ad701595914dc2acb1",
                "sha256": "ca9d28e031fe1e6335147da8232dd2ca900fe00341e2f9eb44f88cc4053e8904"
            },
            "downloads": -1,
            "filename": "llama_index_agent_llm_compiler-0.2.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "1e9ab424a8be83ad701595914dc2acb1",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.8.1",
            "size": 19400,
            "upload_time": "2024-08-22T13:33:45",
            "upload_time_iso_8601": "2024-08-22T13:33:45.180652Z",
            "url": "https://files.pythonhosted.org/packages/85/29/b1d5af31a79aed7d2bc7f121aea9c91937f184c66bdf289745cd69020423/llama_index_agent_llm_compiler-0.2.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "a5d800dd9c27059465a5787e1deb1ac865ef66c2d6466c85bcb49373453d1d42",
                "md5": "7d3c291a0566ff823afe6bb97361778b",
                "sha256": "a6d40638fae383c7ca0d4e659cb047f549359875fcaaa3b3282a44bc35848668"
            },
            "downloads": -1,
            "filename": "llama_index_agent_llm_compiler-0.2.0.tar.gz",
            "has_sig": false,
            "md5_digest": "7d3c291a0566ff823afe6bb97361778b",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.8.1",
            "size": 17155,
            "upload_time": "2024-08-22T13:33:46",
            "upload_time_iso_8601": "2024-08-22T13:33:46.750937Z",
            "url": "https://files.pythonhosted.org/packages/a5/d8/00dd9c27059465a5787e1deb1ac865ef66c2d6466c85bcb49373453d1d42/llama_index_agent_llm_compiler-0.2.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-08-22 13:33:46",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "llama-index-agent-llm-compiler"
}
        