# LLMCompiler Agent Pack
This LlamaPack implements the [LLMCompiler agent paper](https://github.com/SqueezeAILab/LLMCompiler).
Much of the code comes from the source repo; we repurposed it with LlamaIndex abstractions. All credit
to the original authors for their great work!
A full notebook guide can be found [here](https://github.com/run-llama/llama-hub/blob/main/llama_hub/llama_packs/agents/llm_compiler/llm_compiler.ipynb).
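As background, LLMCompiler's core idea is that a planner LLM emits a DAG of tool calls, and calls whose dependencies have already resolved are executed in parallel rather than one at a time. The following is a minimal, stdlib-only sketch of that scheduling idea (illustrative only; `execute_dag` is a hypothetical helper, not the pack's actual implementation):

```python
import concurrent.futures


def execute_dag(tasks, deps):
    """Run a DAG of tool calls, batching independent calls in parallel.

    tasks: task id -> callable taking the dict of prior results.
    deps:  task id -> set of task ids it depends on.
    """
    results = {}
    pending = set(tasks)
    with concurrent.futures.ThreadPoolExecutor() as pool:
        while pending:
            # every task whose dependencies have all resolved is ready now
            ready = {t for t in pending if deps.get(t, set()) <= results.keys()}
            if not ready:
                raise ValueError("dependency cycle detected")
            futures = {t: pool.submit(tasks[t], dict(results)) for t in ready}
            for t, fut in futures.items():
                results[t] = fut.result()
            pending -= ready
    return results


# the query "(121 * 3) + 42" could plan into two tasks, where the
# addition depends on the multiplication's result:
plan = {
    "multiply": lambda r: 121 * 3,
    "add": lambda r: r["multiply"] + 42,
}
print(execute_dag(plan, {"add": {"multiply"}})["add"])  # 405
```

In the actual pack, planning, re-planning, and joining results are handled by the agent worker shown below.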
## CLI Usage
You can download LlamaPacks directly using `llamaindex-cli`, which is installed with the `llama-index` Python package:
```bash
llamaindex-cli download-llamapack LLMCompilerAgentPack --download-dir ./llm_compiler_agent_pack
```
You can then inspect the files at `./llm_compiler_agent_pack` and use them as a template for your own project!
## Code Usage
You can download the pack to a directory. **NOTE**: you must specify `skip_load=True`, since the pack contains multiple files and cannot be loaded as a single module.
We show how to import the agent from these files below.
```python
from llama_index.core.llama_pack import download_llama_pack
# download and install dependencies
download_llama_pack(
    "LLMCompilerAgentPack", "./llm_compiler_agent_pack", skip_load=True
)
```
From here, you can use the pack by importing the relevant modules from the download folder (the example below assumes the directory is importable, either as a relative import or because it has been added to your system path).
```python
# setup pack arguments

from llama_index.core.agent import AgentRunner
from llm_compiler_agent_pack.step import LLMCompilerAgentWorker

agent_worker = LLMCompilerAgentWorker.from_tools(
    tools, llm=llm, verbose=True, callback_manager=callback_manager
)
agent = AgentRunner(agent_worker, callback_manager=callback_manager)

# start using the agent
response = agent.chat("What is (121 * 3) + 42?")
```
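The snippet above assumes `tools`, `llm`, and `callback_manager` are already defined elsewhere. As a concrete illustration, the tools for the arithmetic query might wrap plain Python functions like these (hypothetical helpers; in practice you would wrap each one, e.g. with LlamaIndex's `FunctionTool.from_defaults`):

```python
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the product."""
    return a * b


def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b


# e.g. tools = [FunctionTool.from_defaults(fn=multiply),
#               FunctionTool.from_defaults(fn=add)]
```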
You can also initialize and use the pack directly.
```python
from llm_compiler_agent_pack.base import LLMCompilerAgentPack
agent_pack = LLMCompilerAgentPack(tools, llm=llm)
```
The `run()` function is a light wrapper around `agent.chat()`.
```python
response = agent_pack.run("Tell me about the population of Boston")
```
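Conceptually, `run()` just delegates to the underlying agent's `chat()`. A simplified sketch of that delegation (illustrative only, with a hypothetical `PackSketch` class standing in for the pack; not the pack's actual source):

```python
class PackSketch:
    """Illustrates how run() can forward to the agent's chat() method."""

    def __init__(self, agent):
        self.agent = agent

    def run(self, *args, **kwargs):
        # forward all arguments straight to the agent
        return self.agent.chat(*args, **kwargs)
```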
You can also directly get modules from the pack.
```python
# use the agent directly
agent = agent_pack.agent
response = agent.chat("task")
```