# LLMCompiler Agent Pack
This LlamaPack implements the agent described in the [LLMCompiler paper](https://github.com/SqueezeAILab/LLMCompiler).
Much of the code comes from the source repo; we have repurposed it with LlamaIndex abstractions. All credit
to the original authors for their great work!
A full notebook guide can be found [here](https://github.com/run-llama/llama-hub/blob/main/llama_hub/llama_packs/agents/llm_compiler/llm_compiler.ipynb).
## CLI Usage
You can download LlamaPacks directly using `llamaindex-cli`, which is installed with the `llama-index` Python package:
```bash
llamaindex-cli download-llamapack LLMCompilerAgentPack --download-dir ./llm_compiler_agent_pack
```
You can then inspect the files at `./llm_compiler_agent_pack` and use them as a template for your own project!
## Code Usage
You can download the pack to a directory. **NOTE**: You must specify `skip_load=True` - the pack contains multiple files,
which means it cannot be loaded as a single module.
We will show you how to import the agent from these files below.
```python
from llama_index.core.llama_pack import download_llama_pack
# download the pack and install its dependencies; skip_load=True because
# the pack spans multiple files and cannot be loaded as a single module
download_llama_pack(
    "LLMCompilerAgentPack", "./llm_compiler_agent_pack", skip_load=True
)
```
From here, you can use the pack. You can import the relevant modules from the download folder (in the example below we assume it's a relative import or the directory has been added to your system path).
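For instance, one minimal way to make the download folder importable (assuming the pack was downloaded into the current working directory, as in the snippet above) is:

```python
import sys

# add the current directory to the module search path so that
# `llm_compiler_agent_pack` can be imported (path is illustrative)
sys.path.insert(0, "./")
```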
```python
# set up the agent worker; `tools`, `llm`, and `callback_manager`
# are assumed to be defined elsewhere
from llama_index.core.agent import AgentRunner
from llm_compiler_agent_pack.step import LLMCompilerAgentWorker
agent_worker = LLMCompilerAgentWorker.from_tools(
    tools, llm=llm, verbose=True, callback_manager=callback_manager
)
agent = AgentRunner(agent_worker, callback_manager=callback_manager)
# start using the agent
response = agent.chat("What is (121 * 3) + 42?")
```
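The `tools` list above is assumed to be built elsewhere; tools are typically plain Python functions wrapped as LlamaIndex tools (e.g. via `FunctionTool.from_defaults`). A minimal sketch, with hypothetical function names:

```python
# Hypothetical functions that could back the `tools` list above.
# Each would be wrapped with e.g. FunctionTool.from_defaults(fn=multiply).
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the product."""
    return a * b


def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b


# For "What is (121 * 3) + 42?", the LLMCompiler planner schedules the
# tool calls (in parallel where dependencies allow) to compute:
result = add(multiply(121, 3), 42)
print(result)  # 405
```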
You can also initialize and use the pack directly.
```python
from llm_compiler_agent_pack.base import LLMCompilerAgentPack
agent_pack = LLMCompilerAgentPack(tools, llm=llm)
```
The `run()` function is a light wrapper around `agent.chat()`.
```python
response = agent_pack.run("Tell me about the population of Boston")
```
You can also directly get modules from the pack.
```python
# use the agent
agent = agent_pack.agent
response = agent.chat("task")
```