# RAG Local CLI Pack
This LlamaPack implements a fully local version of our [RAG CLI](https://docs.llamaindex.ai/en/stable/use_cases/q_and_a/rag_cli.html),
using Mistral (served locally through Ollama) as the LLM and [BGE-M3](https://huggingface.co/BAAI/bge-m3) as the embedding model.
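Because everything runs locally, you will need [Ollama](https://ollama.com/) installed and the Mistral model pulled before running the pack (the BGE-M3 embeddings are typically fetched from Hugging Face the first time they are used). A minimal setup sketch, assuming a standard Ollama install:

```bash
# pull the Mistral model that the pack uses as its LLM
ollama pull mistral

# make sure the Ollama server is running
# (desktop installs usually start it automatically)
ollama serve
```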
## CLI Usage
You can download LlamaPacks directly using `llamaindex-cli`, which comes installed with the `llama-index` Python package:
```bash
llamaindex-cli download-llamapack LocalRAGCLIPack --download-dir ./local_rag_cli_pack
```
You can then inspect the files at `./local_rag_cli_pack` and use them as a template for your own project!
## Code Usage
You can also download the pack to a directory from code. **NOTE**: You must specify `skip_load=True` - the pack contains multiple files, which makes it hard to load directly.

Below, we show how to import and use the pack from these files.
```python
from llama_index.core.llama_pack import download_llama_pack
# download and install dependencies
download_llama_pack("LocalRAGCLIPack", "./local_rag_cli_pack", skip_load=True)
```
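The call above downloads the pack files and installs its dependencies. If you would rather install the dependencies yourself (or the automatic install fails), the pack appears to rely on the Ollama LLM and Hugging Face embedding integrations - this is an assumption based on the models it uses, not an exhaustive list:

```bash
# hypothetical manual install of the integrations this pack appears to use
pip install llama-index-llms-ollama llama-index-embeddings-huggingface
```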
From here, you can use the pack. The most straightforward way is through the CLI: either run `base.py` directly, or install a command with the `setup_cli.sh` script.
```bash
cd local_rag_cli_pack
# option 1
python base.py rag -h
# option 2 - you may need sudo
# default name is lcli_local
sudo sh setup_cli.sh
lcli_local rag -h
```
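Once the command is installed, a typical workflow is to ingest some files and then query or chat over them. The sketch below assumes the standard LlamaIndex RAG CLI flags (`--files`, `--question`, `--chat`); the file path is only an example:

```bash
# ingest a local file (or glob) into the local index
lcli_local rag --files "./data/my_notes.md"

# ask a one-off question over the ingested files
lcli_local rag --question "What are the key points in my notes?"

# or start an interactive chat session over the same index
lcli_local rag --chat
```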
You can also directly get modules from the pack.
```python
from local_rag_cli_pack.base import LocalRAGCLIPack
pack = LocalRAGCLIPack(
    verbose=True, llm_model_name="mistral", embed_model_name="BAAI/bge-m3"
)
# running the pack spins up the CLI
pack.run()

# alternatively, grab the underlying RagCLI module and invoke it directly
rag_cli = pack.get_modules()["rag_cli"]
rag_cli.cli()
```