| Field | Value |
| --- | --- |
| Name | llama-index-packs-rag-cli-local |
| Version | 0.1.4 |
| Summary | llama-index packs rag cli local integration |
| Author | Your Name |
| Maintainer | jerryjliu |
| License | MIT |
| Requires Python | >=3.8.1,<4.0 |
| Keywords | cli, local, rag |
| Upload time | 2024-02-22 01:28:56 |
| Requirements | No requirements were recorded. |
# RAG Local CLI Pack
This LlamaPack implements a fully local version of our [RAG CLI](https://docs.llamaindex.ai/en/stable/use_cases/q_and_a/rag_cli.html),
using Mistral (served through Ollama) as the LLM and [BGE-M3](https://huggingface.co/BAAI/bge-m3) as the embedding model.
## CLI Usage
You can download LlamaPacks directly using `llamaindex-cli`, which comes installed with the `llama-index` Python package:
```bash
llamaindex-cli download-llamapack LocalRAGCLIPack --download-dir ./local_rag_cli_pack
```
You can then inspect the files at `./local_rag_cli_pack` and use them as a template for your own project!
## Code Usage
You can download the pack to a directory. **NOTE**: You must specify `skip_load=True` — the pack contains multiple files,
so it cannot be loaded as a single module directly.
We will show you how to import the pack from these files below.
```python
from llama_index.llama_pack import download_llama_pack
# download and install dependencies
download_llama_pack("LocalRAGCLIPack", "./local_rag_cli_pack", skip_load=True)
```
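Since `skip_load=True` leaves the files on disk without importing anything, one way to load the pack class afterwards is to import `base.py` by file path with the standard-library `importlib` machinery. This is a hedged sketch of that general technique, demonstrated on a stand-in file rather than the pack itself; for the real pack you would point it at `./local_rag_cli_pack/base.py` and pull `LocalRAGCLIPack` out of the returned module:

```python
import importlib.util
import sys
from pathlib import Path


def import_module_from_path(name: str, path: str):
    """Load a Python module directly from a file path."""
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module  # register so intra-module imports resolve
    spec.loader.exec_module(module)
    return module


# Demonstrated on a throwaway stand-in file; substitute
# "./local_rag_cli_pack/base.py" for the real pack.
demo = Path("demo_base.py")
demo.write_text("class LocalRAGCLIPack:\n    name = 'demo'\n")
mod = import_module_from_path("demo_base", str(demo))
print(mod.LocalRAGCLIPack.name)  # -> demo
demo.unlink()
```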
From here, you can use the pack. The most straightforward way is through the CLI: either run `base.py` directly, or run the `setup_cli.sh` script.
```bash
cd local_rag_cli_pack
# option 1
python base.py rag -h
# option 2 - you may need sudo
# default name is lcli_local
sudo sh setup_cli.sh
lcli_local rag -h
```
You can also directly get modules from the pack.
```python
from local_rag_cli_pack.base import LocalRAGCLIPack
pack = LocalRAGCLIPack(
    verbose=True, llm_model_name="mistral", embed_model_name="BAAI/bge-m3"
)
# will spin up the CLI
pack.run()
# get modules
rag_cli = pack.get_modules()["rag_cli"]
rag_cli.cli()
```
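The `get_modules()` call above follows the general LlamaPack convention: a pack class exposes its internal components as a name-to-object dict so they can be reused individually. A minimal stand-in sketch of that shape (the `FakePack` and `FakeRAGCLI` classes are hypothetical illustrations, not the pack's real implementation; only the `rag_cli` key comes from the snippet above):

```python
class FakeRAGCLI:
    """Hypothetical stand-in for the pack's internal RAG CLI object."""

    def cli(self):
        return "running CLI"


class FakePack:
    """Minimal sketch of the LlamaPack get_modules() convention."""

    def __init__(self, verbose: bool = False):
        self.verbose = verbose
        self._rag_cli = FakeRAGCLI()

    def get_modules(self) -> dict:
        # Packs expose their building blocks as a name -> object mapping,
        # so callers can pull out and reuse individual components.
        return {"rag_cli": self._rag_cli}

    def run(self):
        # run() delegates to the main component.
        return self._rag_cli.cli()


pack = FakePack(verbose=True)
rag_cli = pack.get_modules()["rag_cli"]
print(rag_cli.cli())  # -> running CLI
```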