# Chroma AutoRetrieval Pack
This LlamaPack inserts your data into Chroma and instantiates an auto-retriever, which uses the LLM at runtime to set metadata filters, top-k, and the query string.
## CLI Usage
You can download LlamaPacks directly using `llamaindex-cli`, which comes installed with the `llama-index` Python package:
```bash
llamaindex-cli download-llamapack ChromaAutoretrievalPack --download-dir ./chroma_pack
```
You can then inspect the files at `./chroma_pack` and use them as a template for your own project!
## Code Usage
You can download the pack to the `./chroma_pack` directory:
```python
from llama_index.core.llama_pack import download_llama_pack
# download and install dependencies
ChromaAutoretrievalPack = download_llama_pack(
    "ChromaAutoretrievalPack", "./chroma_pack"
)
```
From here, you can use the pack, or inspect and modify the pack in `./chroma_pack`.
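Alternatively, if you install the pack from PyPI (`pip install llama-index-packs-chroma-autoretrieval`), you can import it directly. The import path below follows the usual llama-index pack naming convention and is an assumption rather than something stated in this README:

```python
# assumed import path, following the standard llama-index packs convention
from llama_index.packs.chroma_autoretrieval import ChromaAutoretrievalPack
```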
Then, you can set up the pack like so:
```python
# setup pack arguments
from llama_index.core.vector_stores import MetadataInfo, VectorStoreInfo
vector_store_info = VectorStoreInfo(
    content_info="brief biography of celebrities",
    metadata_info=[
        MetadataInfo(
            name="category",
            type="str",
            description=(
                "Category of the celebrity, one of [Sports, Entertainment, Business, Music]"
            ),
        ),
    ],
)
import chromadb

client = chromadb.EphemeralClient()

nodes = [...]

# create the pack
chroma_pack = ChromaAutoretrievalPack(
    collection_name="test",
    vector_store_info=vector_store_info,
    nodes=nodes,
    client=client,
)
```
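The `nodes = [...]` placeholder stands for the nodes you want to insert before creating the pack. As a purely illustrative sketch (the texts and metadata values below are made up), they could be `TextNode` objects whose `metadata` keys match the schema declared in `vector_store_info`:

```python
from llama_index.core.schema import TextNode

# hypothetical example nodes; the "category" key mirrors vector_store_info
nodes = [
    TextNode(
        text="Rihanna is a singer who later built a major cosmetics business.",
        metadata={"category": "Music"},
    ),
    TextNode(
        text="Elon Musk runs several technology companies.",
        metadata={"category": "Business"},
    ),
]
```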
The `run()` function is a light wrapper around `query_engine.query()`.
```python
response = chroma_pack.run("Tell me about a Music celebrity.")
```
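The return value is a standard LlamaIndex response object, so you can print it and inspect which nodes the auto-retriever selected (a minimal usage sketch):

```python
print(response)

# see which source nodes were retrieved
for node_with_score in response.source_nodes:
    print(node_with_score.node.metadata, node_with_score.score)
```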
You can also use modules individually.
```python
# use the retriever
retriever = chroma_pack.retriever
nodes = retriever.retrieve("query_str")

# use the query engine
query_engine = chroma_pack.query_engine
response = query_engine.query("query_str")
```
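As with other LlamaPacks, you can also list everything the pack exposes via the base-class `get_modules()` helper; the exact keys returned depend on the pack's implementation:

```python
# inspect the modules the pack exposes (key names depend on the pack)
modules = chroma_pack.get_modules()
print(modules.keys())
```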