llama-index-packs-infer-retrieve-rerank

Name: llama-index-packs-infer-retrieve-rerank
Version: 0.5.0
Summary: llama-index packs infer retrieve rerank integration
Upload time: 2024-11-18 01:33:51
Maintainer: jerryjliu
Author: Your Name
Requires-Python: <4.0,>=3.9
License: MIT
Keywords: infer, rag, rerank, retrieve, retriever
Requirements: No requirements were recorded.
            # Infer-Retrieve-Rerank LlamaPack

This is our implementation of the paper ["In-Context Learning for Extreme Multi-Label Classification"](https://arxiv.org/pdf/2401.12178.pdf) by Oosterlinck et al.

The paper proposes "infer-retrieve-rerank", a simple paradigm that uses frozen LLM and retriever models to perform "extreme" multi-label classification, where the label space is huge.

1. Given a user query, use an LLM to predict an initial set of labels.
2. For each prediction, retrieve the actual label from the corpus.
3. Given the final set of labels, rerank them using an LLM.

All of these can be implemented as LlamaIndex abstractions.
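As a rough illustration of the paradigm (not the pack's actual implementation), the three steps above can be sketched in plain Python, with keyword extraction standing in for the inference LLM, `difflib` string matching standing in for an embedding retriever, and query position standing in for LLM reranking:

```python
from difflib import get_close_matches

# Toy label corpus (in practice this would contain thousands of labels).
LABELS = ["nausea", "headache", "dizziness", "fatigue", "rash"]


def infer(query):
    # Step 1 (stand-in): an LLM would predict free-form candidate labels;
    # here we just pull longer words out of the query.
    return [w.strip(".,") for w in query.lower().split() if len(w) > 4]


def retrieve(prediction, labels):
    # Step 2 (stand-in): map each free-form prediction to the closest
    # actual label in the corpus; the pack would use a retriever here.
    matches = get_close_matches(prediction, labels, n=1, cutoff=0.6)
    return matches[0] if matches else None


def rerank(query, candidates, top_n=3):
    # Step 3 (stand-in): an LLM would rerank; we order candidates by
    # where they appear in the query as a crude proxy.
    return sorted(candidates, key=lambda c: query.lower().find(c))[:top_n]


def infer_retrieve_rerank(query):
    preds = infer(query)
    grounded = {retrieve(p, LABELS) for p in preds} - {None}
    return rerank(query, sorted(grounded))
```

Note how predictions that match nothing in the corpus are simply dropped in step 2, so the final reranked list only ever contains real labels.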

A full notebook guide can be found [here](https://github.com/run-llama/llama-hub/blob/main/llama_hub/llama_packs/research/infer_retrieve_rerank/infer_retrieve_rerank.ipynb).

## CLI Usage

You can download LlamaPacks directly using `llamaindex-cli`, which is installed with the `llama-index` Python package:

```bash
llamaindex-cli download-llamapack InferRetrieveRerankPack --download-dir ./infer_retrieve_rerank_pack
```

You can then inspect the files at `./infer_retrieve_rerank_pack` and use them as a template for your own project!

## Code Usage

You can download the pack to a `./infer_retrieve_rerank_pack` directory:

```python
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
InferRetrieveRerankPack = download_llama_pack(
    "InferRetrieveRerankPack", "./infer_retrieve_rerank_pack"
)
```

From here, you can use the pack, or inspect and modify the pack in `./infer_retrieve_rerank_pack`.

Then, you can set up the pack like so:

```python
# create the pack
pack = InferRetrieveRerankPack(
    labels,  # list of all label strings
    llm=llm,
    pred_context="<pred_context>",
    reranker_top_n=3,
    verbose=True,
)
```

The `run()` function runs predictions over a list of input strings.

```python
pred_reactions = pack.run(inputs=[s["text"] for s in samples])
```

You can also use modules individually.

```python
# access the underlying modules directly
llm = pack.llm
label_retriever = pack.label_retriever
```
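As a hedged sketch of what those modules expose (the classes below are hypothetical stand-ins, not the pack's real types, but they mirror the `complete()` and `retrieve()` interfaces):

```python
# Hypothetical stand-ins for `pack.llm` and `pack.label_retriever`.
class EchoLLM:
    def complete(self, prompt):
        # A real LLM would generate text; we echo the prompt for illustration.
        return f"echo: {prompt}"


class SubstringLabelRetriever:
    def __init__(self, labels):
        self.labels = labels

    def retrieve(self, query):
        # Return every label containing the query as a substring;
        # the real retriever would use embeddings instead.
        return [label for label in self.labels if query.lower() in label.lower()]


llm = EchoLLM()
label_retriever = SubstringLabelRetriever(
    ["acute headache", "chronic headache", "nausea"]
)
matches = label_retriever.retrieve("headache")
```

Pulling the modules out like this is useful when you want to reuse the pack's retriever or LLM inside a larger pipeline.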

            
