llama-index-postprocessor-xinference-rerank


Name: llama-index-postprocessor-xinference-rerank
Version: 0.1.0
Summary: llama-index postprocessor xinference rerank integration
Author: Your Name
License: MIT
Requires Python: <4.0,>=3.8.1
Upload time: 2024-08-27 18:27:03
Requirements: No requirements were recorded.
# LlamaIndex Postprocessor Integration: Xinference Rerank

Xorbits Inference (Xinference) is an open-source platform to streamline the operation and integration of a wide array of AI models.

You can find the list of built-in rerank models in the Xinference documentation: [Rerank Models](https://inference.readthedocs.io/en/latest/models/builtin/rerank/index.html).

To learn more about reranking with Xinference, visit https://inference.readthedocs.io/en/stable/models/model_abilities/rerank.html

## Installation

```shell
pip install llama-index-postprocessor-xinference-rerank
```

## Usage

**Parameter Descriptions:**

- `model`: The model uid, not the model name (though the two are sometimes identical, e.g., `bge-reranker-base`); see the launch sketch after this list for one way to obtain a uid.
- `base_url`: Base URL of the Xinference server (e.g., `http://localhost:9997`).
- `top_n`: Number of top-ranked nodes to return from the reranker (default: 5).
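
Since `model` expects a uid, a rerank model must already be launched on the Xinference server. A minimal sketch of one way to do that with the Xinference Python client (assuming the `xinference` client package is installed and a server is reachable at `http://localhost:9997`):

```python
# Sketch: launch a built-in rerank model and capture its uid.
from xinference.client import Client

client = Client("http://localhost:9997")  # assumed server address

# launch_model returns the uid of the launched model instance;
# pass that uid to XinferenceRerank as `model`.
xi_model_uid = client.launch_model(
    model_name="bge-reranker-base",
    model_type="rerank",
)
print(xi_model_uid)
```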

**Node Reranking Example**

```python
from llama_index.postprocessor.xinference_rerank import XinferenceRerank

xi_model_uid = "xinference model uid"
xi_base_url = "xinference base url"

# Reranker backed by a rerank model running on an Xinference server.
xi_rerank = XinferenceRerank(
    top_n=5,
    model=xi_model_uid,
    base_url=xi_base_url,
)


def test_rerank_nodes(nodes, query_str):
    # Re-score the candidate nodes against the query and return the top_n best.
    return xi_rerank.postprocess_nodes(nodes, query_str=query_str)
```
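
In practice the reranker is usually attached to a query engine rather than called directly. A minimal sketch (assuming documents in a hypothetical `./data` directory and an embedding model and LLM already configured for LlamaIndex):

```python
# Sketch: retrieve a wider candidate set, then let XinferenceRerank
# keep only the best top_n nodes before the LLM answers.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./data").load_data()  # assumed data folder
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine(
    similarity_top_k=10,              # fetch more candidates than needed
    node_postprocessors=[xi_rerank],  # rerank down to top_n with Xinference
)
response = query_engine.query("your question here")
print(response)
```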

            
