| Field | Value |
| --- | --- |
| Name | llama-index-embeddings-xinference |
| Version | 0.1.1 |
| Summary | llama-index embeddings xinference integration |
| Author | Your Name |
| License | MIT |
| Requires Python | <4.0,>=3.8.1 |
| Upload time | 2024-11-05 21:31:58 |
# LlamaIndex Embeddings Integration: Xinference
Xorbits Inference (Xinference) is an open-source platform to streamline the operation and integration of a wide array of AI models.
You can find a list of the built-in embedding models in the Xinference documentation: [Embedding Models](https://inference.readthedocs.io/en/latest/models/builtin/embedding/index.html).
To learn more about Xinference in general, visit https://inference.readthedocs.io/en/latest/
## Installation
```shell
pip install llama-index-embeddings-xinference
```
## Usage
**Parameter descriptions:**
- `model_uid`: The model UID, not the model name, although the two sometimes coincide (e.g., `bce-embedding-base_v1`).
- `base_url`: The base URL of the Xinference server (e.g., `http://localhost:9997`).
- `timeout`: Request timeout in seconds (default: 60).
- `prompt`: The text to embed.
**Text Embedding Example**
```python
from llama_index.embeddings.xinference import XinferenceEmbedding

xi_model_uid = "xinference model uid"
xi_base_url = "xinference base url"

xi_embed = XinferenceEmbedding(
    model_uid=xi_model_uid,
    base_url=xi_base_url,
    timeout=60,
)


def text_embedding(prompt: str):
    embeddings = xi_embed.get_query_embedding(prompt)
    print(embeddings)


async def async_text_embedding(prompt: str):
    embeddings = await xi_embed.aget_query_embedding(prompt)
    print(embeddings)
```
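The embedding calls above return a plain list of floats. As a minimal illustration of how such vectors are typically compared downstream (this is not part of this package's API; the vectors below are toy stand-ins for real Xinference output), cosine similarity can be computed with the standard library alone:

```python
import math
from typing import List


def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy vectors standing in for embeddings returned by get_query_embedding().
query_vec = [0.1, 0.2, 0.3]
doc_vec = [0.2, 0.4, 0.6]
print(cosine_similarity(query_vec, doc_vec))  # close to 1.0: same direction
```

Proportional vectors score near 1.0 and orthogonal vectors near 0.0, which is why cosine similarity is a common choice for ranking documents against a query embedding.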
Raw data

```json
{
  "_id": null,
  "home_page": null,
  "name": "llama-index-embeddings-xinference",
  "maintainer": null,
  "docs_url": null,
  "requires_python": "<4.0,>=3.8.1",
  "maintainer_email": null,
  "keywords": null,
  "author": "Your Name",
  "author_email": "you@example.com",
  "download_url": "https://files.pythonhosted.org/packages/ed/02/60927d0bab5220bfcc495edb65228a156d681547167134d8cd42397bbd9c/llama_index_embeddings_xinference-0.1.1.tar.gz",
  "platform": null,
  "bugtrack_url": null,
  "license": "MIT",
  "summary": "llama-index embeddings xinference integration",
  "version": "0.1.1",
  "project_urls": null,
  "split_keywords": [],
  "urls": [
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "475a3a7d6067405ad8e180a55992e865296b34e6fd688a097d41e95a8cf73b56",
        "md5": "36570151f94ab97a3fcfb3904fb8d036",
        "sha256": "6bd5b4e2d02ec08276bf71293f3e6d5e6e554b62c9dcb0a4abe9cf4e47e279dd"
      },
      "downloads": -1,
      "filename": "llama_index_embeddings_xinference-0.1.1-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "36570151f94ab97a3fcfb3904fb8d036",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": "<4.0,>=3.8.1",
      "size": 3191,
      "upload_time": "2024-11-05T21:31:57",
      "upload_time_iso_8601": "2024-11-05T21:31:57.514444Z",
      "url": "https://files.pythonhosted.org/packages/47/5a/3a7d6067405ad8e180a55992e865296b34e6fd688a097d41e95a8cf73b56/llama_index_embeddings_xinference-0.1.1-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "ed0260927d0bab5220bfcc495edb65228a156d681547167134d8cd42397bbd9c",
        "md5": "963d4853e83c4b25e81e72e8a8f28607",
        "sha256": "b0e3b9b257b73e52ce0af1b7860a3234b4eda0e3ce9ac4dab05615804edf8aed"
      },
      "downloads": -1,
      "filename": "llama_index_embeddings_xinference-0.1.1.tar.gz",
      "has_sig": false,
      "md5_digest": "963d4853e83c4b25e81e72e8a8f28607",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": "<4.0,>=3.8.1",
      "size": 2786,
      "upload_time": "2024-11-05T21:31:58",
      "upload_time_iso_8601": "2024-11-05T21:31:58.608218Z",
      "url": "https://files.pythonhosted.org/packages/ed/02/60927d0bab5220bfcc495edb65228a156d681547167134d8cd42397bbd9c/llama_index_embeddings_xinference-0.1.1.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2024-11-05 21:31:58",
  "github": false,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "lcname": "llama-index-embeddings-xinference"
}
```