| Field | Value |
| --- | --- |
| Name | llama-index-embeddings-xinference |
| Version | 0.1.0 |
| Summary | llama-index embeddings xinference integration |
| Upload time | 2024-08-25 14:11:36 |
| Author | Your Name |
| Requires Python | <4.0,>=3.8.1 |
| License | MIT |
# LlamaIndex Embeddings Integration: Xinference
Xorbits Inference (Xinference) is an open-source platform that streamlines the deployment and integration of a wide array of AI models.

You can find the list of embedding models built into Xinference in its documentation: [Embedding Models](https://inference.readthedocs.io/en/latest/models/builtin/embedding/index.html).

To learn more about Xinference in general, visit https://inference.readthedocs.io/en/latest/.
## Installation
```shell
pip install llama-index-embeddings-xinference
```
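The client shown below needs a running Xinference server to connect to. A minimal local setup might look like the following (a sketch based on the Xinference documentation; the model name, host, and port are illustrative and may differ for your deployment):

```shell
# Install and start a local Xinference server (default port 9997)
pip install xinference
xinference-local --host 0.0.0.0 --port 9997

# In another terminal, launch an embedding model; the command
# prints the model UID that XinferenceEmbedding expects
xinference launch --model-name bge-base-en-v1.5 --model-type embedding
```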
## Usage
**Parameter Descriptions:**

- `model_uid`: The model UID, not the model name (though the two can coincide, e.g., `bce-embedding-base_v1`).
- `base_url`: The base URL of the Xinference server (e.g., `http://localhost:9997`).
- `timeout`: The request timeout in seconds (default: 60).
- `prompt`: The text to embed.
**Text Embedding Example**
```python
from llama_index.embeddings.xinference import XinferenceEmbedding

xi_model_uid = "xinference model uid"
xi_base_url = "xinference base url"

xi_embed = XinferenceEmbedding(
    model_uid=xi_model_uid,
    base_url=xi_base_url,
    timeout=60,
)


def text_embedding(prompt: str):
    embeddings = xi_embed.get_query_embedding(prompt)
    print(embeddings)


async def async_text_embedding(prompt: str):
    embeddings = await xi_embed.aget_query_embedding(prompt)
    print(embeddings)
```
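The embedding vectors returned by `get_query_embedding` are plain lists of floats, so they can be compared directly. As an illustration (independent of Xinference; the vectors below are hard-coded stand-ins for real model output), here is a minimal cosine-similarity sketch using only the standard library:

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Stand-in vectors; in practice these would come from calls like
# xi_embed.get_query_embedding(...) against a running server.
query_vec = [0.1, 0.3, 0.5]
doc_vec = [0.2, 0.25, 0.55]

print(cosine_similarity(query_vec, doc_vec))
```

A score near 1.0 indicates semantically similar texts; this is the comparison that retrieval components perform internally when ranking documents against a query embedding.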