| Field | Value |
| --- | --- |
| Name | llama-index-vector-stores-lindorm |
| Version | 0.2.0 |
| Summary | llama-index vector_stores lindorm integration |
| Author | Your Name |
| License | MIT |
| Requires Python | <4.0,>=3.8.1 |
| Upload time | 2024-08-22 08:08:15 |
# LlamaIndex Vector_Stores Integration: Lindorm
- LindormVectorStore supports pure vector search, search with metadata filtering, hybrid search, asynchronous operations, and more.
- See the [notebook](../../../docs/docs/examples/vector_stores/LindormDemo.ipynb) for an example of using Lindorm as a vector store in LlamaIndex.
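Hybrid search blends lexical (keyword) relevance with vector similarity. As an illustration only — this is a generic reciprocal-rank-fusion sketch, not Lindorm's actual fusion logic — here is how two ranked result lists can be merged into one:

```python
from collections import defaultdict


def reciprocal_rank_fusion(rankings, k=60):
    """Fuse ranked doc-id lists: each doc scores sum(1 / (k + rank))."""
    scores = defaultdict(float)
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)


keyword_hits = ["doc3", "doc1", "doc7"]  # hypothetical lexical ranking
vector_hits = ["doc1", "doc9"]           # hypothetical vector ranking
print(reciprocal_rank_fusion([keyword_hits, vector_hits]))
```

Documents appearing near the top of both rankings (like `doc1` here) are rewarded; a real hybrid backend performs an equivalent fusion server-side.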
# Example Usage
```sh
pip install llama-index
pip install opensearch-py
pip install llama-index-vector-stores-lindorm
```
```python
from llama_index.vector_stores.lindorm import (
LindormVectorStore,
LindormVectorClient,
)
# How to obtain a Lindorm search instance:
# https://alibabacloud.com/help/en/lindorm/latest/create-an-instance
# How to access your Lindorm search instance:
# https://www.alibabacloud.com/help/en/lindorm/latest/view-endpoints
# Run curl commands to connect to and use Lindorm search:
# https://www.alibabacloud.com/help/en/lindorm/latest/connect-and-use-the-search-engine-with-the-curl-command
# lindorm instance info
host = "ld-bp******jm*******-proxy-search-pub.lindorm.aliyuncs.com"
port = 30070
username = "your_username"
password = "your_password"
# index to demonstrate the VectorStore impl
index_name = "lindorm_test_index"
# Extension parameter of Lindorm search: the number of cluster units to query;
# must be between 1 and method.parameters.nlist.
nprobe = "a number(string type)"
# Extension parameter of Lindorm search, usually used to improve recall accuracy
# at the cost of extra performance overhead; between 1 and 200 (default: 10).
reorder_factor = "a number(string type)"
# LindormVectorClient encapsulates logic for a single index with vector search enabled
client = LindormVectorClient(
host=host,
port=port,
username=username,
password=password,
index=index_name,
dimension=1536, # match with your embedding model
nprobe=nprobe,
reorder_factor=reorder_factor,
    # filter_type="pre_filter" or "post_filter" (default: "post_filter")
)
# initialize vector store
vector_store = LindormVectorStore(client)
```
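Note that `nprobe` and `reorder_factor` are passed as strings even though they represent integers. A minimal stdlib-only sketch of the range checks described in the comments above (the helper name and the client-side `nlist` value are hypothetical; the real client may validate differently):

```python
def validate_search_params(nprobe: str, reorder_factor: str, nlist: int) -> dict:
    """Check Lindorm search extension params against their documented ranges.

    nprobe: number of cluster units to query, 1..nlist (string-typed).
    reorder_factor: recall/overhead trade-off, 1..200 (string-typed, default "10").
    """
    n = int(nprobe)
    if not 1 <= n <= nlist:
        raise ValueError(f"nprobe must be in [1, {nlist}], got {n}")
    r = int(reorder_factor)
    if not 1 <= r <= 200:
        raise ValueError(f"reorder_factor must be in [1, 200], got {r}")
    return {"nprobe": nprobe, "reorder_factor": reorder_factor}


print(validate_search_params("32", "10", nlist=1000))
```

In actual use the validated strings are passed straight to `LindormVectorClient` as in the snippet above; the helper only mirrors the ranges stated in the comments.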