| Name | llama-index-llms-watsonx |
| --- | --- |
| Version | 0.1.8 |
| Summary | llama-index llms watsonx integration |
| Upload time | 2024-06-12 20:50:16 |
| Author | Your Name |
| Requires Python | <4.0,>=3.10 |
| License | MIT |
| Requirements | No requirements were recorded. |
# LlamaIndex Llms Integration: Watsonx
The `llama-index-llms-watsonx` package is deprecated in favor of `llama-index-llms-ibm`. To install the recommended package, which uses `ibm-watsonx-ai` under the hood, run `pip install -qU llama-index-llms-ibm`. Then load a model through the IBM watsonx.ai integration as follows:
```python
from llama_index.llms.ibm import WatsonxLLM

# Example generation settings (illustrative values; tune for your use case).
temperature = 0.2
max_new_tokens = 256
additional_params = {"min_new_tokens": 1}

watsonx_llm = WatsonxLLM(
    model_id="PASTE THE CHOSEN MODEL_ID HERE",
    url="PASTE YOUR URL HERE",
    apikey="PASTE YOUR IBM APIKEY HERE",
    project_id="PASTE YOUR PROJECT_ID HERE",
    temperature=temperature,
    max_new_tokens=max_new_tokens,
    additional_params=additional_params,
)
```
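Once constructed with valid credentials, the model is queried through LlamaIndex's common LLM interface. A minimal sketch, assuming a `watsonx_llm` instance built as above and a reachable watsonx.ai endpoint (`complete` and `stream_complete` are the standard LlamaIndex `LLM` methods, not watsonx-specific additions):

```python
# Assumes `watsonx_llm` was created as shown above with valid credentials;
# these calls send requests to the IBM watsonx.ai service.
response = watsonx_llm.complete("What is a large language model?")
print(response.text)

# Streaming variant: print tokens as they arrive.
for chunk in watsonx_llm.stream_complete("Name three uses of embeddings."):
    print(chunk.delta, end="")
```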
## Raw data

```json
{
  "_id": null,
  "home_page": null,
  "name": "llama-index-llms-watsonx",
  "maintainer": null,
  "docs_url": null,
  "requires_python": "<4.0,>=3.10",
  "maintainer_email": null,
  "keywords": null,
  "author": "Your Name",
  "author_email": "you@example.com",
  "download_url": "https://files.pythonhosted.org/packages/a1/ff/2020b33b61629c00bf9a702d193f933684a692e47291bc2e0b555c77fa28/llama_index_llms_watsonx-0.1.8.tar.gz",
  "platform": null,
  "description": "# LlamaIndex Llms Integration: Watsonx\n\nThe usage of `llama-index-llms-watsonx` is deprecated in favor of `llama-index-llms-ibm`. To install recommended package that uses `ibm-watsonx-ai` underneath, run `pip install -qU llama-index-llms-ibm`. Use following to load the model using an IBM watsonx.ai integration\n\n```python\nfrom llama_index.llms.ibm import WatsonxLLM\n\nwatsonx_llm = WatsonxLLM(\n    model_id=\"PASTE THE CHOSEN MODEL_ID HERE\",\n    url=\"PASTE YOUR URL HERE\",\n    apikey=\"PASTE YOUR IBM APIKEY HERE\",\n    project_id=\"PASTE YOUR PROJECT_ID HERE\",\n    temperature=temperature,\n    max_new_tokens=max_new_tokens,\n    additional_params=additional_params,\n)\n```\n",
  "bugtrack_url": null,
  "license": "MIT",
  "summary": "llama-index llms watsonx integration",
  "version": "0.1.8",
  "project_urls": null,
  "split_keywords": [],
  "urls": [
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "f80d98beb8170b0f464f1ed75d32e801c816df9ba0511bdfa45748e573f38481",
        "md5": "081459c6aef07c25bce0e2f827170180",
        "sha256": "a6023e43c9c43949db13f46d0ebba923e7beaa4ef2c788a087eb1d951d4d344b"
      },
      "downloads": -1,
      "filename": "llama_index_llms_watsonx-0.1.8-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "081459c6aef07c25bce0e2f827170180",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": "<4.0,>=3.10",
      "size": 5115,
      "upload_time": "2024-06-12T20:50:14",
      "upload_time_iso_8601": "2024-06-12T20:50:14.626060Z",
      "url": "https://files.pythonhosted.org/packages/f8/0d/98beb8170b0f464f1ed75d32e801c816df9ba0511bdfa45748e573f38481/llama_index_llms_watsonx-0.1.8-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "a1ff2020b33b61629c00bf9a702d193f933684a692e47291bc2e0b555c77fa28",
        "md5": "ffe4029dd283bfb95712bc0206b61460",
        "sha256": "09cb3517004a3980b0c3649030696d85e408309cf102de45b43d6e113088c8f6"
      },
      "downloads": -1,
      "filename": "llama_index_llms_watsonx-0.1.8.tar.gz",
      "has_sig": false,
      "md5_digest": "ffe4029dd283bfb95712bc0206b61460",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": "<4.0,>=3.10",
      "size": 4392,
      "upload_time": "2024-06-12T20:50:16",
      "upload_time_iso_8601": "2024-06-12T20:50:16.132629Z",
      "url": "https://files.pythonhosted.org/packages/a1/ff/2020b33b61629c00bf9a702d193f933684a692e47291bc2e0b555c77fa28/llama_index_llms_watsonx-0.1.8.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2024-06-12 20:50:16",
  "github": false,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "lcname": "llama-index-llms-watsonx"
}
```