llama-index-embeddings-ibm

Version: 0.3.0
Summary: llama-index embeddings IBM watsonx.ai integration
Author: IBM
License: MIT
Requires Python: <4.0,>=3.10
Upload time: 2024-11-17 23:07:57

# LlamaIndex Embeddings Integration: IBM

This package integrates the LlamaIndex embeddings API with the IBM watsonx.ai Foundation Models API by leveraging the `ibm-watsonx-ai` [SDK](https://ibm.github.io/watsonx-ai-python-sdk/index.html). With this integration, you can use any of the embedding models available in IBM watsonx.ai to embed a single string or a list of strings.

## Installation

```bash
pip install llama-index-embeddings-ibm
```

## Usage

### Setting up

To use IBM's models, you must have an IBM Cloud user API key. Here's how to obtain and set up your API key:

1. **Obtain an API Key:** For more details on how to create and manage an API key, refer to [Managing user API keys](https://cloud.ibm.com/docs/account?topic=account-userapikey&interface=ui).
2. **Set the API Key as an Environment Variable:** For security reasons, it's recommended that you do not hard-code your API key directly in your scripts. Instead, set it as an environment variable. You can use the following code to prompt for the API key and store it in the `WATSONX_APIKEY` environment variable:

```python
import os
from getpass import getpass

# Prompt for the API key without echoing it, then expose it to the SDK
watsonx_api_key = getpass()
os.environ["WATSONX_APIKEY"] = watsonx_api_key
```

Alternatively, you can set the environment variable in your terminal.

- **Linux/macOS:** Open your terminal and execute the following command:

  ```bash
  export WATSONX_APIKEY='your_ibm_api_key'
  ```

  To make this environment variable persistent across terminal sessions, add the above line to your `~/.bashrc`, `~/.bash_profile`, or `~/.zshrc` file.

- **Windows:** For Command Prompt, use:

  ```cmd
  set WATSONX_APIKEY=your_ibm_api_key
  ```

  Note that `set` applies only to the current Command Prompt session; use `setx WATSONX_APIKEY "your_ibm_api_key"` to persist the variable across sessions.

### Load the model

You might need to adjust embedding parameters for different tasks. For example, `truncate_input_tokens` sets the maximum number of input tokens per text; longer inputs are truncated rather than rejected by the service.

```python
truncate_input_tokens = 3
```

Initialize the `WatsonxEmbeddings` class with the previously set parameters.

**Note**:

- To provide context for the API call, you must pass the `project_id` or `space_id`. To get your project or space ID, open your project or space, go to the **Manage** tab, and click **General**. For more information see: [Project documentation](https://www.ibm.com/docs/en/watsonx-as-a-service?topic=projects) or [Deployment space documentation](https://www.ibm.com/docs/en/watsonx/saas?topic=spaces-creating-deployment).
- Depending on the region of your provisioned service instance, use one of the URLs listed in [watsonx.ai API Authentication](https://ibm.github.io/watsonx-ai-python-sdk/setup_cloud.html#authentication).

In this example, we use the `project_id` and the Dallas URL.

You also need to specify the `model_id` of the model that will be used for inference.

```python
from llama_index.embeddings.ibm import WatsonxEmbeddings

watsonx_embedding = WatsonxEmbeddings(
    model_id="ibm/slate-125m-english-rtrvr",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="PASTE YOUR PROJECT_ID HERE",
    truncate_input_tokens=truncate_input_tokens,
)
```
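
If you work in a deployment space rather than a project, you can point the model at the space instead. A minimal sketch, assuming the integration also accepts a `space_id` argument (as the underlying `ibm-watsonx-ai` SDK does):

```python
# Assumed space-scoped variant: space_id replaces project_id
watsonx_embedding = WatsonxEmbeddings(
    model_id="ibm/slate-125m-english-rtrvr",
    url="https://us-south.ml.cloud.ibm.com",
    space_id="PASTE YOUR SPACE_ID HERE",
    truncate_input_tokens=truncate_input_tokens,
)
```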

Alternatively, you can use Cloud Pak for Data credentials. For details, see [watsonx.ai software setup](https://ibm.github.io/watsonx-ai-python-sdk/setup_cpd.html).

```python
watsonx_embedding = WatsonxEmbeddings(
    model_id="ibm/slate-125m-english-rtrvr",
    url="PASTE YOUR URL HERE",
    username="PASTE YOUR USERNAME HERE",
    password="PASTE YOUR PASSWORD HERE",
    instance_id="openshift",
    version="4.8",
    project_id="PASTE YOUR PROJECT_ID HERE",
    truncate_input_tokens=truncate_input_tokens,
)
```
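
If your Cloud Pak for Data cluster is set up for API-key authentication rather than a password, a hedged variant of the same call, assuming the class also accepts an `apikey` argument:

```python
watsonx_embedding = WatsonxEmbeddings(
    model_id="ibm/slate-125m-english-rtrvr",
    url="PASTE YOUR URL HERE",
    username="PASTE YOUR USERNAME HERE",
    apikey="PASTE YOUR APIKEY HERE",  # assumed alternative to password
    instance_id="openshift",
    version="4.8",
    project_id="PASTE YOUR PROJECT_ID HERE",
    truncate_input_tokens=truncate_input_tokens,
)
```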

### Embed query

```python
query = "Example query."

query_result = watsonx_embedding.get_query_embedding(query)
print(query_result[:5])
```
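
Each embedding is returned as a plain Python list of floats. A quick way to check the vector dimensionality, which depends on the chosen model:

```python
# Length of the embedding vector (model-dependent)
print(len(query_result))
```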

### Embed list of texts

```python
texts = ["This is the content of one document", "This is another document"]

doc_result = watsonx_embedding.get_text_embedding_batch(texts)
print(doc_result[0][:5])
```
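
### Use with a LlamaIndex index

To use the watsonx.ai model as the embedding model for indexing and retrieval, register it with LlamaIndex. A minimal sketch, assuming `llama-index-core` is installed; the `Document` objects here are placeholders:

```python
from llama_index.core import Document, Settings, VectorStoreIndex

# Use the watsonx.ai embedding model for all indexing and querying in this process
Settings.embed_model = watsonx_embedding

documents = [
    Document(text="This is the content of one document"),
    Document(text="This is another document"),
]

# Embeddings are computed with watsonx.ai when the index is built
index = VectorStoreIndex.from_documents(documents)

# Retrieval uses the same embedding model and does not require an LLM
nodes = index.as_retriever(similarity_top_k=1).retrieve("Example query.")
print(nodes[0].node.get_content())
```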

            
