gigachain-ibm

- Version: 0.1.8
- Summary: An integration package connecting IBM watsonx.ai and GigaChain
- Homepage: https://github.com/ai-forever/gigachain
- Author: IBM
- License: MIT
- Requires Python: <4.0,>=3.10
- Uploaded: 2024-07-04 15:48:07
# langchain-ibm

This package provides the integration between LangChain and IBM watsonx.ai through the `ibm-watsonx-ai` SDK.

## Installation

Install the package with pip:

```bash
pip install gigachain-ibm
```

## Usage

### Setting up

To use IBM's models, you must have an IBM Cloud user API key. Here's how to obtain and set up your API key:

1. **Obtain an API Key:** For more details on how to create and manage an API key, refer to IBM's [documentation](https://cloud.ibm.com/docs/account?topic=account-userapikey&interface=ui).
2. **Set the API Key as an Environment Variable:** For security reasons, it's recommended not to hard-code your API key in your scripts. Instead, set it as an environment variable. The following code prompts for the API key and stores it in an environment variable:

```python
import os
from getpass import getpass

watsonx_api_key = getpass()
os.environ["WATSONX_APIKEY"] = watsonx_api_key
```

Alternatively, you can set the environment variable in your terminal:

- **Linux/macOS:** Open your terminal and execute the following command:
     ```bash
     export WATSONX_APIKEY='your_ibm_api_key'
     ```
     To make this environment variable persistent across terminal sessions, add the above line to your `~/.bashrc`, `~/.bash_profile`, or `~/.zshrc` file.

- **Windows:** For Command Prompt, use:
    ```cmd
    set WATSONX_APIKEY=your_ibm_api_key
    ```
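    To persist the variable across sessions, you can use `setx` instead (note that `setx` takes effect in new Command Prompt sessions, not the current one):
    ```cmd
    setx WATSONX_APIKEY "your_ibm_api_key"
    ```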

### Loading the model

You might need to adjust model parameters for different models or tasks. For more details on the parameters, refer to IBM's [documentation](https://ibm.github.io/watsonx-ai-python-sdk/fm_model.html#metanames.GenTextParamsMetaNames).

```python
parameters = {
    "decoding_method": "sample",
    "max_new_tokens": 100,
    "min_new_tokens": 1,
    "temperature": 0.5,
    "top_k": 50,
    "top_p": 1,
}
```
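As a rough intuition for what `top_k` and `top_p` control during `sample` decoding, the standalone sketch below (plain Python, no SDK required; the toy vocabulary and probabilities are invented for demonstration) filters a token distribution the way a sampling decoder would before drawing the next token:

```python
# Toy illustration of the top_k / top_p filtering used by "sample" decoding.
# The vocabulary and probabilities below are invented for demonstration.

def filter_top_k_top_p(probs, top_k, top_p):
    """Keep the top_k most likely tokens, then the smallest prefix of them
    whose cumulative probability reaches top_p; renormalize the survivors."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    kept, cumulative = [], 0.0
    for token, p in ranked:
        kept.append((token, p))
        cumulative += p
        if cumulative >= top_p:
            break
    total = sum(p for _, p in kept)
    return {token: p / total for token, p in kept}

toy = {"dog": 0.5, "cat": 0.3, "fish": 0.15, "rock": 0.05}
print(filter_top_k_top_p(toy, top_k=3, top_p=0.75))
# only "dog" and "cat" survive; their probabilities are renormalized
```

Lower `top_p` or `top_k` makes output more deterministic; `temperature` similarly flattens or sharpens the distribution before this filtering is applied.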

Initialize the `WatsonxLLM` class with the parameters set above.

```python
from langchain_ibm import WatsonxLLM

watsonx_llm = WatsonxLLM(
    model_id="PASTE THE CHOSEN MODEL_ID HERE",
    url="PASTE YOUR URL HERE",
    project_id="PASTE YOUR PROJECT_ID HERE",
    params=parameters,
)
```

**Note:**
- You must provide a `project_id` or `space_id`. For more information, refer to IBM's [documentation](https://www.ibm.com/docs/en/watsonx-as-a-service?topic=projects).
- Depending on the region of your provisioned service instance, use one of the URLs described [here](https://ibm.github.io/watsonx-ai-python-sdk/setup_cloud.html#authentication).
- You need to specify the model you want to use for inferencing through `model_id`. You can find the list of available models [here](https://ibm.github.io/watsonx-ai-python-sdk/fm_model.html#ibm_watsonx_ai.foundation_models.utils.enums.ModelTypes).
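If you prefer to keep these values out of your source entirely, you can assemble the constructor arguments from environment variables, as was done for the API key. A minimal sketch, assuming the illustrative variable names `WATSONX_URL` and `WATSONX_PROJECT_ID` (only `WATSONX_APIKEY` is a name the integration itself reads):

```python
import os

# Hypothetical helper: gather WatsonxLLM keyword arguments from the
# environment. WATSONX_URL and WATSONX_PROJECT_ID are illustrative names
# chosen here; WATSONX_APIKEY is the only variable the integration reads.
def watsonx_kwargs_from_env():
    names = {"url": "WATSONX_URL", "project_id": "WATSONX_PROJECT_ID"}
    kwargs, missing = {}, []
    for arg, var in names.items():
        value = os.environ.get(var)
        if value:
            kwargs[arg] = value
        else:
            missing.append(var)
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return kwargs

# Usage (model_id is still chosen explicitly):
# watsonx_llm = WatsonxLLM(model_id="...", params=parameters, **watsonx_kwargs_from_env())
```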


Alternatively, you can use Cloud Pak for Data credentials. For more details, refer to IBM's [documentation](https://ibm.github.io/watsonx-ai-python-sdk/setup_cpd.html).

```python
watsonx_llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",
    url="PASTE YOUR URL HERE",
    username="PASTE YOUR USERNAME HERE",
    password="PASTE YOUR PASSWORD HERE",
    instance_id="openshift",
    version="4.8",
    project_id="PASTE YOUR PROJECT_ID HERE",
    params=parameters,
)
```

### Create a Chain

Create a `PromptTemplate` object that will be responsible for generating a random question.

```python
from langchain.prompts import PromptTemplate

template = "Generate a random question about {topic}: Question: "
prompt = PromptTemplate.from_template(template)
```

Provide a topic and run the `LLMChain`.

```python
from langchain.chains import LLMChain

llm_chain = LLMChain(prompt=prompt, llm=watsonx_llm)
response = llm_chain.invoke("dog")
print(response)
```
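Under the hood, the chain simply formats the prompt template with your input and passes the result to the model. The sketch below makes that data flow explicit with a stand-in model function (`fake_llm` is invented for illustration; a real chain would call `WatsonxLLM`):

```python
# Prompt -> LLM data flow of the chain above, with a stand-in model.
def fake_llm(prompt_text):
    # A real chain would send prompt_text to watsonx.ai here.
    return f"[completion for: {prompt_text!r}]"

def run_chain(template, llm, **variables):
    prompt_text = template.format(**variables)  # what PromptTemplate does
    return llm(prompt_text)

print(run_chain("Generate a random question about {topic}: Question: ",
                fake_llm, topic="dog"))
```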

### Calling the Model Directly

To obtain completions, you can call the model directly using a string prompt.

```python
# Calling a single prompt

response = watsonx_llm.invoke("Who is man's best friend?")
print(response)
```

```python
# Calling multiple prompts

response = watsonx_llm.generate(
    [
        "The fastest dog in the world?",
        "Describe your chosen dog breed",
    ]
)
print(response)
```

### Streaming the Model Output

You can stream the model output.

```python
for chunk in watsonx_llm.stream(
    "Describe your favorite breed of dog and why it is your favorite."
):
    print(chunk, end="")
```
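Each `chunk` is a string fragment, so if you also want the complete text afterwards, accumulate the chunks while printing. A sketch with a stand-in generator (`fake_stream` is invented here; a real call would be `watsonx_llm.stream(...)`):

```python
# Accumulate streamed chunks into the complete response while displaying them.
def fake_stream():
    # Stands in for watsonx_llm.stream(prompt), which yields string chunks.
    yield from ["My favorite ", "breed is ", "the Border Collie."]

pieces = []
for chunk in fake_stream():
    print(chunk, end="")  # display incrementally, as in the loop above
    pieces.append(chunk)

full_text = "".join(pieces)
```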

            
