langchain-google-vertexai

Name: langchain-google-vertexai
Version: 2.0.10
Summary: An integration package connecting Google VertexAI and LangChain
Home page: https://github.com/langchain-ai/langchain-google
Upload time: 2025-01-07 14:11:32
Requires Python: <4.0,>=3.9
License: MIT
# langchain-google-vertexai

This package contains the LangChain integrations for Google Cloud generative models.

## Installation

```bash
pip install -U langchain-google-vertexai
```

## Chat Models

The `ChatVertexAI` class exposes models such as `gemini-pro` and `chat-bison`.

To use it, you need a Google Cloud project with the Vertex AI API enabled and credentials configured. Initialize the model as follows:

```python
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro")
llm.invoke("Sing a ballad of LangChain.")
```
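
If you work with several Google Cloud projects or want to control generation settings, you can pass them explicitly at construction time. A minimal sketch, where the project ID and parameter values are placeholders:

```python
from langchain_google_vertexai import ChatVertexAI

# The project ID and location below are hypothetical placeholders; if omitted,
# they are picked up from your configured credentials/environment.
llm = ChatVertexAI(
    model_name="gemini-pro",
    project="my-gcp-project",
    location="us-central1",
    temperature=0.2,
    max_output_tokens=512,
)
llm.invoke("Sing a ballad of LangChain.")
```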

You can use other models, e.g. `chat-bison`:

```python
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="chat-bison", temperature=0.3)
llm.invoke("Sing a ballad of LangChain.")
```
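
Like other LangChain chat models, `ChatVertexAI` also supports the standard streaming interface; a minimal sketch:

```python
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro")

# Print chunks as they arrive instead of waiting for the full response.
for chunk in llm.stream("Sing a ballad of LangChain."):
    print(chunk.content, end="", flush=True)
```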

### Multimodal inputs

The Gemini vision model supports image inputs when they are provided within a single chat message. For example:

```python
from langchain_core.messages import HumanMessage
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro-vision")
# Build a multimodal message that mixes a text part with an image part
message = HumanMessage(
    content=[
        {
            "type": "text",
            "text": "What's in this image?",
        },  # You can optionally provide text parts
        {"type": "image_url", "image_url": {"url": "https://picsum.photos/seed/picsum/200/300"}},
    ]
)
llm.invoke([message])
```

The value of `image_url` can be any of the following (a sketch of the base64 case follows the list):

- A public image URL
- An accessible Google Cloud Storage file (e.g., "gs://path/to/file.png")
- A base64-encoded image (e.g., `data:image/png;base64,abcd124`)
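
For instance, a locally stored image can be read, encoded as a base64 data URL, and passed directly; a minimal sketch (the file path is a hypothetical placeholder):

```python
import base64

from langchain_core.messages import HumanMessage
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro-vision")

# Encode a local image (hypothetical path) as a base64 data URL.
with open("path/to/local_image.png", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

message = HumanMessage(
    content=[
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{encoded}"}},
    ]
)
llm.invoke([message])
```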

## Embeddings

You can use Google Cloud's embedding models as follows:

```python
from langchain_google_vertexai import VertexAIEmbeddings

embeddings = VertexAIEmbeddings(model_name="text-embedding-004")  # specify an embedding model
embeddings.embed_query("hello, world!")
```
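
Beyond single queries, the standard LangChain embeddings interface also lets you embed a batch of documents at once; a minimal sketch (the model name is one example of an available embedding model):

```python
from langchain_google_vertexai import VertexAIEmbeddings

embeddings = VertexAIEmbeddings(model_name="text-embedding-004")

# Embed a batch of documents; returns one vector per input text.
vectors = embeddings.embed_documents(
    [
        "LangChain is a framework for LLM applications.",
        "Vertex AI hosts Google's foundation models.",
    ]
)
print(len(vectors), len(vectors[0]))
```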

## LLMs

You can use Google Cloud's generative AI models as LangChain LLMs:

```python
from langchain_core.prompts import PromptTemplate
from langchain_google_vertexai import ChatVertexAI

template = """Question: {question}

Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)

llm = ChatVertexAI(model_name="gemini-pro")
chain = prompt | llm

question = "Who was the president of the USA in 1994?"
print(chain.invoke({"question": question}))
```
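
Because `ChatVertexAI` returns a message object, you can append the standard `StrOutputParser` to get a plain string out of the chain; a minimal sketch:

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_google_vertexai import ChatVertexAI

prompt = PromptTemplate.from_template(
    "Question: {question}\n\nAnswer: Let's think step by step."
)
llm = ChatVertexAI(model_name="gemini-pro")

# The output parser extracts the text content from the returned AIMessage.
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"question": "Who was the president of the USA in 1994?"}))
```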

You can use both Gemini and PaLM models, including code-generation ones:

```python
from langchain_google_vertexai import VertexAI

llm = VertexAI(model_name="code-bison", max_output_tokens=1000, temperature=0.3)

question = "Write a python function that checks if a string is a valid email address"

# Use invoke() rather than calling the LLM object directly (the __call__ style is deprecated).
output = llm.invoke(question)
```


            
