# langchain-custom-models

- Name: langchain-custom-models
- Version: 0.1.4
- Summary: Custom Langchain chat models for various large model service providers
- Upload time: 2025-08-19 14:03:28
- Requires Python: >=3.10
- Keywords: langchain, volcengine, llm, agent
- Requirements: no requirements were recorded
# Langchain Custom Models

This repository provides a collection of custom `ChatModel` integrations for `Langchain`, enabling support for various Large Language Model (LLM) service providers. The goal is to offer a unified interface for models that are not yet officially supported by Langchain.

Currently, the following provider is supported:
- **Volcengine Ark**

## Features

- **Seamless Integration**: Drop-in replacement for any Langchain `ChatModel`.
- **Volcengine Ark Support**: Full support for `ChatVolcEngine` to interact with Volcengine's Ark API.
- **Standard Interface**: Works with standard Langchain message types (`SystemMessage`, `HumanMessage`, `AIMessage`, `ToolMessage`).
- **Tool Binding Support**: Full support for `bind_tools()` method for function calling capabilities.

## Installation

You can install the package directly from this repository:

```bash
pip install git+https://github.com/Hellozaq/langchain-custom-models.git
```

Additionally, ensure you have the Volcengine Ark SDK installed:

```bash
pip install "volcengine-python-sdk[ark]"
```

## Usage

### ChatVolcEngine

To use the `ChatVolcEngine` model, you need to provide your Volcengine Ark API key. The recommended approach is to use a `.env` file to manage your credentials securely.

**1. Create a `.env` file**

In your project's root directory, create a file named `.env` and add your API key:

```
VOLCANO_API_KEY="your-ark-api-key"
```
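Alternatively, you can set the variable directly in your process (for example, from a secrets manager) before initializing the model, which avoids the `.env` file entirely:

```python
import os

# Set the key programmatically instead of using a .env file.
os.environ["VOLCANO_API_KEY"] = "your-ark-api-key"
```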

**2. Load Credentials and Use the Model**

Now, you can use `dotenv` to load the API key from the `.env` file into your environment.

Here is a basic example:

```python
from dotenv import load_dotenv
from langchain_custom_models import ChatVolcEngine
from langchain_core.messages import HumanMessage, SystemMessage

# Load environment variables from .env file
load_dotenv()

# Initialize the chat model
# The API key is automatically read from the VOLCANO_API_KEY environment variable.
# Replace 'your-model-id' with the actual model ID from Volcengine Ark, e.g., 'deepseek-v3-250324'
llm = ChatVolcEngine(model="your-model-id")

# Prepare messages
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="Hello, who are you?"),
]

# Get a response
response = llm.invoke(messages)

print(response.content)
```

### Parameters

- `model` (str): **Required**. The model ID from Volcengine Ark (e.g., `deepseek-v3-250324`).
- `ark_api_key` (Optional[str]): Your Volcengine Ark API key. If not provided, it will be read from the `VOLCANO_API_KEY` environment variable.
- `max_tokens` (int): The maximum number of tokens to generate. Defaults to `4096`.
- `temperature` (float): Controls the randomness of the output. Defaults to `0.7`.
- `top_p` (float): Nucleus sampling parameter. Defaults to `1.0`.
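Putting these together, a fully parameterized initialization might look like the following sketch (the values are illustrative, not recommendations):

```python
from langchain_custom_models import ChatVolcEngine

llm = ChatVolcEngine(
    model="your-model-id",           # required: model ID from Volcengine Ark
    ark_api_key="your-ark-api-key",  # optional; falls back to VOLCANO_API_KEY
    max_tokens=2048,                 # default 4096
    temperature=0.2,                 # default 0.7; lower is more deterministic
    top_p=0.9,                       # default 1.0
)
```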

## Contributing

Contributions are welcome! If you would like to add support for a new LLM provider or improve existing integrations, please feel free to open a pull request.

## License

This project is licensed under the MIT License. See the `LICENSE` file for details.

            
