# WatsonX AutoDeploy

**PyPI package:** watsonx-autodeploy 0.1.2 (uploaded 2025-08-27, requires Python >=3.11)
**Author:** Nicholas Renotte (nick@nichollasrenotte.com)
**Homepage:** https://github.com/nicknochnack/watsonx-autodeploy
**Keywords:** watsonx, ai, deployment, langgraph, automation

Automated deployment library for IBM WatsonX AI services with LangGraph support.

## Overview

WatsonX AutoDeploy simplifies deploying AI services to IBM WatsonX by providing a clean, Pythonic interface for:

- Creating custom environments with specified dependencies
- Building software specifications 
- Storing and deploying AI services
- Managing the entire deployment lifecycle

## Installation

### From PyPI (when published)
```bash
pip install watsonx-autodeploy
```

### From Source
```bash
git clone https://github.com/nicknochnack/watsonx-autodeploy.git
cd watsonx-autodeploy
pip install -e .
```

### Development Installation
```bash
git clone https://github.com/nicknochnack/watsonx-autodeploy.git
cd watsonx-autodeploy
pip install -e ".[dev]"
```

## Quick Start

### Environment Setup

First, set up your environment variables:

```bash
# .env file
WATSONX_URL=your_watsonx_url
WATSONX_APIKEY=your_api_key
WATSONX_SPACEID=your_space_id
MODEL_ID=your_model_id
```
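
Before deploying, it can help to confirm these variables are actually set. The check below is a hypothetical pre-flight helper (not part of the library); it assumes the four variable names shown above:

```python
import os

# Hypothetical pre-flight check (not part of the library): confirm the
# variables the Deployer expects are present before deploying.
REQUIRED_VARS = ["WATSONX_URL", "WATSONX_APIKEY", "WATSONX_SPACEID", "MODEL_ID"]

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

print("Missing:", missing_vars())
```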

### Basic Usage

```python
from dotenv import load_dotenv
from autodeploy import Deployer

load_dotenv()

def my_ai_service(context, **kwargs):
    # Your AI service implementation
    def generate(context):
        return {"body": {"message": "Hello from WatsonX!"}}

    def generate_stream(context):
        # Optional streaming variant; left unimplemented in this example
        pass

    return generate, generate_stream

# Deploy with one line
deployer = Deployer()
deployer.autodeploy(my_ai_service)
```
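
Because the service is returned as plain Python callables, you can sanity-check the response shape locally, with no WatsonX connection, before deploying. A quick check assuming the function shape shown above:

```python
# Local smoke test (no WatsonX connection needed): call the service function
# directly and inspect the response shape before deploying.
def my_ai_service(context, **kwargs):
    def generate(context):
        return {"body": {"message": "Hello from WatsonX!"}}

    def generate_stream(context):
        pass

    return generate, generate_stream

generate, generate_stream = my_ai_service(context=None)
response = generate(None)
print(response["body"]["message"])  # -> Hello from WatsonX!
```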

### Advanced Usage

```python
from autodeploy import Deployer

# my_ai_service is the service function defined in Basic Usage above
deployer = Deployer()

# Step-by-step deployment with custom parameters
deployer.export_config(
    python_version="3.11",
    channels="conda-forge",
    dependencies=["custom-package==1.0.0"],
    prefix="/opt/anaconda3/envs/custom"
)

deployer.build_environment(
    environment_name="my-custom-env",
    base_runtime="runtime-24.1-py3.11"
)

deployer.build_software_spec(
    spec_name="my-spec",
    spec_description="Custom specification"
)

deployer.store_service(
    deployable_ai_service=my_ai_service,
    service_name="my-service"
)

deployer.deploy_service(deployment_name="production-deployment")
```

## API Reference

### Deployer Class

The main class for managing WatsonX deployments.

#### `__init__()`
Initializes the deployer with WatsonX credentials from environment variables.

#### `autodeploy(deployable_function)`
One-step deployment method that handles the entire deployment pipeline.

**Parameters:**
- `deployable_function`: The AI service function to deploy
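
Presumably this method chains the step-by-step methods documented below. The assumed order is sketched here as an illustration; it is not the library's actual source:

```python
# Assumed pipeline behind autodeploy() -- an illustration built from the
# documented step-by-step methods, not the library's implementation.
def autodeploy_sketch(deployer, deployable_function):
    deployer.export_config()                     # 1. write the conda config
    deployer.build_environment()                 # 2. create the package extension
    deployer.build_software_spec()               # 3. create the software spec
    deployer.store_service(deployable_function)  # 4. store the service
    deployer.deploy_service()                    # 5. deploy it
```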

#### `export_config(python_version=None, channels=None, dependencies=None, prefix=None)`
Creates a conda environment configuration file.

**Parameters:**
- `python_version` (str, optional): Python version (default: "3.11")
- `channels` (str, optional): Conda channels (default: "empty") 
- `dependencies` (list, optional): List of pip dependencies
- `prefix` (str, optional): Environment prefix path
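
To make the parameters concrete, here is a hypothetical sketch of rendering a conda-style configuration from these arguments. The exact file layout `export_config` writes is an assumption for illustration, not the library's documented output:

```python
# Hypothetical sketch of the kind of conda config export_config() might
# produce; the field layout is an assumption for illustration only.
def render_conda_config(python_version="3.11", channels=None,
                        dependencies=None, prefix=None):
    lines = ["channels:"]
    for ch in (channels.split(",") if channels else ["empty"]):
        lines.append(f"  - {ch.strip()}")
    lines.append("dependencies:")
    lines.append(f"  - python={python_version}")
    if dependencies:
        lines.append("  - pip:")
        lines.extend(f"    - {dep}" for dep in dependencies)
    if prefix:
        lines.append(f"prefix: {prefix}")
    return "\n".join(lines)

print(render_conda_config(dependencies=["custom-package==1.0.0"],
                          prefix="/opt/anaconda3/envs/custom"))
```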

#### `build_environment(python_version=None, environment_name=None, base_runtime=None)`
Creates a package extension for the environment.

**Parameters:**
- `python_version` (str, optional): Python version
- `environment_name` (str, optional): Name for the environment
- `base_runtime` (str, optional): Base runtime specification

#### `build_software_spec(spec_name=None, spec_description=None)`
Creates a software specification.

**Parameters:**
- `spec_name` (str, optional): Name for the software specification
- `spec_description` (str, optional): Description of the specification

#### `store_service(deployable_ai_service, service_name=None)`
Stores the AI service in the WatsonX repository.

**Parameters:**
- `deployable_ai_service`: The AI service function
- `service_name` (str, optional): Name for the service

#### `deploy_service(deployment_name=None)`
Deploys the stored AI service.

**Parameters:**
- `deployment_name` (str, optional): Name for the deployment

## Examples

See the `examples/` directory for complete usage examples:

- `examples/basic_usage.py` - Simple one-line deployment
- `examples/advanced_usage.py` - Step-by-step deployment with custom parameters

## Requirements

- Python 3.11+
- IBM WatsonX AI account and credentials
- Required dependencies (automatically installed):
  - ibm-watsonx-ai>=1.3.34
  - langchain>=0.3.27
  - langchain-ibm>=0.3.15
  - langgraph>=0.6.5
  - python-dotenv>=1.1.1

## Development

### Running Tests
```bash
pytest
```

### Code Formatting
```bash
black src/ examples/
```

### Type Checking
```bash
mypy src/
```

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Run the test suite
6. Submit a pull request

## License

MIT License. See LICENSE file for details.

## Support

For issues and questions:
- Open an issue on GitHub

## Changelog

### v0.1.0
- Initial release
- Basic deployment functionality
- Support for custom environments and dependencies
- One-step autodeploy feature

            
