superb-ai-onprem

Name: superb-ai-onprem
Version: 0.4.4
Home page: https://github.com/Superb-AI-Suite/superb-ai-onprem-python
Summary: Python SDK for Superb AI On-premise
Author: Superb AI
Requires Python: >=3.8
Upload time: 2025-08-06 09:52:50
# Superb AI On-premise SDK

Python SDK for Superb AI's On-premise solution. This SDK provides a simple interface to interact with your on-premise Superb AI installation.

## Installation

```bash
pip install superb-ai-onprem
```

## Quick Start

```python
from io import BytesIO

from spb_onprem import DatasetService, DataService

# Initialize services
dataset_service = DatasetService()
data_service = DataService()

# Create a dataset
dataset = dataset_service.create_dataset(
    name="my-dataset",
    description="My first dataset"
)

# Upload an image with annotation
with open("image.jpg", "rb") as f:
    image_data = BytesIO(f.read())

data = data_service.create_image_data(
    dataset_id=dataset.id,
    key="image_1",
    image_content=image_data,
    annotation={
        "labels": ["car", "person"],
        "boxes": [
            {"x": 100, "y": 100, "width": 200, "height": 200}
        ]
    }
)
```

## Features

- Dataset Management
  - Create, update, and delete datasets
  - List and filter datasets
- Data Management
  - Upload images with annotations
  - Update annotations
  - Add/remove data from slices
  - Manage metadata
- Slice Management
  - Create and manage data slices
  - Filter and organize your data

## Usage Examples

### Dataset Operations

```python
from spb_onprem import DatasetService
from spb_onprem import DatasetsFilter, DatasetsFilterOptions

# Initialize service
dataset_service = DatasetService()

# Create a dataset
dataset = dataset_service.create_dataset(
    name="my-dataset",
    description="Dataset description"
)

# List datasets with filtering
datasets_filter = DatasetsFilter(
    must_filter=DatasetsFilterOptions(
        name_contains="test"
    )
)
datasets = dataset_service.get_datasets(filter=datasets_filter)
```

### Data Operations

```python
from spb_onprem import DataService
from spb_onprem import DataListFilter, DataFilterOptions

# Initialize service
data_service = DataService()

# List data with filtering
data_filter = DataListFilter(
    must_filter=DataFilterOptions(
        key_contains="image_",
        annotation_exists=True
    )
)
data_list = data_service.get_data_list(
    dataset_id="your-dataset-id",
    filter=data_filter
)

# Update annotation
data_service.update_annotation(
    dataset_id="your-dataset-id",
    data_id="your-data-id",
    annotation={
        "labels": ["updated_label"],
        "boxes": [...]
    }
)
```

### Slice Operations

```python
from spb_onprem import SliceService, DataService

# Initialize services
slice_service = SliceService()
data_service = DataService()

# Create a slice
validation_slice = slice_service.create_slice(
    dataset_id="your-dataset-id",
    name="validation-set",
    description="Validation data slice"
)

# Add data to the slice (adding data to slices is handled by DataService)
data_service.add_data_to_slice(
    dataset_id="your-dataset-id",
    data_id="your-data-id",
    slice_id=validation_slice.id
)
```

## Error Handling

The SDK provides specific error types for different scenarios:

```python
from spb_onprem.exceptions import (
    BadParameterError,
    NotFoundError,
    UnknownError
)

from spb_onprem import DatasetService

dataset_service = DatasetService()

try:
    dataset = dataset_service.get_dataset(dataset_id="non-existent-id")
except NotFoundError:
    print("Dataset not found")
except BadParameterError as e:
    print(f"Invalid parameter: {e}")
except UnknownError as e:
    print(f"An unexpected error occurred: {e}")
```
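Transient backend failures can be worth retrying before giving up. The sketch below is illustrative, not part of the SDK: it defines a local stub in place of `spb_onprem.exceptions.UnknownError` so it runs standalone; in real code you would import the SDK exception instead.

```python
import time

# Stub standing in for spb_onprem.exceptions.UnknownError, so this
# sketch runs standalone; import the real SDK exception in practice.
class UnknownError(Exception):
    pass

def with_retries(func, attempts=3, delay=0.5):
    """Call func(), retrying on UnknownError with a fixed back-off."""
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except UnknownError:
            if attempt == attempts:
                raise
            time.sleep(delay)

# Example: a flaky call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise UnknownError("transient backend error")
    return "ok"

print(with_retries(flaky))  # prints "ok" after two retries
```

Retry only errors that are plausibly transient; `NotFoundError` and `BadParameterError` indicate caller mistakes and should surface immediately.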

## Configuration

The SDK supports two authentication methods:

### 1. Config File Authentication (Default)

Create a config file at `~/.spb/onprem-config`:

```ini
[default]
host=https://your-onprem-host
access_key=your-access-key
access_key_secret=your-access-key-secret
```

This is the default authentication method when `SUPERB_SYSTEM_SDK=false` or not set.
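The config file is standard INI, so you can sanity-check its shape with the stdlib `configparser` before pointing the SDK at it. This snippet parses an inline sample in the same format rather than your real file:

```python
import configparser

# A config in the same shape as ~/.spb/onprem-config.
SAMPLE = """
[default]
host=https://your-onprem-host
access_key=your-access-key
access_key_secret=your-access-key-secret
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE)

# Verify the [default] profile carries the three expected keys.
profile = config["default"]
for key in ("host", "access_key", "access_key_secret"):
    assert key in profile, f"missing {key} in [default]"

print(profile["host"])  # → https://your-onprem-host
```

To check a real file, replace `read_string(SAMPLE)` with `config.read(os.path.expanduser("~/.spb/onprem-config"))`.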

### 2. Environment Variables (for Airflow DAGs)

When running in an Airflow DAG or other system environments, you can use environment variables for authentication. This method is activated by setting `SUPERB_SYSTEM_SDK=true`.

Required environment variables:
```bash
# Enable system SDK mode
export SUPERB_SYSTEM_SDK=true

# Set the host URL (either one is required)
export SUPERB_SYSTEM_SDK_HOST=https://your-superb-ai-host
# or
export SUNRISE_SERVER_URL=https://your-superb-ai-host

# Set the user email
export SUPERB_SYSTEM_SDK_USER_EMAIL=user@example.com
```

You can set these environment variables:
- Directly in your shell
- In your Airflow DAG configuration
- Through your deployment environment
- Using a `.env` file with your preferred method of loading environment variables

Notes:
- When `SUPERB_SYSTEM_SDK=true`, the SDK ignores the config file (`~/.spb/onprem-config`) and uses environment variables exclusively.
- When `SUPERB_SYSTEM_SDK=false` or unset, the SDK reads credentials from `~/.spb/onprem-config`.
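The precedence above can be sketched with a small helper. `resolve_auth` is purely illustrative (it is not the SDK's internal API); it mirrors the documented rules: system-SDK mode takes the host from `SUPERB_SYSTEM_SDK_HOST`, falling back to `SUNRISE_SERVER_URL`, otherwise the config file is used.

```python
import os

def resolve_auth():
    """Illustrative mirror of the documented precedence, not SDK code."""
    if os.environ.get("SUPERB_SYSTEM_SDK", "").lower() == "true":
        host = (os.environ.get("SUPERB_SYSTEM_SDK_HOST")
                or os.environ.get("SUNRISE_SERVER_URL"))
        email = os.environ.get("SUPERB_SYSTEM_SDK_USER_EMAIL")
        if not host or not email:
            raise RuntimeError("system SDK mode is set but host/email missing")
        return {"mode": "env", "host": host, "email": email}
    return {"mode": "config-file",
            "path": os.path.expanduser("~/.spb/onprem-config")}

# Example: simulate an Airflow-style environment using the fallback host var.
os.environ.pop("SUPERB_SYSTEM_SDK_HOST", None)
os.environ["SUPERB_SYSTEM_SDK"] = "true"
os.environ["SUNRISE_SERVER_URL"] = "https://your-superb-ai-host"
os.environ["SUPERB_SYSTEM_SDK_USER_EMAIL"] = "user@example.com"
print(resolve_auth()["host"])  # → https://your-superb-ai-host
```

A helper like this is handy in DAG code for failing fast with a clear message before the first SDK call.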

## Requirements

- Python >= 3.8
- requests >= 2.22.0
- urllib3 >= 1.21.1
- pydantic >= 1.8.0

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Support

For support or feature requests, please contact the Superb AI team or create an issue in this repository.


            
