supersayan

Name: supersayan
Version: 0.1.3
Summary: A high-performance Python library for fully homomorphic encryption (FHE) in deep learning, seamlessly integrated with PyTorch.
Upload time: 2025-08-28 12:33:07
Requires Python: >=3.9
Keywords: homomorphic-encryption, fhe, deep-learning, pytorch, privacy-preserving, cryptography, neural-networks, secure-computation, machine-learning, julia
GitHub: https://github.com/supersayan-labs/supersayan
# Supersayan

A high‑performance Python library that integrates Fully Homomorphic Encryption (FHE) into PyTorch workflows. Supersayan provides:

- A client/server path to offload selected layers to a remote FHE server (hybrid mode).
- Zero‑copy tensor bridges across PyTorch, NumPy, CuPy, and Julia.

The Julia backend is managed via `juliacall` and initialized automatically on first import, or explicitly with a helper CLI.

## Installation

From PyPI:

```bash
pip install supersayan
# or with uv
uv add supersayan
```

Default behavior:
- After install, importing `supersayan` triggers a one‑time Julia backend setup automatically. No extra step is needed for typical users.

Optional (CI/Docker or troubleshooting):

```bash
# Manually initialize the Julia backend if you want explicit control
supersayan-setup
```

Advanced control (not recommended for regular users):
- To prevent network access during import in CI images, set `SUPERSAYAN_SKIP_JULIA_SETUP=1` and run `supersayan-setup` in a controlled build step.
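For a CI or Docker image, this typically means doing the setup once at build time and keeping the variable set at runtime. A sketch of such a build step (the package and CLI names are from this README; the two-phase split is an assumption about how you structure your image):

```shell
# Build phase: install the package, then initialize the Julia backend explicitly
# so no network access happens when the application later imports supersayan.
export SUPERSAYAN_SKIP_JULIA_SETUP=1
pip install supersayan
supersayan-setup   # one-time Julia backend initialization, under your control

# Runtime: keep SUPERSAYAN_SKIP_JULIA_SETUP=1 so `import supersayan`
# skips the automatic setup and uses the backend prepared above.
```

The same pattern applies to a Dockerfile: run `supersayan-setup` in a `RUN` layer and set the environment variable with `ENV`.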

Notes:
- GPU support uses CuPy when available; code runs on CPU otherwise.
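The "CuPy when available, CPU otherwise" behavior follows the usual optional-import pattern. The sketch below illustrates that pattern in isolation; it is not Supersayan's internal code:

```python
# Optional-import fallback: use CuPy for GPU arrays when it is installed,
# otherwise fall back to NumPy on the CPU. Both expose a near-identical API.
try:
    import cupy as xp
except ImportError:
    import numpy as xp

x = xp.zeros((2, 3), dtype=xp.float32)
print(x.shape, x.dtype)  # (2, 3) float32 on either backend
```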

## Hybrid Remote Inference

Run the TCP server:

```bash
python scripts/run_server.py --host 127.0.0.1 --port 8000 --models-dir /tmp/supersayan/models
```

Use the client to execute only selected layers remotely in FHE while keeping other ops local:

```python
import torch
import torch.nn as nn
from supersayan.core.types import SupersayanTensor
from supersayan.remote import SupersayanClient

class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 4, 3, padding=1)
        self.fc = nn.Linear(4*28*28, 10)
    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.view(x.size(0), -1))

cnn = SmallCNN().eval()
client = SupersayanClient(
    server_url="127.0.0.1:8000",
    torch_model=cnn,
    fhe_modules=[nn.Conv2d, nn.Linear],  # offload these layers
)

x = SupersayanTensor(torch.randn(1, 1, 28, 28))
y = client(x)
```

Supported offloaded layers: `nn.Linear` and `nn.Conv2d`. For a complete, runnable server entry point, see `scripts/run_server.py`.

## Tensors and Interop

- `SupersayanTensor(data, device=...)` accepts `torch.Tensor`, `numpy.ndarray`, or `cupy.ndarray` and stores values as `float32`.
- Helpers: `SupersayanTensor.zeros(...)`, `ones(...)`, `randn(...)`.
- Interop: `.to_numpy()`, `.to_dlpack()`, and zero‑copy conversion to Julia via `.to_julia()`.

## Project Layout

- `src/supersayan/` core, layers, remote client/server, Julia backend
- `scripts/` runnable examples (`run_server.py`)

## Troubleshooting

- If Julia setup fails on first import, run `supersayan-setup` manually.
- In CI or headless environments, set `SUPERSAYAN_SKIP_JULIA_SETUP=1` during import and run `supersayan-setup` explicitly in a build step.

            
