
Nabla is a machine learning library for the emerging Mojo/Python ecosystem, featuring:
- Gradient computation the PyTorch way (imperatively, via `.backward()`)
- Purely functional, JAX-like composable function transformations: `grad`, `vmap`, `jit`, etc.
- Custom differentiable CPU/GPU kernels

For tutorials and the API reference, visit [nablaml.com](https://nablaml.com/index.html).
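The transformations above act on whole functions rather than individual tensors. As a rough mental model (a pure-Python sketch, not Nabla's implementation), a `vmap`-style transform lifts a function on single examples into a function on a batch:

```python
def vmap_like(f):
    """Toy stand-in for a vmap-style transform: lifts a function on a
    single example to one on a list (the "batch" axis). A real vmap
    vectorizes the computation instead of looping in Python."""
    def batched(xs):
        return [f(x) for x in xs]
    return batched

square = lambda x: x * x
print(vmap_like(square)([1, 2, 3]))  # [1, 4, 9]
```

Because such transforms take functions and return functions, they compose: batching a gradient function, or JIT-compiling a batched one, is just nesting calls.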
## Installation
```bash
pip install nabla-ml
```
## Quick Start
*A minimal but fully functional neural-network training setup:*
```python
import nabla as nb

# Define the MLP forward pass and loss.
def loss_fn(params, x, y):
    for i in range(0, len(params) - 2, 2):
        x = nb.relu(x @ params[i] + params[i + 1])
    predictions = x @ params[-2] + params[-1]
    return nb.mean((predictions - y) ** 2)

# JIT-compiled SGD training step.
@nb.jit(auto_device=True)
def train_step(params, x, y, lr):
    loss, grads = nb.value_and_grad(loss_fn)(params, x, y)
    return loss, [p - g * lr for p, g in zip(params, grads)]

# Set up network (hyper)parameters.
LAYERS = [1, 32, 64, 32, 1]
params = [
    p
    for i in range(len(LAYERS) - 1)
    for p in (
        nb.glorot_uniform((LAYERS[i], LAYERS[i + 1])),
        nb.zeros((1, LAYERS[i + 1])),
    )
]

# Run the training loop.
x, y = nb.rand((256, 1)), nb.rand((256, 1))
for i in range(1001):
    loss, params = train_step(params, x, y, 0.01)
    if i % 100 == 0:
        print(i, loss.to_numpy())
```
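The heart of `train_step` is `nb.value_and_grad`, which turns `loss_fn` into a function returning both the loss and the gradient with respect to the parameters. To make those semantics concrete, here is a tiny pure-Python stand-in using central finite differences (illustrative only; Nabla computes exact gradients via autodiff, and `value_and_grad_fd` is a hypothetical name, not part of Nabla's API):

```python
def value_and_grad_fd(f, eps=1e-6):
    """Finite-difference stand-in for an autodiff value_and_grad.

    Works on functions of a list of floats; a real autodiff library
    differentiates whole tensor programs instead of bumping inputs.
    """
    def wrapped(params):
        value = f(params)
        grads = []
        for i in range(len(params)):
            up, down = list(params), list(params)
            up[i] += eps
            down[i] -= eps
            # Central difference approximates df/dparams[i].
            grads.append((f(up) - f(down)) / (2 * eps))
        return value, grads
    return wrapped

# Example: f(a, b) = a**2 + 3*b has gradient (2a, 3).
f = lambda p: p[0] ** 2 + 3 * p[1]
val, grads = value_and_grad_fd(f)([2.0, 1.0])
print(val, grads)  # 7.0, approximately [4.0, 3.0]
```

The SGD update in `train_step` then simply subtracts each gradient (scaled by the learning rate) from its parameter.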
## For Developers
1. Clone the repository
2. Create a virtual environment (recommended)
3. Install dependencies
```bash
git clone https://github.com/nabla-ml/nabla.git
cd nabla
python3 -m venv venv
source venv/bin/activate
pip install -r requirements-dev.txt
pip install -e ".[dev]"
```
## Repository Structure
```text
nabla/
├── nabla/            # Core Python library
│   ├── core/         # Tensor class and MAX compiler integration
│   ├── nn/           # Neural network modules and models
│   ├── ops/          # Mathematical operations (binary, unary, linalg, etc.)
│   ├── transforms/   # Function transformations (vmap, grad, jit, etc.)
│   └── utils/        # Utilities (formatting, types, MAX-interop, etc.)
├── tests/            # Comprehensive test suite
├── tutorials/        # Notebooks on Nabla usage for ML tasks
└── examples/         # Example scripts for common use cases
```
## Contributing
Contributions are welcome! Please discuss significant changes in an Issue first; pull requests for bug fixes, documentation, and smaller features can be submitted directly.
## License
Nabla is licensed under the [Apache-2.0 license](https://github.com/nabla-ml/nabla/blob/main/LICENSE).
---
*Thank you for checking out Nabla!*