# nnetflow — lightweight neural networks for learning
[Python](https://www.python.org/) · [MIT License](LICENSE) · [CI](https://github.com/lewisnjue/nnetflow/actions)
nnetflow is a small, opinionated deep learning library implemented with NumPy for education and experimentation. It focuses on readability and a small, correct autodiff core so you can learn how deep learning frameworks work under the hood.
Key design goals:
- Minimal API surface: easy to read and reason about
- Correct reverse-mode autodiff (dynamic graphs)
- Small, focused feature set for learning (Linear layer, losses, optimizers)
- Well-tested: unit tests exercise core pieces (Tensor, Linear, losses, optimizers)
This repository represents the v2.0.3 release — a cleaned-up, documented, and tested baseline with bug fixes and improvements.
## Highlights
- Tensor: NumPy-backed tensor with reverse-mode autodiff and many activations
- Linear layer: fully-connected layer with sensible initialization
- Losses: MSE, RMSE, Cross-Entropy, Binary Cross-Entropy (logits and probs)
- Optimizers: SGD (+momentum), Adam
- Examples: runnable scripts under `examples/`
- CI: GitHub Actions runs the full test matrix on pushes and pull requests
- Local checks: `pre-commit` configured to run tests before pushing changes
## Install
Install the latest release from PyPI:
```bash
pip install nnetflow
```
Or install editable from source (recommended for contributors):
```bash
git clone https://github.com/lewisnjue/nnetflow.git
cd nnetflow
pip install -e .
pip install -r requirements.dev
pre-commit install --install-hooks
```
Note: `pre-commit install` sets up git hooks locally. This repo includes a pre-push hook that runs `pytest` to help prevent regressions before pushing.
## Examples
Examples are included in the `examples/` folder and are runnable directly:
```bash
python examples/simple_regression.py
python examples/binary_classification.py
python examples/gpt2.py
```
They demonstrate model definition, training loops, loss computation, and parameter updates using the library's primitives.
## Quick usage
```python
import numpy as np
from nnetflow import Tensor, Linear, mse_loss, Adam
X = np.random.randn(128, 3)
y = np.random.randn(128, 1)
layer = Linear(3, 1)
opt = Adam(layer.parameters(), lr=1e-2)
X_t = Tensor(X, requires_grad=False)
y_t = Tensor(y, requires_grad=False)
for epoch in range(100):
    preds = layer(X_t)
    loss = mse_loss(preds, y_t)
    opt.zero_grad()
    loss.backward()
    opt.step()

    if (epoch + 1) % 10 == 0:
        print(f"epoch {epoch+1}: loss={loss.item():.4f}")
```
You can also import components individually:
```python
from nnetflow.engine import Tensor
from nnetflow.layers import Linear
from nnetflow.losses import mse_loss, cross_entropy_loss
from nnetflow.optim import SGD, Adam
```
## Testing
Run unit tests locally with:
```bash
pytest tests/ -q
```
CI runs tests automatically on push and pull requests.
## Pre-commit
This project uses `pre-commit` to run basic checks and to run `pytest` before pushing. After cloning, run:
```bash
pip install pre-commit
pre-commit install --install-hooks
```
To run the hooks locally (including the pytest hook configured for pre-push):
```bash
pre-commit run --all-files
```
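For reference, a local pre-push pytest hook typically looks something like the sketch below in `.pre-commit-config.yaml`. This is an illustration only; the repo's checked-in config is the source of truth.

```yaml
# Hypothetical sketch; see the repo's actual .pre-commit-config.yaml.
repos:
  - repo: local
    hooks:
      - id: pytest
        name: pytest
        entry: pytest tests/ -q
        language: system
        pass_filenames: false
        stages: [pre-push]
```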
## API Reference
### Tensor Operations
The `Tensor` class is the core of nnetflow, providing automatic differentiation:
```python
from nnetflow import Tensor
# Create tensors
x = Tensor([1.0, 2.0, 3.0], requires_grad=True)
y = Tensor([4.0, 5.0, 6.0], requires_grad=True)
# Operations
z = x + y # Addition
z = x * y # Multiplication
z = x / y # Division
z = x @ y # Matrix multiplication (if compatible shapes)
z = x.sum() # Sum reduction
z = x.mean() # Mean reduction
# Activations
z = x.relu() # ReLU
z = x.sigmoid() # Sigmoid
z = x.tanh() # Tanh
z = x.softmax() # Softmax
# Backward pass
z.backward() # Compute gradients
```
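A quick way to build intuition is to check a gradient against the analytic answer. The sketch below assumes `Tensor` exposes gradients as NumPy arrays via a `.grad` attribute (suggested by the API above; verify against `nnetflow.engine`):

```python
from nnetflow import Tensor

x = Tensor([1.0, 2.0, 3.0], requires_grad=True)
y = Tensor([4.0, 5.0, 6.0], requires_grad=True)

# z = sum_i x_i * y_i, so dz/dx_i = y_i and dz/dy_i = x_i
z = (x * y).sum()
z.backward()

print(x.grad)  # expected: [4. 5. 6.]
print(y.grad)  # expected: [1. 2. 3.]
```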
### Layers
```python
from nnetflow import Linear
# Linear layer
layer = Linear(in_features=10, out_features=5, bias=True)
output = layer(input_tensor)
params = layer.parameters() # Get trainable parameters
```
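Layers compose naturally with the Tensor activations, so a small multi-layer perceptron is only a few lines. A sketch, assuming `Linear` instances are callable on a `Tensor` as in the quick-usage example:

```python
from nnetflow import Linear, Tensor

class MLP:
    """Two-layer perceptron: Linear -> ReLU -> Linear (illustrative only)."""

    def __init__(self, in_features: int, hidden: int, out_features: int):
        self.fc1 = Linear(in_features, hidden)
        self.fc2 = Linear(hidden, out_features)

    def __call__(self, x: Tensor) -> Tensor:
        return self.fc2(self.fc1(x).relu())

    def parameters(self):
        # Combine both layers' parameters so one optimizer can update them all
        return self.fc1.parameters() + self.fc2.parameters()
```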
### Loss Functions
```python
from nnetflow import (
    mse_loss,
    rmse_loss,
    cross_entropy_loss,
    binary_cross_entropy_loss,
    logits_binary_cross_entropy_loss,
)
# Regression losses
loss = mse_loss(predictions, targets)
loss = rmse_loss(predictions, targets)
# Classification losses
loss = cross_entropy_loss(logits, one_hot_targets)
loss = binary_cross_entropy_loss(probabilities, targets)
loss = logits_binary_cross_entropy_loss(logits, targets)
```
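Because `cross_entropy_loss` takes one-hot targets here, integer class labels need to be expanded first; plain NumPy is enough for that (the `Tensor` construction mirrors the quick-usage example):

```python
import numpy as np
from nnetflow import Tensor

labels = np.array([0, 2, 1])   # integer class labels for 3 classes
one_hot = np.eye(3)[labels]    # shape (3, 3); row i is one-hot for labels[i]

one_hot_targets = Tensor(one_hot, requires_grad=False)
```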
### Optimizers
```python
from nnetflow import SGD, Adam
# SGD with optional momentum
optimizer = SGD(params, lr=0.01, momentum=0.9)
# Adam optimizer
optimizer = Adam(params, lr=0.001, beta1=0.9, beta2=0.999)
# Training step
optimizer.zero_grad() # Clear gradients
loss.backward() # Compute gradients
optimizer.step() # Update parameters
```
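Conceptually, SGD with momentum keeps a running velocity per parameter and steps along it. The sketch below shows the classic update rule in plain NumPy; it is not nnetflow's exact implementation:

```python
import numpy as np

def sgd_momentum_step(param, grad, velocity, lr=0.01, momentum=0.9):
    """One classic SGD-with-momentum update (conceptual sketch)."""
    velocity = momentum * velocity - lr * grad  # decay old velocity, add new gradient step
    return param + velocity, velocity          # move the parameter along the velocity
```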
## Project structure
```
nnetflow/       # package source
    engine.py   # Tensor & autodiff engine
    layers.py   # Linear, BatchNorm1d, LayerNorm, etc.
    losses.py   # loss functions
    optim.py    # optimizers
examples/       # runnable examples
tests/          # unit tests
```
## Contributing
Contributions are welcome. Please follow these steps:
1. Fork the repository and create a feature branch
2. Write tests for your change
3. Run `pytest` and `pre-commit` locally
4. Open a pull request with a clear description
See `CONTRIBUTING.md` for more details.
## Changelog
See `CHANGELOG.md` for release details. The current release is v2.0.3.
## License
MIT — see `LICENSE`.
---
Maintained by Lewis Njue — aimed at learners and educators building intuition about how neural networks work.