torchy-nn

- Name: torchy-nn
- Version: 0.3.5.3
- Home page: https://github.com/chuvalniy/Torchy
- Summary: NumPy based neural network package with PyTorch-like API
- Upload time: 2023-10-19 17:11:12
- Author: Valentin Belyaev
- License: MIT
- Keywords: python, neural net, from scratch, numpy, pytorch-like, cnn, dense
[![Python Version](https://img.shields.io/badge/python-3.11-blue.svg)](https://www.python.org/downloads/release/python-360/)
[![PyPI Version](https://img.shields.io/pypi/v/torchy-nn.svg)](https://pypi.org/project/torchy-nn/)
![Status](https://img.shields.io/badge/status-alpha-orange.svg)

![image](https://github.com/chuvalniy/Torchy/assets/85331232/e0ab8cfe-4e12-42f9-b90e-37fb93f8ffd0)


## Overview
Torchy is a neural network framework implemented using only NumPy. It mirrors the PyTorch API, but all backpropagation calculations are done manually. The main idea was to build a neural network from scratch for educational purposes.
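To illustrate what "manual backpropagation" means here, the following self-contained NumPy sketch (my own illustration, not Torchy code) trains a single linear layer with hand-derived gradients and a plain SGD update:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = 2x + 1 with one linear layer, gradients by hand.
X = rng.normal(size=(64, 1))
y = 2.0 * X + 1.0

W = np.zeros((1, 1))   # weight
b = np.zeros((1,))     # bias
lr = 0.1

for _ in range(200):
    # Forward pass
    out = X @ W + b                    # (64, 1)
    loss = np.mean((out - y) ** 2)     # MSE

    # Backward pass, derived manually (no autograd)
    d_out = 2.0 * (out - y) / len(X)   # dL/d(out)
    dW = X.T @ d_out                   # dL/dW
    db = d_out.sum(axis=0)             # dL/db

    # Plain SGD update
    W -= lr * dW
    b -= lr * db

# W converges toward 2.0 and b toward 1.0
```

Every layer in such a framework follows the same pattern: a forward function that caches its inputs, and a backward function that turns the upstream gradient into gradients for its parameters and its input.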

## Installation
```shell
pip install torchy-nn
```
## Getting started
I suggest taking a look at the [currently implemented features](https://github.com/chuvalniy/Torchy/blob/main/docs/Implemented.md) to get familiar with what you can currently build with Torchy. I've also documented the [package structure](https://github.com/chuvalniy/Torchy/blob/main/docs/PackageStructure.md) in case you are unsure where to find specific layers.

## Example usage
First, we define our model using Torchy's PyTorch-like API.

```python
from nn.modules.sequential import Sequential  # Same as nn.Sequential
import nn.modules.module as layer

# Define a 2-layer network with a 100-neuron hidden layer.
model = Sequential(
    layer.Linear(n_input=10, n_output=100),
    layer.BatchNorm1d(n_output=100),
    layer.ReLU(),
    layer.Linear(n_input=100, n_output=2)
)
```

The next step is to create instances of an optimizer and a criterion (loss function), plus a learning-rate scheduler.

```python
import nn.modules.loss as loss
import optimizers.optim as optim
import optimizers.scheduler as sched

optimizer = optim.SGD(model.params(), lr=1e-3)
criterion = loss.CrossEntropyLoss()
scheduler = sched.StepLR(optimizer, step_size=10)
```
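A StepLR-style scheduler typically multiplies the learning rate by a decay factor `gamma` every `step_size` epochs; that is how PyTorch's `StepLR` behaves, and I'm assuming Torchy's mirrors it. The arithmetic on its own:

```python
def step_lr(base_lr: float, epoch: int, step_size: int, gamma: float = 0.1) -> float:
    """Learning rate after `epoch` epochs under a StepLR-style schedule."""
    return base_lr * gamma ** (epoch // step_size)

# With lr=1e-3 and step_size=10: 1e-3 for epochs 0-9, then ~1e-4, then ~1e-5, ...
lrs = [step_lr(1e-3, e, step_size=10) for e in (0, 9, 10, 20)]
```

So with the `step_size=10` used above, the optimizer's rate drops tenfold every 10 epochs.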

I won't cover the whole training process (loops and so on); instead, here are the main differences when training with Torchy:

```python
...
predictions = model(X)  # Nothing changed

loss, grad = criterion(predictions, y)  # Now returns a tuple of (loss, grad) instead of only the loss

optimizer.zero_grad()
model.backward(grad)  # Call backward on the model and pass the gradient from the loss
optimizer.forward_step()
```
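To make the `(loss, grad)` tuple concrete, here is a self-contained NumPy sketch of what a softmax cross-entropy criterion computes (my own illustration, not Torchy's actual implementation): the scalar loss plus the gradient with respect to the input scores, which is what gets fed into `model.backward`.

```python
import numpy as np

def cross_entropy(scores: np.ndarray, targets: np.ndarray):
    """Softmax cross-entropy; returns (loss, gradient w.r.t. scores)."""
    # Numerically stable softmax
    shifted = scores - scores.max(axis=1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)

    n = scores.shape[0]
    loss = -np.log(probs[np.arange(n), targets]).mean()

    # Gradient of the mean loss w.r.t. the input scores
    grad = probs.copy()
    grad[np.arange(n), targets] -= 1.0
    grad /= n
    return loss, grad

scores = np.array([[2.0, 0.5], [0.1, 1.5]])
targets = np.array([0, 1])
loss, grad = cross_entropy(scores, targets)  # grad has the same shape as scores
```

Returning the gradient alongside the loss is what lets the framework skip autograd entirely: the criterion hands the model the exact starting gradient for the backward pass.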


## Demonstration
The [demo notebook](https://github.com/chuvalniy/Torchy/blob/main/torchy-demo.ipynb) showcases what Torchy currently can do.

## Roadmap
There is still a lot of work to be done, but here are the main points that will be completed soon:
- Docstring every entity & add type hinting
- Add evaluation & inference for model 

## Resources
This project was made possible thanks to the following people and resources:

- [PyTorch](https://github.com/pytorch/pytorch)
- [CS231n - 2016](https://youtube.com/playlist?list=PLkt2uSq6rBVctENoVBg1TpCC7OQi31AlC)
- [Deep Learning на пальцах - 2019](https://youtube.com/playlist?list=PL5FkQ0AF9O_o2Eb5Qn8pwCDg7TniyV1Wb)
- [Neural Networks: Zero to Hero](https://youtube.com/playlist?list=PLAqhIrjkxbuWI23v9cThsA9GvCAUhRvKZ)


## License
[MIT License](https://github.com/chuvalniy/Torchy/blob/main/LICENSE)


            
