nano-autograds

Name: nano-autograds
Version: 1.1.1
Home page: https://github.com/deep-matter/Nano-AutoGrad/tree/main/autograd
Summary: A tiny Torch-like scalar engine: Nano-AutoGrad, a micro-framework with a small PyTorch-like neural network library on top.
Upload time: 2023-06-27 21:21:33
Author: Youness EL BRAG
Requires Python: >=3.7
Requirements: pandas, plotly, numpy, matplotlib, random2, scipy, scikit-learn, kaleido, ipython, nbformat, ipywidgets, imageio, graphviz, gradio, nano-autograds, sphinx
### Nano-AutoGrad

This project provides a lightweight Python micro-framework for building and training neural networks from scratch, based on automatic differentiation and a computational graph engine.

<div align="center">
  <img src="logo.png" alt="Nano-AutoGrad Logo" width="200">
</div>

### Documentation

[![Documentation](https://img.shields.io/badge/Documentation-Read%20the%20Docs-blue.svg)](https://nano-autograd.readthedocs.io/en/latest/)
[![Examples](https://img.shields.io/badge/Examples-GitHub-green.svg)](https://nano-autograd.readthedocs.io/en/latest/README.html)

### Introduction

Nano-AutoGrad is a micro-framework that allows you to build and train neural networks from scratch based on automatic differentiation and computational graphs.

### Installation

You can install Nano-AutoGrad using pip:

```bash
pip install nano-autograds
```

### Features

Nano-AutoGrad offers the following features:

    * Automatic Differentiation: Nano-AutoGrad automatically computes gradients, making it easy to perform gradient-based optimization (see the sketch after this list).
    * Computational Graph Engine: It leverages a computational graph representation to efficiently compute gradients and perform backpropagation.
    * Lightweight and Efficient: Nano-AutoGrad is designed to be lightweight and efficient, suitable for small to medium-sized neural networks.
    * Easy-to-Use API: The framework provides a simple and intuitive API, allowing users to define and train neural networks with ease.
    * Integration with NumPy: Nano-AutoGrad seamlessly integrates with NumPy, enabling efficient array operations and computations.
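
To make the first two features concrete, here is a minimal, self-contained sketch of scalar reverse-mode automatic differentiation in the micrograd style that Nano-AutoGrad credits below. It is plain illustrative Python, not the package's actual API:

```python
# Illustrative sketch only -- NOT Nano-AutoGrad's API. A scalar value that
# records its parents so gradients can flow back through the graph.
class Value:
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None  # set by the op that produced this node

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():  # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the computational graph, then apply the
        # chain rule from the output back to the leaves.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(2.0)
y = Value(3.0)
z = x * y + x
z.backward()
print(z.data, x.grad, y.grad)  # 8.0, dz/dx = y + 1 = 4.0, dz/dy = x = 2.0
```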

### Usage

To get started with Nano-AutoGrad, refer to the documentation for detailed usage instructions, examples, and API reference. Here are some basic steps to build and train a neural network using Nano-AutoGrad:
* Example 1:

    ```python
    import autograd.core.nn as nn
    import autograd.torch.optim as optim

    class MyNeuralNetwork(nn.Module):
        def __init__(self):
            super().__init__()
            self.linear = nn.Linear(2, 1)

        def forward(self, x):
            return self.linear(x)

    network = MyNeuralNetwork()
    optimizer = optim.SGD(network.parameters(), lr=0.1)
    ```
* Example 2:

    Building a linear model using the torch-like autograd engine:

    ```python
    import autograd.torch.nn as nn
    import autograd.torch.optim as optim
    import autograd.functional as F

    class Model(nn.Module):
        def __init__(self):
            super().__init__()
            self.l1 = nn.Linear(784, 1568, name='l1')
            self.l2 = nn.Linear(1568, 392, name='l2')
            self.l3 = nn.Linear(392, 10, name='l3')

        def forward(self, x):
            z = F.relu(self.l1(x))
            z = F.relu(self.l2(z))
            out = F.log_softmax(self.l3(z))
            return out

    num_epochs = 10  # number of training epochs, used by the scheduler
    model = Model()
    optimizer = optim.SGD(model.parameters(), lr=5e-2, weight_decay=1e-4)
    scheduler = optim.lr_scheduler.LinearLR(optimizer, start_factor=1.0, end_factor=0.75, total_iters=num_epochs)
    ```
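
Neither example shows the optimization loop itself. As a rough sketch continuing from Example 2, and assuming Nano-AutoGrad follows the PyTorch idioms its summary advertises, training might look like the loop below; `zero_grad`, `backward`, `step`, `F.nll_loss`, and the `batches` iterable are assumptions for illustration, not confirmed parts of the package's API:

```python
# Hedged sketch continuing from Example 2 above; method names are assumed
# from PyTorch conventions, not verified against Nano-AutoGrad.
for epoch in range(num_epochs):
    for x_batch, y_batch in batches:           # your own (input, target) pairs
        optimizer.zero_grad()                  # clear gradients from the last step
        log_probs = model(x_batch)             # forward pass (log_softmax output)
        loss = F.nll_loss(log_probs, y_batch)  # negative log-likelihood loss
        loss.backward()                        # backpropagate through the graph
        optimizer.step()                       # SGD update (with weight decay)
    scheduler.step()                           # linear learning-rate decay
```
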
### Examples

The Nano-AutoGrad repository provides various examples demonstrating the usage of the framework for different tasks, such as linear regression, classification, and more. You can explore the examples directory in the repository to gain a better understanding of how to use Nano-AutoGrad in practice.
The following tutorial notebooks are available on Colab:

**Nano_AutoGrads_tutorial_Linear_model** [![Open Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1o7USeyjTLjmgjjGXkQLt96HYNAbc_r7j)
<br/>

**Nano_AutoGrads_tutorial_Sparse_Networks** [![Open Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1wibEbcfqI4r3e8b28TleP562uCooxPw_#scrollTo=_y-pwg1_fNus)
<br/>

**Using Nano-AutoGrads to classify MNIST handwritten digits** [![Open Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1wvJQYmYT8-7On7tto3klDN7_Zx4Xgf3_#scrollTo=d4qP5clo9CT5)

### Contributions

Contributions to Nano-AutoGrad are welcome! If you have any bug reports, feature requests, or want to contribute code, please open an issue or submit a pull request on the official GitHub repository.

### License

Nano-AutoGrad is released under the MIT License. Please see the LICENSE file in the repository for more details.
### Acknowledgements

We would like to thank the contributors and the open-source community for their valuable contributions to Nano-AutoGrad.
### Contact

For any inquiries or further information, you can reach out to the project maintainer, Youness El Brag, via email at younsselbrag@gmail.com.


### Credits

1. [micrograd](https://github.com/karpathy/micrograd) by Andrej Karpathy
2. [ugrad](https://github.com/conscell/ugrad/tree/main) by conscell




            
