nanograd-aman

Name: nanograd-aman
Version: 0.1.0 (PyPI)
Home page: https://github.com/NeuralNoble/nanograd
Summary: A lightweight backpropagation package for neural networks
Upload time: 2025-01-31 18:10:41
Author: Aman Anand
Requires Python: >=3.6
# 🧠 Nanograd - Lightweight Autograd Engine for Deep Learning

**Nanograd** is a minimalistic automatic differentiation engine for building and training neural networks. Inspired by PyTorch’s autograd, it provides an easy-to-use framework for defining computational graphs, performing backpropagation, and training models.  

## 🚀 Features  
✅ **Automatic Differentiation** - Compute gradients with ease using backpropagation.  
✅ **Graph-Based Computation** - Uses a dynamic computation graph to track operations.  
✅ **Lightweight & Fast** - No unnecessary dependencies, optimized for speed.  
✅ **Custom Neural Networks** - Build and train models from scratch.  
✅ **Graph Visualization** - Visualize computational graphs using `graphviz`.  

---

## 📦 Installation  

You can install **Nanograd** directly from PyPI, where it is published as `nanograd-aman`:  

```bash
pip install nanograd-aman
```


## 🔧 Usage

### 1️⃣ Defining Computation

```python
from nanograd.engine import Value

a = Value(2.0)
b = Value(3.0)
c = a * b + 5
c.backward()

print(f"Value of c: {c.data}")        # Output: 11.0
print(f"Gradient of a: {a.grad}")     # Output: 3.0
print(f"Gradient of b: {b.grad}")     # Output: 2.0

```
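To make clear what `backward()` is doing under the hood, here is a minimal, self-contained sketch of a scalar autograd node in the same spirit (an illustration of the technique, not Nanograd's actual source code):

```python
class TinyValue:
    """Minimal scalar autograd node: stores data, grad, and a backward rule."""
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, TinyValue) else TinyValue(other)
        out = TinyValue(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, TinyValue) else TinyValue(other)
        out = TinyValue(self.data * other.data, (self, other))
        def _backward():
            # product rule: each input's gradient is scaled by the other input
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topological order guarantees a node's grad is fully accumulated
        # before it is propagated to its parents.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = TinyValue(2.0)
b = TinyValue(3.0)
c = a * b + 5
c.backward()
print(c.data, a.grad, b.grad)  # 11.0 3.0 2.0
```

The result matches the example above: the chain rule gives dc/da = b = 3.0 and dc/db = a = 2.0.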

### 2️⃣ Building a Neural Network
```python
from nanograd.nn import MLP
import numpy as np

# Create a 2-layer neural network (2 inputs, 4 hidden, 1 output)
model = MLP(2, [4, 1])

# Dummy data
X = np.array([[1.0, 2.0]])
y = np.array([1.0])

# Forward pass
pred = model.forward(X)
print(f"Prediction: {pred}")

```
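The README does not show a training loop, and Nanograd's optimizer API is not documented here; the following plain-Python sketch (gradients computed by hand, no Nanograd imports) illustrates the standard forward, loss, backward, update cycle you would wrap around an MLP:

```python
# Illustrative training loop with manual gradients: fit y = w * x
# to data generated by y = 2x, using mean squared error.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # target relationship: y = 2x

w = 0.0                     # single trainable parameter
lr = 0.01                   # learning rate

for step in range(200):
    # forward pass: predictions and mean squared error
    preds = [w * x for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)
    # backward pass: d(loss)/dw derived by hand for this model
    grad = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
    # gradient descent update
    w -= lr * grad

print(round(w, 3))  # converges to 2.0
```

With an autograd engine, the hand-derived `grad` line is replaced by a call to `backward()` on the loss, followed by an update of each parameter from its accumulated `.grad`.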

### 3️⃣ Visualizing Computational Graph

```python
from nanograd.graph import draw_graph
draw_graph(c)  # Generates a graph of computations

```
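`draw_graph`'s internals are not shown in this README, but Graphviz-based visualizers generally work the same way: walk the expression graph from the output node and emit DOT text. A self-contained sketch (the dict-based node structure here is hypothetical, not Nanograd's API):

```python
# Sketch of a graph drawer: traverse parent links and emit Graphviz DOT.
def to_dot(root):
    lines = ["digraph G {"]
    seen = set()
    def walk(node):
        if id(node) in seen:
            return
        seen.add(id(node))
        lines.append('  n%d [label="data=%s"];' % (id(node), node["data"]))
        for parent in node.get("parents", []):
            walk(parent)
            lines.append("  n%d -> n%d;" % (id(parent), id(node)))
    walk(root)
    lines.append("}")
    return "\n".join(lines)

# tiny hand-built graph for c = a * b
a = {"data": 2.0}
b = {"data": 3.0}
c = {"data": 6.0, "parents": [a, b]}
print(to_dot(c))
```

The resulting DOT string can be rendered with the `graphviz` package or the `dot` command-line tool; a real drawer would also label each node with its operation and gradient.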



            
