skinnygrad

Name: skinnygrad
Version: 0.1.3
Summary: autodiff engine inspired by tinygrad
Author: Arthur
License: MIT
Requires-Python: <4.0,>=3.11
Keywords: autodiff, automatic differentiation, machine learning, deep learning, tensor
Upload time: 2024-08-19 04:17:56
# SkinnyGrad
![python](https://img.shields.io/badge/python-3.11%5E-blue.svg) ![pypi](https://img.shields.io/pypi/v/skinnygrad.svg) ![license](https://img.shields.io/github/license/ArthurBook/skinnygrad) ![tests](https://github.com/ArthurBook/skinnygrad/actions/workflows/tests.yaml/badge.svg)


**SkinnyGrad** is a tensor autodifferentiation library that I wrote as a side project for fun and learning. By default, a computational graph is built and evaluated lazily with [NumPy](https://github.com/numpy/numpy). GPU acceleration is also available through the [CuPy backend extension](./extensions/cupy_engine/). At ~1300 lines, skinnygrad is written with simplicity and extensibility in mind, yet it covers a [good subset](./src/skinnygrad/tensors.py) of the features of a `torch.Tensor`. Kudos to [tinygrad](https://github.com/tinygrad/tinygrad), which inspired the RISC-like design of mapping all operations to [19 low-level ops](./src/skinnygrad/llops.py) that the runtime engine optimizes and executes.
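The lazy-evaluation idea can be sketched in a few lines of plain Python. This is a toy illustration only, not skinnygrad's actual classes: operations record graph nodes instead of computing, and `realize()` walks the graph on demand.

```python
# Toy sketch of lazy evaluation: arithmetic builds a graph node
# instead of computing immediately; realize() evaluates on demand.
class Node:
    def __init__(self, op, *parents, value=None):
        self.op, self.parents, self.value = op, parents, value

    def __add__(self, other):
        return Node("ADD", self, other)  # record the op, do no work yet

    def __mul__(self, other):
        return Node("MUL", self, other)

    def realize(self):
        if self.op == "CONST":
            return self.value
        lhs, rhs = (p.realize() for p in self.parents)
        return lhs + rhs if self.op == "ADD" else lhs * rhs


def const(v):
    return Node("CONST", value=v)


y = const(2) * const(3) + const(4)  # nothing computed yet
print(y.realize())                  # graph evaluated here: 10
```

Deferring work like this is what lets an engine inspect and optimize the whole graph before running any kernels.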

# Try it out!
```bash
pip install skinnygrad
```
```python
import skinnygrad

a = skinnygrad.Tensor((1, 2, 3))
b = skinnygrad.Tensor(10)
x = skinnygrad.Tensor(((4,), (5,), (6,)))
y = a @ x + b
print(y)
# <skinnygrad.tensors.Tensor(
#   <skinnygrad.llops.Symbol(UNREALIZED <Op(ADD)>, shape=(1, 1))>,
#   self.requires_grad=False,
#   self.gradient=None,
# )>
print(y.realize())
# [[42]]
```
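For intuition on what the autodiff side of the engine computes, the gradients of the example above can be worked out by hand in plain Python (independent of skinnygrad): for `y = a @ x + b`, the gradient with respect to `a` is `x` transposed, and with respect to `x` is `a` transposed.

```python
# Gradients of y = a @ x + b, derived by hand for the example above.
a = [[1, 2, 3]]        # shape (1, 3)
x = [[4], [5], [6]]    # shape (3, 1)
b = 10

# Forward pass: y = sum_k a[0][k] * x[k][0] + b
y = sum(ak * xk[0] for ak, xk in zip(a[0], x)) + b
print(y)  # 42

# Backward pass with upstream gradient dL/dy = 1:
# d(a @ x)/da is x transposed, d(a @ x)/dx is a transposed.
grad_a = [[xk[0] for xk in x]]  # [[4, 5, 6]]
grad_x = [[ak] for ak in a[0]]  # [[1], [2], [3]]
```

These are exactly the values a reverse-mode autodiff engine would populate for the two inputs after a backward pass.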

# LeNet-5 as a convergence test
As an end-to-end test for the engine, I replicated the [LeNet-5 paper](http://vision.stanford.edu/cs598_spring07/papers/Lecun98.pdf) -- a convolutional neural network (CNN) designed for handwritten digit recognition. Trained on [MNIST](https://yann.lecun.com/exdb/mnist/), the model reaches 98% accuracy on the evaluation set after about 5 epochs. With a batch size of 64, a training epoch (60k images) takes a few minutes using the CuPy backend on an NVIDIA A100 GPU. The code for the experiment can be found in the [examples folder](./examples/le_net.py).
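As a back-of-the-envelope check of the architecture, the spatial shapes can be traced with standard convolution arithmetic. The sizes below follow the LeNet-5 paper (inputs padded from 28x28 to 32x32); the `conv_out` helper is illustrative, not part of skinnygrad.

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Spatial output size of a convolution or pooling layer."""
    return (size + 2 * pad - kernel) // stride + 1


s = 32                        # MNIST digits padded from 28x28 to 32x32
s = conv_out(s, 5)            # C1: 6 feature maps, 28x28
s = conv_out(s, 2, stride=2)  # S2: subsampling -> 14x14
s = conv_out(s, 5)            # C3: 16 feature maps, 10x10
s = conv_out(s, 2, stride=2)  # S4: subsampling -> 5x5
print(s)                      # 5; flattened to 16*5*5 = 400 features
                              # ahead of the 120 -> 84 -> 10 dense layers
```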

## BONUS: The computational graph built up by the skinnygrad engine for the LeNet-5 forward pass
![lenet-fwd](./static/lenet-forward.png)

            

Raw data

{
    "_id": null,
    "home_page": null,
    "name": "skinnygrad",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.11",
    "maintainer_email": null,
    "keywords": "autodiff, automatic differentiation, machine learning, deep learning, tensor",
    "author": "Arthur",
    "author_email": "atte.book@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/df/de/b5220f4a10bc6df1dc0f56690d2f0364d826d2a8bf4dc816edbf6565047c/skinnygrad-0.1.3.tar.gz",
    "platform": null,
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "autodiff engine inspired by tinygrad",
    "version": "0.1.3",
    "project_urls": {
        "homepage": "https://github.com/ArthurBook/skinnygrad",
        "repository": "https://github.com/ArthurBook/skinnygrad"
    },
    "split_keywords": [
        "autodiff",
        " automatic differentiation",
        " machine learning",
        " deep learning",
        " tensor"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "d40e1bc7e06bb3c0453b0d46827e308619c42af70740835aa5ca4fa398c75be3",
                "md5": "161f414e5f5e1fa7d90275f69f981012",
                "sha256": "1977d9a3f10975bee39911999d375e8964974342f79767cad844b35478d8a623"
            },
            "downloads": -1,
            "filename": "skinnygrad-0.1.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "161f414e5f5e1fa7d90275f69f981012",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.11",
            "size": 19298,
            "upload_time": "2024-08-19T04:17:54",
            "upload_time_iso_8601": "2024-08-19T04:17:54.866230Z",
            "url": "https://files.pythonhosted.org/packages/d4/0e/1bc7e06bb3c0453b0d46827e308619c42af70740835aa5ca4fa398c75be3/skinnygrad-0.1.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "dfdeb5220f4a10bc6df1dc0f56690d2f0364d826d2a8bf4dc816edbf6565047c",
                "md5": "5e3872480b40bccbb93aa2aa939a6193",
                "sha256": "d2a2db4460e681ae72777233bc82e38d324ec2dbc308cd6008dc0eaf7fbfd81f"
            },
            "downloads": -1,
            "filename": "skinnygrad-0.1.3.tar.gz",
            "has_sig": false,
            "md5_digest": "5e3872480b40bccbb93aa2aa939a6193",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.11",
            "size": 17346,
            "upload_time": "2024-08-19T04:17:56",
            "upload_time_iso_8601": "2024-08-19T04:17:56.235137Z",
            "url": "https://files.pythonhosted.org/packages/df/de/b5220f4a10bc6df1dc0f56690d2f0364d826d2a8bf4dc816edbf6565047c/skinnygrad-0.1.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-08-19 04:17:56",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "ArthurBook",
    "github_project": "skinnygrad",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "skinnygrad"
}
        