mdgrad

Name: mdgrad
Version: 0.3
Summary: Tensor-based autodiff engine and neural network API
Author: Jordan Madden
Requires-Python: >=3.8
License: none
Keywords: python, tensors, neural networks, automatic differentiation
Upload time: 2024-07-13 17:55:21
Home page: none
Maintainer: none
Docs URL: none
Requirements: no requirements were recorded
CI: no Travis-CI configured
Test coverage: no Coveralls data
# mdgrad

A small autograd engine that implements backpropagation (reverse-mode automatic differentiation). Heavily inspired by karpathy's [micrograd](https://github.com/karpathy/micrograd/tree/master), and extended to support operations on tensors rather than scalars. Includes a small, PyTorch-like neural network API for building and training models.

Hopefully useful as an educational resource.
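For readers new to reverse-mode autodiff, the core idea fits in a few lines of plain Python. The sketch below is pedagogical and independent of mdgrad's actual implementation: each operation records its inputs and a local backward rule, and `backward()` applies the chain rule in reverse topological order.

```python
class Value:
    """A scalar that records the ops applied to it and can backpropagate."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(x + y)/dx = d(x + y)/dy = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(x * y)/dx = y, d(x * y)/dy = x
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then run the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x   # y = x^2 + x, so dy/dx = 2x + 1 = 7 at x = 3
y.backward()
print(x.grad)   # 7.0
```

mdgrad applies the same recipe, but with tensor-valued nodes and tensor-aware backward rules.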

## Installation

``` bash
pip install mdgrad
```

## Example Usage

A contrived example showing the supported operations:

```python
import mdgrad

a = 3 * mdgrad.randn(3, 2)     # 3x2 tensor of scaled normal samples
b = mdgrad.ones(shape=(2, 2))
c = a @ b                      # matrix multiplication
d = c * 3 / 2
e = d ** 2
f = e.sum()                    # reduce to a scalar
print(f.data)
f.backward()                   # backpropagate from f
print(a.grad)                  # df/da
```
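As a sanity check, the same computation and its gradient can be reproduced in plain NumPy, independently of mdgrad. By the chain rule, df/dc = 2 · d · 1.5 = 4.5 · c, and propagating through the matmul gives df/da = 4.5 · (a @ b) @ bᵀ, which we verify against finite differences:

```python
import numpy as np

np.random.seed(0)
a = 3 * np.random.randn(3, 2)
b = np.ones((2, 2))
c = a @ b
f = ((c * 3 / 2) ** 2).sum()

# Chain rule by hand: df/dc = 2 * (1.5 * c) * 1.5 = 4.5 * c,
# and backprop through c = a @ b gives df/da = (df/dc) @ b.T
grad_a = 4.5 * c @ b.T

# Numerical check via forward finite differences
eps = 1e-6
num = np.zeros_like(a)
for i in range(a.shape[0]):
    for j in range(a.shape[1]):
        ap = a.copy()
        ap[i, j] += eps
        fp = (((ap @ b) * 3 / 2) ** 2).sum()
        num[i, j] = (fp - f) / eps

assert np.allclose(grad_a, num, atol=1e-3)
```

This is the kind of check mdgrad's `a.grad` should pass as well.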

An example showing how to define and run a neural network. See the files in `examples/` for more details on building and training models.

```python
import mdgrad
import mdgrad.nn as nn

# Define the model and loss function
model = nn.Sequential(
    nn.Linear(2, 20),
    nn.ReLU(),
    nn.Linear(20, 50), 
    nn.ReLU(),
    nn.Linear(50, 15),
    nn.ReLU(),
    nn.Linear(15, 1),
    nn.Sigmoid()
)
loss_fn = nn.MSELoss()

# Create dummy data
X = mdgrad.randn(100, 2)
target = mdgrad.randn(100, 1)

# Compute output and loss
out = model(X)
loss = loss_fn(out, target)

# Compute gradients of parameters
loss.backward()
```
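The README stops at `loss.backward()`; the optimizer step that uses those gradients is not shown, and I have not assumed any particular mdgrad optimizer API. As a framework-free sketch of what a training loop does with the gradients, here is manual gradient descent on a single linear layer in NumPy (all names illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 2))
target = rng.standard_normal((100, 1))

W = rng.standard_normal((2, 1)) * 0.1   # weights of one linear layer
b = np.zeros((1,))
lr = 0.1

losses = []
for _ in range(200):
    out = X @ W + b                     # forward pass
    err = out - target
    losses.append((err ** 2).mean())    # MSE loss
    grad_out = 2 * err / len(X)         # dL/dout
    grad_W = X.T @ grad_out             # backprop through the matmul
    grad_b = grad_out.sum(axis=0)
    W -= lr * grad_W                    # gradient-descent update
    b -= lr * grad_b
```

In mdgrad, `loss.backward()` fills in the `grad_W`/`grad_b` equivalents for every layer automatically; the update step stays the same idea.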

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "mdgrad",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "python, tensors, neural networks, automatic differentiation",
    "author": "Jordan Madden",
    "author_email": "<jordanmadden285@gmail.com>",
    "download_url": "https://files.pythonhosted.org/packages/53/ec/42b7dfba183465ea519c7f7407b39a53cb7b9a4215cfacca62da07cddc63/mdgrad-0.3.tar.gz",
    "platform": null,
    "description": "# mdgrad\n\nA small autograd engine that implements backpropagation (reverse-mode autodiff). Heavily inspired by karpathy's [micrograd](https://github.com/karpathy/micrograd/tree/master), and extended to support operations on tensors instead of scalars. Includes a small neural network api for building and training neural networks. Has a PyTorch-like API.\n\nHopefully useful as an educational resource.\n\n## Installation\n\n``` bash\npip install mdgrad\n```\n\n## Example Usage\n\nA dumb example showing supported operations\n\n```python\n\nimport mdgrad\nimport mdgrad.nn as nn\n\na = 3 * mdgrad.randn(3, 2)\nb = mdgrad.ones(shape=(2, 2))\nc = a @ b\nd = c * 3 / 2\ne = d ** 2\nf = e.sum()\nprint(f.data) \nf.backward()\nprint(a.grad) \n```\n\nAn example showing how to define and run a neural network. See the files in `examples/` for more details on building and training models.\n\n```python\n\nimport mdgrad\nimport mdgrad.nn as nn\n\n# Define the model and loss function\nmodel = nn.Sequential(\n    nn.Linear(2, 20),\n    nn.ReLU(),\n    nn.Linear(20, 50), \n    nn.ReLU(),\n    nn.Linear(50, 15),\n    nn.ReLU(),\n    nn.Linear(15, 1),\n    nn.Sigmoid()\n)\nloss_fn = nn.MSELoss()\n\n# Create dummy data\nX = mdgrad.randn(100, 2)\ntarget = mdgrad.randn(100, 1)\n\n# Compute output and loss\nout = model(X)\nloss = loss_fn(out, target)\n\n# Compute gradients of parameters\nloss.backward()\n```\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "Tensor-based autdiff engine and neural network API",
    "version": "0.3",
    "project_urls": null,
    "split_keywords": [
        "python",
        " tensors",
        " neural networks",
        " automatic differentiation"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "72d5c51cc8724e7355b7720efbdca99470073609b40e64013a6fbda48c1d48d8",
                "md5": "9aa352c2b2517be67d4e9e0a79583027",
                "sha256": "fad050d5ef563d6b5348d276420c394aacac7abe2a7e14a1cd1ebfefc01f0d52"
            },
            "downloads": -1,
            "filename": "mdgrad-0.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "9aa352c2b2517be67d4e9e0a79583027",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 12018,
            "upload_time": "2024-07-13T17:55:19",
            "upload_time_iso_8601": "2024-07-13T17:55:19.964692Z",
            "url": "https://files.pythonhosted.org/packages/72/d5/c51cc8724e7355b7720efbdca99470073609b40e64013a6fbda48c1d48d8/mdgrad-0.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "53ec42b7dfba183465ea519c7f7407b39a53cb7b9a4215cfacca62da07cddc63",
                "md5": "5cc438e9c496ad59e0b6b5a4f0854742",
                "sha256": "f4457a9d014a65e7f5972d147db4d0edc248727d316c3cba7f5575d7e4a548fc"
            },
            "downloads": -1,
            "filename": "mdgrad-0.3.tar.gz",
            "has_sig": false,
            "md5_digest": "5cc438e9c496ad59e0b6b5a4f0854742",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 11792,
            "upload_time": "2024-07-13T17:55:21",
            "upload_time_iso_8601": "2024-07-13T17:55:21.571212Z",
            "url": "https://files.pythonhosted.org/packages/53/ec/42b7dfba183465ea519c7f7407b39a53cb7b9a4215cfacca62da07cddc63/mdgrad-0.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-07-13 17:55:21",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "mdgrad"
}
        