clumsygrad 0.2.0

A minimal Python library for automatic differentiation, built on top of NumPy.

- Requires Python: >=3.9
- Keywords: autograd, automatic-differentiation, machine-learning
- Uploaded: 2025-07-12
# ClumsyGrad

[![PyPI version](https://badge.fury.io/py/clumsygrad.svg)](https://badge.fury.io/py/clumsygrad)
[![Docs](https://readthedocs.org/projects/clumsygrad/badge/?version=latest)](https://clumsygrad.readthedocs.io/en/latest/)
[![Tests](https://github.com/Sayan-001/ClumsyGrad/actions/workflows/tests.yml/badge.svg)](https://github.com/Sayan-001/ClumsyGrad/actions/workflows/tests.yml)

A minimal Python library for automatic differentiation, built on top of NumPy. The `Tensor` class builds and expands a computational graph dynamically with each operation.

## Features

- **Dynamic Computational Graphs**: Graphs are created on the fly.
- **Automatic Differentiation**: Compute gradients automatically using the chain rule.
- **Basic Tensor Operations**: Supports addition, subtraction, multiplication, matrix multiplication, power, exp, etc.
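
To illustrate the idea behind these features (a hypothetical toy sketch, not ClumsyGrad's internals): reverse-mode autodiff records each operation as a graph node as it happens, then walks the graph in reverse, applying the chain rule at every node. A minimal scalar version:

```python
class Scalar:
    """Toy scalar that records how it was produced (illustration only)."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = None  # propagates self.grad to parents

    def __add__(self, other):
        out = Scalar(self.data + other.data, (self, other))
        def backward_fn():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backward_fn = backward_fn
        return out

    def __mul__(self, other):
        out = Scalar(self.data * other.data, (self, other))
        def backward_fn():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward_fn = backward_fn
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            if node._backward_fn:
                node._backward_fn()

x = Scalar(2.0)
y = x * x + x   # y = x^2 + x, so dy/dx = 2x + 1 = 5 at x = 2
y.backward()
print(x.grad)   # 5.0
```

ClumsyGrad applies this same principle to NumPy-backed tensors, building the graph one operation at a time.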

## Installation

You can install the library using pip:

```shell
pip install clumsygrad
```

## Basics

Here's a brief overview of how to use the library:

### Creating Tensors

```python
from clumsygrad.tensor import Tensor, TensorType

# Create a tensor from a list (defaults to TensorType.INPUT)
a = Tensor([1.0, 2.0, 3.0])
print(a)
# Output: Tensor(id=0, shape=(3,), tensor_type=INPUT, grad_fn=None, requires_grad=False)

# Create a tensor that requires gradients (e.g., a parameter)
b = Tensor([[4.0], [5.0], [6.0]], tensor_type=TensorType.PARAMETER)
print(b)
# Output: Tensor(id=1, shape=(3, 1), tensor_type=PARAMETER, grad_fn=None, requires_grad=True)
```
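
The shapes in the printed output follow NumPy's conventions, as a quick NumPy-only check (independent of ClumsyGrad) illustrates:

```python
import numpy as np

# The same data as above, held as plain NumPy arrays.
a = np.array([1.0, 2.0, 3.0])        # 1-D vector
b = np.array([[4.0], [5.0], [6.0]])  # 3x1 column matrix

print(a.shape)  # (3,)
print(b.shape)  # (3, 1)
```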

### Performing Operations

```python
from clumsygrad.tensor import Tensor, TensorType
from clumsygrad.math import exp, sin

x = Tensor([1.0, 2.0, 3.0])
y = exp(x**2 + 3*x + 2)
z = sin(y)

# Since tensors default to TensorType.INPUT, no computational graph is built,
# as indicated by grad_fn=None.
print(z) # Tensor(id=6, shape=(3,), tensor_type=INPUT, grad_fn=None, requires_grad=False)
print(z.data) # [0.9648606  0.99041617 0.83529955]

x = Tensor([1.0, 2.0, 3.0], tensor_type=TensorType.PARAMETER)
y = exp(x**2 + 3*x + 2)
z = sin(y)

# Now that x is a PARAMETER, the computational graph is built and z is an
# INTERMEDIATE tensor with a grad_fn.
print(z) # Tensor(id=13, shape=(3,), tensor_type=INTERMEDIATE, grad_fn=sin_backward, requires_grad=True)
print(z.data) # [0.9648606  0.99041617 0.83529955]
```

### Automatic Differentiation (Backpropagation)

Consider the function $z = e^{\sin(x)^2 + \cos(y)}$. We can evaluate $\frac{dz}{dx}$ and $\frac{dz}{dy}$ at a particular point as follows:

```python
from clumsygrad.tensor import Tensor, TensorType
from clumsygrad.math import exp, sin, cos

# Set tensor_type to PARAMETER to ensure gradients are tracked
x = Tensor(1.0, tensor_type=TensorType.PARAMETER)
y = Tensor(0.5, tensor_type=TensorType.PARAMETER)
z = exp(sin(x)**2 + cos(y))

# Calculating dz/dx and dz/dy
z.backward()

# Value of dz/dx
print(x.grad) # [4.43963]
# Value of dz/dy
print(y.grad) # [-2.34079]
```
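
These numbers can be verified by hand: for $z = e^{\sin(x)^2 + \cos(y)}$, the chain rule gives $\frac{dz}{dx} = z \cdot \sin(2x)$ and $\frac{dz}{dy} = -z \cdot \sin(y)$. A plain-NumPy check, independent of the library:

```python
import numpy as np

x, y = 1.0, 0.5
z = np.exp(np.sin(x)**2 + np.cos(y))
dz_dx = z * np.sin(2 * x)   # chain rule: 2*sin(x)*cos(x) = sin(2x)
dz_dy = -z * np.sin(y)
print(dz_dx)  # ~4.43963
print(dz_dy)  # ~-2.34079
```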

## Contributing

Contributions are welcome! If you'd like to contribute, please feel free to fork the repository, make your changes, and submit a pull request. You can also open an issue if you find a bug.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Documentation

For more detailed information, tutorials, and API reference, you can check out the [official documentation](https://clumsygrad.readthedocs.io/en/latest/).

            
