clumsygrad 0.1.2

Summary: A minimal Python library for automatic differentiation, built on top of NumPy.
Homepage: https://github.com/Sayan-001/ClumsyGrad
Maintainer: Sayan Gupta <sayangupta540@gmail.com>
Requires Python: >=3.9
Keywords: autograd, automatic-differentiation, machine-learning
Uploaded: 2025-07-08 18:22:56

# ClumsyGrad

[![PyPI version](https://badge.fury.io/py/clumsygrad.svg)](https://badge.fury.io/py/clumsygrad)
[![Docs](https://readthedocs.org/projects/clumsygrad/badge/?version=latest)](https://clumsygrad.readthedocs.io/en/latest/)

**ClumsyGrad** is a minimal Python library for automatic differentiation, built on top of NumPy. The `Tensor` class builds a computation graph dynamically, expanding it with each operation.

## Features

- **Dynamic Computational Graphs**: Graphs are created on the fly.
- **Automatic Differentiation**: Compute gradients automatically using the chain rule.
- **Basic Tensor Operations**: Addition, subtraction, multiplication, matrix multiplication, element-wise power, `exp`, and more.

## Installation

You can install ClumsyGrad using pip:

```shell
pip install clumsygrad
```
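
To verify the installation, you can ask pip for the package metadata:

```shell
pip show clumsygrad
```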

## Basics

Here's a brief overview of how to use the library:

### Creating Tensors

```python
from clumsygrad.tensor import Tensor, TensorType
import numpy as np

# Create a tensor from a list (defaults to TensorType.INPUT)
a = Tensor([1.0, 2.0, 3.0])
print(a)
# Output: Tensor(id=0, shape=(3,), tensor_type=INPUT, grad_fn=None, requires_grad=False)

# Create a tensor that requires gradients (e.g., a parameter)
b = Tensor([[4.0], [5.0], [6.0]], tensor_type=TensorType.PARAMETER)
print(b)
# Output: Tensor(id=1, shape=(3, 1), tensor_type=PARAMETER, grad_fn=None, requires_grad=True)

# NumPy arrays can also be passed directly to the Tensor constructor
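c = Tensor(np.array([7.0, 8.0, 9.0]))  # equivalent to passing the list [7.0, 8.0, 9.0]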
```

### Performing Operations

```python
from clumsygrad.tensor import Tensor, TensorType

# Define some tensors
x = Tensor([2.0, 3.0], tensor_type=TensorType.PARAMETER)
y = Tensor([4.0, 5.0], tensor_type=TensorType.PARAMETER)
s = Tensor(10.0, tensor_type=TensorType.PARAMETER) # A scalar tensor

# Addition
z_add = x + y
print(f"x + y = {z_add.data}")
# Output: x + y = [6. 7.]

# Element-wise multiplication
z_mul = x * y
print(f"x * y = {z_mul.data}")
# Output: x * y = [ 8. 15.]

# Scalar multiplication
z_scalar_mul = x * s # or x * 10.0
print(f"x * s = {z_scalar_mul.data}")
# Output: x * s = [20. 30.]

# Power
z_pow = x ** 2
print(f"x ** 2 = {z_pow.data}")
# Output: x ** 2 = [4. 9.]

# Matrix multiplication
mat_a = Tensor([[1, 2], [3, 4]], tensor_type=TensorType.PARAMETER)
mat_b = Tensor([[5, 6], [7, 8]], tensor_type=TensorType.PARAMETER)
mat_c = mat_a @ mat_b
print(f"mat_a @ mat_b = \n{mat_c.data}")
# Output: mat_a @ mat_b =
# [[19. 22.]
#  [43. 50.]]
```
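
These operators compose into larger expressions. As a small sketch built only from the `@` and `+` operators shown above, an affine transform `y = W @ x + b` looks like:

```python
from clumsygrad.tensor import Tensor, TensorType

# An affine map y = W @ x + b, composed from the operators above.
W = Tensor([[1.0, 2.0], [3.0, 4.0]], tensor_type=TensorType.PARAMETER)
b = Tensor([[0.5], [-0.5]], tensor_type=TensorType.PARAMETER)
x = Tensor([[1.0], [1.0]])  # a column-vector input

y = W @ x + b
print(y.data)
# Expected output:
# [[3.5]
#  [6.5]]
```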

### Automatic Differentiation (Backpropagation)

For the function `L = sum(a * b + c)`, the gradients of `L` with respect to `a`, `b`, and `c` are computed as follows:

```python
from clumsygrad.tensor import Tensor
from clumsygrad.types import TensorType
import numpy as np

# Define input tensors that require gradients
a = Tensor([2.0, 3.0], tensor_type=TensorType.PARAMETER)
b = Tensor([4.0, 1.0], tensor_type=TensorType.PARAMETER)
c = Tensor([-1.0, 2.0], tensor_type=TensorType.PARAMETER)

# Define the computation
# x = a * b  => x = [8.0, 3.0]
# y = x + c  => y = [7.0, 5.0]
# L = sum(y) => L = 12.0
x = a * b
y = x + c
L = sum(y)

print(f"L = {L.data}")
# Output: L = 12.0

# Perform backpropagation
L.backward()

# Access the gradients
print(f"Gradient of L with respect to a (dL/da): {a.grad}")
# Output: Gradient of L with respect to a (dL/da): [4. 1.]
# (dL/da_i = b_i)

print(f"Gradient of L with respect to b (dL/db): {b.grad}")
# Output: Gradient of L with respect to b (dL/db): [2. 3.]
# (dL/db_i = a_i)

print(f"Gradient of L with respect to c (dL/dc): {c.grad}")
# Output: Gradient of L with respect to c (dL/dc): [1. 1.]
# (dL/dc_i = 1)
```
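
Because gradients are exposed as NumPy arrays and graphs are rebuilt on the fly, a bare-bones gradient-descent loop needs nothing beyond the API shown above. Here is a minimal sketch (it assumes only what the examples demonstrate: the `Tensor` constructor, the arithmetic operators, `backward()`, and the `.data`/`.grad` attributes; the graph is rebuilt from fresh tensors each step):

```python
import numpy as np
from clumsygrad.tensor import Tensor
from clumsygrad.types import TensorType

x_data = np.array([1.0, 2.0, 3.0])
t_data = np.array([2.0, 4.0, 6.0])  # target: t = 2 * x
w_data = np.zeros(3)                # the parameter we want to learn
lr = 0.05

for _ in range(100):
    # Rebuild the graph each step from the current parameter values.
    w = Tensor(w_data, tensor_type=TensorType.PARAMETER)
    x = Tensor(x_data)
    t = Tensor(t_data)
    loss = sum((w * x - t) ** 2)    # scalar loss, same `sum` usage as above
    loss.backward()
    w_data = w_data - lr * w.grad   # plain NumPy parameter update

print(w_data)  # approaches [2. 2. 2.]
```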

## Contributing

Contributions are welcome! If you'd like to contribute, feel free to fork the [repository](https://github.com/Sayan-001/ClumsyGrad), make your changes, and submit a pull request. You can also [open an issue](https://github.com/Sayan-001/ClumsyGrad/issues) if you find a bug.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Documentation

For more detailed information, tutorials, and the API reference, check out the [official documentation](https://clumsygrad.readthedocs.io/en/latest/).

            
