# deepfusion

- **Name:** deepfusion
- **Version:** 0.0.3
- **Homepage:** https://github.com/atharvaaalok/deepfusion
- **Summary:** DeepFusion is a highly modular and customizable deep learning framework!
- **Upload time:** 2024-05-23 09:42:49
- **Author:** Atharva Aalok
- **License:** MIT
- **Keywords:** deepfusion, deep learning, neural networks, artificial intelligence, machine learning, model, optimization, backpropagation
- **Requirements:** cupy_cuda12x, graphviz, matplotlib, numpy, setuptools, typing_extensions
            <div align="center">
    <picture>
        <source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/atharvaaalok/deepfusion/main/assets/logos/Light_TextRight.svg">
        <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/atharvaaalok/deepfusion/main/assets/logos/Dark_TextRight.svg">
        <img alt="DeepFusion Logo with text below it. Displays the light version in light mode and
        the dark version logo in dark mode." src="https://raw.githubusercontent.com/atharvaaalok/deepfusion/main/assets/logos/Light_TextRight.svg" width="100%">
    </picture>
</div>

<br>

DeepFusion is a highly modular and customizable deep learning framework.

It is designed to provide strong and explicit control over all data, operations and parameters while
maintaining a simple and intuitive code base.


## Table of Contents
- [Table of Contents](#table-of-contents)
- [DeepFusion Framework](#deepfusion-framework)
- [Basic Usage](#basic-usage)
- [Highlights](#highlights)
  - [1. Customizable training](#1-customizable-training)
  - [2. Gradient Checking](#2-gradient-checking)
- [Installation](#installation)
  - [1. Basic Installation](#1-basic-installation)
  - [2. GPU Training](#2-gpu-training)
  - [3. Network Visualization](#3-network-visualization)
  - [4. All Dependencies](#4-all-dependencies)
- [Resources](#resources)
- [Contribution Guidelines](#contribution-guidelines)
- [License](#license)
- [Acknowledgements](#acknowledgements)
- [Credits](#credits)


## DeepFusion Framework
In DeepFusion, all networks are composed by combining 3 basic types of `components`:
- `Data`
- `Module`
- `Net`

`Data` objects hold the network activations and `Module` objects perform operations on them. The
`Net` object forms a thin wrapper around the `Data` and `Module` objects and is used to perform
the forward and backward passes.

A simple neural network is shown below, where ellipses represent `Data` objects and rectangles
represent `Module` objects.

![Basic Neural Network](https://raw.githubusercontent.com/atharvaaalok/deepfusion/main/assets/readme_assets/Basic_NeuralNetwork.svg)

> Note the alternating sequence of `Data` and `Module`. The scheme is `Data` -> `Module` -> `Data`.
> Red represents nodes with updatable parameters.

Every node (`Data` or `Module`) has a unique *ID* (e.g. `z1` or `MatMul1`) through which it can be
accessed and modified, thus providing explicit access and control over all data and parameters.

More details on `Data`, `Module` and `Net` functionalities can be found in their respective readmes
in [deepfusion/components](./deepfusion/components/).

This is the basic idea behind DeepFusion: any neural network is created using this procedure of
attaching alternating `Data` and `Module` nodes.
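This pattern can be sketched with toy stand-ins (a conceptual sketch only; the class bodies below are illustrative and not the real deepfusion internals): `Data` holds an array, a `Module` reads its input `Data` and writes its output `Data`, and `Net` just runs the modules in order.

```python
import numpy as np

class Data:
    """Holds an activation array (and, in the real framework, its gradient)."""
    def __init__(self, ID, shape):
        self.ID = ID
        self.val = np.zeros(shape)

class MatMul:
    """Multiplies its input Data by a weight matrix and writes to its output Data."""
    def __init__(self, ID, inputs, output):
        self.ID = ID
        self.inputs = inputs
        self.output = output
        # Parameter shape is inferred from the connected Data objects.
        in_dim = inputs[0].val.shape[-1]
        out_dim = output.val.shape[-1]
        self.W = np.random.randn(in_dim, out_dim) * 0.01

    def forward(self):
        self.output.val = self.inputs[0].val @ self.W

class Net:
    """Thin wrapper that runs the modules in sequence."""
    def __init__(self, modules):
        self.modules = modules
    def forward(self):
        for m in self.modules:
            m.forward()

# Alternating Data -> Module -> Data, as in the diagram above.
x = Data('x', (1, 3))
z1 = Data('z1', (1, 5))
net = Net([MatMul('MatMul1', [x], z1)])
x.val = np.ones((1, 3))
net.forward()
print(z1.val.shape)  # (1, 5)
```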


## Basic Usage
As described before, in DeepFusion, all networks are composed by combining 3 basic types of
`components`:
- `Data`
- `Module`
- `Net`

The codebase follows the same intuitive structure:
```
deepfusion
└── components
    ├── net
    ├── data
    └── modules
        ├── activation_functions
        ├── loss_functions
        └── matmul.py
```

To construct a neural network we need to import the `Net`, `Data` and required `Module` objects.
```python
# Import Net, Data and necessary Modules
from deepfusion.components.net import Net
from deepfusion.components.data import Data
from deepfusion.components.modules import MatMul
from deepfusion.components.modules.activation_functions import Relu
from deepfusion.components.modules.loss_functions import MSELoss
```
> The codebase is designed in an intuitive manner. Let's see how we would think about the above
> imports: "Okay, to create a neural network I need components (deepfusion.components). What kind of
> components do I need? Net, Data and Modules (import these). What kind of modules (operations) do
> I need? Matrix multiplication, an activation function and a loss function (import these).
> That's it!"

To connect `Data` and `Module` objects we need to keep in mind the following 2 things:
- `Data` objects are used to specify the activation *dimensions*.
- `Module` objects require the *inputs* and *output* data objects to be specified.


Now, let's construct the simple network we saw above.
```python
# Basic structure: x -> Matmul -> z1 -> Relu -> a -> Matmul -> z2, plus y -> MSE -> loss
x = Data(ID = 'x', shape = (1, 3))

z1 = Data(ID = 'z1', shape = (1, 5))
Matmul1 = MatMul(ID = 'Matmul1', inputs = [x], output = z1)

a = Data(ID = 'a', shape = (1, 5))
ActF = Relu(ID = 'ActF', inputs = [z1], output = a)

z2 = Data(ID = 'z2', shape = (1, 1))
Matmul2 = MatMul(ID = 'Matmul2', inputs = [a], output = z2)

# Add target variable, loss variable and loss function
y = Data(ID = 'y', shape = (1, 1))
loss = Data(ID = 'loss', shape = (1, 1))
LossF = MSELoss(ID = 'LossF', inputs = [z2, y], output = loss)

# Initialize the neural network
net = Net(ID = 'Net', root_nodes = [loss])
```
> For `Data` objects the first dimension is the batch size, which is specified as 1 during
> initialization. E.g. a length-3 vector would have shape = (1, 3) and a conv volume (C, H, W)
> would have shape = (1, C, H, W). During training any batch size, (B, 3) or (B, C, H, W), can be
> used; the `Net` object takes care of it.

> Module parameter dimensions are inferred from connected data objects.
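These two notes can be illustrated with the shapes alone (a plain-NumPy sketch of the shape arithmetic, not the deepfusion internals): declaring `x` as (1, 3) and `z1` as (1, 5) fixes the MatMul weight matrix at (3, 5), and any batch size then flows through the same matrix.

```python
import numpy as np

# Shapes declared on the Data objects above: x is (1, 3), z1 is (1, 5),
# so the MatMul connecting them must hold a (3, 5) weight matrix.
in_dim, out_dim = 3, 5
W = np.random.randn(in_dim, out_dim)

single = np.random.randn(1, in_dim)   # the shape declared at initialization
batch = np.random.randn(32, in_dim)   # any batch size at training time

print((single @ W).shape)  # (1, 5)
print((batch @ W).shape)   # (32, 5)
```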

Examples introducing the basics and all features of the library can be found in the [demo](./demo/)
directory or in other [resources](#resources).

For the full codebase tree, see [Codebase Tree](./assets/codebase_tree.txt).


## Highlights
### 1. Customizable training
Let's say we build the simple neural network from before and train it:

![Basic Neural Network](https://raw.githubusercontent.com/atharvaaalok/deepfusion/main/assets/readme_assets/Basic_NeuralNetwork.svg)

During training only the *red* portions of the network receive updates and are trained. Therefore,
only the matrix multiplication modules will be trained.

Let's say we have trained the network and now want to find the input that optimizes the function we
have learnt. This also falls under the same forward-backward-update procedure, with the following
simple twist:
```python
net.freeze() # Freezes all modules
x.unfreeze() # Unfreezes the input node
```
After this we obtain the following network:

![Basic Neural Network](https://raw.githubusercontent.com/atharvaaalok/deepfusion/main/assets/readme_assets/Basic_NN_unfrozen_input.svg)
Now when we train the network, only the input node's value receives updates and is trained!
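The idea of optimizing the input of a frozen network can be sketched in plain NumPy (this illustrates the concept only, not the deepfusion API): gradient-descend on `x` while the learnt parameters stay fixed.

```python
import numpy as np

# A frozen "trained" function: loss(x) = (w . x - target)^2.
w = np.array([2.0, -1.0, 0.5])   # fixed (frozen) parameters
target = 3.0

x = np.zeros(3)                  # the unfrozen input we optimize
lr = 0.05
for _ in range(500):
    err = w @ x - target
    grad_x = 2 * err * w         # dLoss/dx; w itself receives no update
    x -= lr * grad_x

print(round(float((w @ x - target) ** 2), 6))  # 0.0 -- x now minimizes the loss
```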

### 2. Gradient Checking
When developing new modules, the implementation of the backward pass can be tricky and prone to
subtle bugs. DeepFusion provides a gradient checking utility that compares the analytic derivatives
of the loss function(s) w.r.t. any specified data object (data node or module parameter) against
numeric estimates. E.g.:
```python
# Compare analytic and numeric gradients with a step size of 1e-6 for:
# Input node: x
gradient_checker(net, data_obj = x, h = 1e-6)
# Matrix multiplication parameter W
gradient_checker(net, data_obj = Matmul1.W, h = 1e-6)
```
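Under the hood, a gradient checker of this kind typically compares the analytic gradient with a central-difference estimate. A minimal standalone sketch of that idea (plain NumPy; this is not the deepfusion implementation):

```python
import numpy as np

def numeric_grad(f, x, h=1e-6):
    """Central-difference estimate of df/dx, one coordinate at a time."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e.flat[i] = h
        g.flat[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# Example: loss f(x) = sum(x^2) has analytic gradient 2x.
x = np.array([1.0, -2.0, 0.5])
f = lambda v: np.sum(v ** 2)

analytic = 2 * x
numeric = numeric_grad(f, x)
print(np.max(np.abs(analytic - numeric)) < 1e-4)  # True
```

A buggy backward pass shows up as a large gap between the two gradients.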

> [!NOTE]
> Other features such as forward and backward pass profiling, multiple loss functions, automated
> training, gpu training etc. can be found in the [demo](./demo/) directory or in other
> [resources](#resources).


## Installation

### 1. Basic Installation
To install the core library use:
```
$ pip install deepfusion
```

### 2. GPU Training
To use GPU training capabilities you will require [CuPy](https://pypi.org/project/cupy/), which
needs the [CUDA Toolkit](https://developer.nvidia.com/cuda-toolkit). If the CUDA Toolkit is
installed then use:
```
$ pip install deepfusion[gpu]
```

### 3. Network Visualization
For visualizing networks you will require the [Graphviz](https://graphviz.org/download/) software
and the [graphviz](https://pypi.org/project/graphviz/) package. If Graphviz is installed then use:
```
$ pip install deepfusion[visualization]
```

### 4. All Dependencies
If all dependencies are pre-installed use:
```
$ pip install deepfusion[gpu,visualization]
```

> [!IMPORTANT]
> Make sure to select the "Add to PATH" option when installing the dependency software.


## Resources
- [DeepFusion documentation]()
- [DeepFusion demo](./demo/)
- [DeepFusion Tutorials]()


## Contribution Guidelines
Contributions for the following are encouraged and greatly appreciated:
- **Code Optimization:** Benchmark your results and show a clear improvement.
- **Visualization:** Currently requires Graphviz, which is usually a pain to install. Structured
  graph visualization using, say, matplotlib would be a clear win.
- **More Modules:** The most scope for contribution is currently in the following modules:
  loss_functions, pooling, normalizations, RNN modules etc.
- **More Features:** Some ideas include adding multiprocessing, working with pre-trained models from
  other libraries etc.
- **Testing:** Incorporating test suites.
- **Improving Documentation:** Improving doc-string clarity and adding doc tests. Also perhaps
  making a website for the API reference.

We'll use [GitHub issues](https://github.com/atharvaaalok/deepfusion/issues) for tracking pull
requests and bugs.


## License
Distributed under the [MIT License](LICENSE).


## Acknowledgements
Theoretical and code ideas inspired from:
- [CS231n: Deep Learning for Computer Vision](https://cs231n.stanford.edu/)
- [Coursera: Deep Learning Specialization](https://www.coursera.org/specializations/deep-learning)
- [Google Python Style Guide](https://github.com/google/styleguide/blob/gh-pages/pyguide.md)
- [Udemy: Python Packaging](https://www.udemy.com/course/python-packaging/?couponCode=LEADERSALE24B)


## Credits
- Logo design by [Ankur Tiwary](https://github.com/ankurTiwxry)

            
