deep-unfolding

Name: deep-unfolding
Version: 0.2.0
Summary: Deep unfolding of iterative methods to solve linear equations
Author: Salah Berra <salahberra39@gmail.com>, Nennouche Mohamed <moohaameed.nennouche@gmail.com>, Nuno Fachada <nuno.fachada@ulusofona.pt>
Homepage: https://github.com/Salahberra2022/deep_unfolding
Documentation: https://Salahberra2022.github.io/deep_unfolding/
Requires-Python: >=3.9
License: GPLv3
Keywords: iterative methods, deep unfolding, linear equations solver
Upload time: 2024-06-18 15:54:58
[![Tests](https://github.com/Salahberra2022/deep-unfolding/actions/workflows/tests.yml/badge.svg)](https://github.com/Salahberra2022/deep-unfolding/actions/workflows/tests.yml)
[![codecov](https://codecov.io/gh/Salahberra2022/deep-unfolding/graph/badge.svg?token=7LPWPLHYC4)](https://codecov.io/gh/Salahberra2022/deep-unfolding)
[![docs](https://img.shields.io/badge/docs-click_here-blue.svg)](https://Salahberra2022.github.io/deep-unfolding/)
[![PyPI](https://img.shields.io/pypi/v/deep-unfolding)](https://pypi.org/project/deep-unfolding/)
![PyPI - Downloads](https://img.shields.io/pypi/dm/deep-unfolding?color=blueviolet)
[![GPLv3](https://img.shields.io/badge/license-GPLv3-yellowgreen.svg)](https://www.tldrlegal.com/license/gnu-general-public-license-v3-gpl-3)

# deep-unfolding: Deep unfolding of iterative methods

The **deep-unfolding** package provides iterative methods for solving linear equations. The convergence rate of these methods depends strongly on a handful of tunable parameters, so those parameters need to be optimized. **deep-unfolding** takes an iterative algorithm with a fixed number of iterations $T$, unrolls its structure, and adds trainable parameters. These parameters are then trained with standard deep learning machinery: a loss function, stochastic gradient descent, and backpropagation.
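
As a concrete illustration of the unrolling idea, here is a minimal, self-contained sketch (generic code, not taken from this package) that unfolds $T$ Richardson iterations with one trainable step size per iteration and trains them the same way the package's models are trained: an MSE loss, Adam, and backpropagation.

```python
import torch
from torch import nn, optim

class UnfoldedRichardson(nn.Module):
    """Generic deep-unfolding sketch: T Richardson steps, each with its own
    trainable step size (not this package's implementation)."""

    def __init__(self, num_itr: int, init_omega: float = 0.05):
        super().__init__()
        # One trainable relaxation parameter per unrolled iteration.
        self.omega = nn.Parameter(torch.full((num_itr,), init_omega))

    def forward(self, A: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        x = torch.zeros_like(b)
        for k in range(self.omega.numel()):
            # Richardson update x <- x + omega_k (b - A x); the omegas are
            # learned by backpropagating through the unrolled loop.
            x = x + self.omega[k] * (b - A @ x)
        return x

# Train on a small synthetic system by minimizing the MSE to a known solution.
torch.manual_seed(0)
n = 8
A = torch.eye(n) + 0.1 * torch.randn(n, n)
A = A @ A.T + n * torch.eye(n)  # well-conditioned SPD matrix
x_true = torch.randn(n)
b = A @ x_true

model = UnfoldedRichardson(num_itr=10)
opt = optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(A, b), x_true)
    loss.backward()
    opt.step()
```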

The package is organized into two modules: `methods`, which contains conventional iterative methods, and `train_methods`, which contains their deep-unfolded, trainable counterparts.

## Installation

```bash
pip install --upgrade pip
pip install deep-unfolding
```

## Quick start

```python
from deep_unfolding import device, evaluate_model, generate_A_H_sol, SORNet, train_model
from torch import nn, optim

total_itr = 25  # Total number of iterations
n = 300  # Number of rows
m = 600  # Number of columns
bs = 10000  # Mini-batch size (samples)
num_batch = 500  # Number of mini-batches
lr_adam = 0.002  # Learning rate of optimizer
init_val_SORNet = 1.1  # Initial value of omega for SORNet

seed = 12

A, H, W, solution, y = generate_A_H_sol(n=n, m=m, seed=seed, bs=bs)
loss_func = nn.MSELoss()

# Model
model_SorNet = SORNet(A, H, bs, y, init_val_SORNet, device=device)

# Optimizer
opt_SORNet = optim.Adam(model_SorNet.parameters(), lr=lr_adam)

trained_model_SorNet, loss_gen_SORNet = train_model(model_SorNet, opt_SORNet, loss_func, solution, total_itr, num_batch)

norm_list_SORNet = evaluate_model(trained_model_SorNet, solution, n, bs, total_itr, device=device)
```
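
Judging from its name and arguments, `norm_list_SORNet` holds one error-norm value per iteration. Assuming that is the case, and assuming matplotlib is installed (it is not a dependency of this package), a quick convergence plot can follow the snippet above:

```python
import matplotlib.pyplot as plt  # assumed to be installed separately

# norm_list_SORNet comes from the quick-start snippet above; each entry is
# assumed to be a scalar error norm for one iteration.
norms = [float(v) for v in norm_list_SORNet]
plt.semilogy(range(len(norms)), norms, marker="o")
plt.xlabel("Iteration")
plt.ylabel("Error norm")
plt.title("SORNet error vs. iteration")
plt.grid(True)
plt.show()
```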

## Package contents

This package implements various iterative techniques for approximating the solution of linear systems of the form $Ax = b$. The conventional methods implemented in the `methods` module are listed below (a short reference sketch of the classical SOR update follows the list):

- **GS**: Gauss-Seidel (GS) algorithm
- **RI**: Richardson iteration algorithm
- **Jacobi**: Jacobi iteration algorithm
- **SOR**: Successive Over-Relaxation (SOR) algorithm
- **SORCheby**: Successive Over-Relaxation (SOR) with Chebyshev acceleration algorithm
- **AOR**: Accelerated Over-Relaxation (AOR) algorithm
- **AORCheby**: Accelerated Over-Relaxation (AOR) with Chebyshev acceleration algorithm
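
For reference, the classical (non-trainable) SOR update behind the `SOR` entry above can be written in a few lines. The sketch below is independent of the package and only illustrates what the conventional solvers compute; `sor_reference` is a hypothetical helper name, not part of the `methods` API.

```python
import torch

def sor_reference(A: torch.Tensor, b: torch.Tensor,
                  omega: float = 1.1, num_itr: int = 25) -> torch.Tensor:
    """Classical SOR sweeps for A x = b (omega = 1 recovers Gauss-Seidel).
    Reference sketch only, not this package's implementation."""
    x = torch.zeros_like(b)
    for _ in range(num_itr):
        for i in range(len(b)):
            # x already holds updated entries for indices < i (Gauss-Seidel
            # ordering); subtract the diagonal term to exclude x[i] itself.
            sigma = A[i] @ x - A[i, i] * x[i]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x

# Small diagonally dominant example; the result approaches A^{-1} b.
A = torch.tensor([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
b = torch.tensor([1.0, 2.0, 3.0])
print(sor_reference(A, b))
```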

This package also implements several models based on **Deep Unfolding Learning**, which optimize the parameters of some of the preceding algorithms to obtain a better approximation. The models implemented in the `train_methods` module are listed below (a conceptual sketch of the unfolded SOR step follows the list):

- **SORNet**: Optimization via Deep Unfolding Learning of the Successive Over-Relaxation (SOR) algorithm
- **SORChebyNet**: Optimization via Deep Unfolding Learning of the Successive Over-Relaxation (SOR) with Chebyshev acceleration algorithm
- **AORNet**: Optimization via Deep Unfolding Learning of the Accelerated Over-Relaxation (AOR) algorithm
- **RINet**: Optimization via Deep Unfolding Learning of the Richardson iteration (RI) algorithm
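
Conceptually, a model such as **SORNet** unfolds the matrix form of the SOR step, $x \leftarrow (D + \omega L)^{-1}\left(\omega b + ((1 - \omega) D - \omega U)\, x\right)$, and makes the relaxation factor $\omega$ a trainable parameter. The sketch below illustrates that idea only; the actual classes in `train_methods` take the constructor arguments shown in the quick start.

```python
import torch
from torch import nn

class TrainableSORSketch(nn.Module):
    """Unrolled SOR with a single trainable relaxation factor omega.
    Conceptual sketch only, not the SORNet class from this package."""

    def __init__(self, A: torch.Tensor, init_omega: float = 1.1):
        super().__init__()
        self.D = torch.diag(torch.diagonal(A))  # diagonal part of A
        self.L = torch.tril(A, diagonal=-1)     # strictly lower part
        self.U = torch.triu(A, diagonal=1)      # strictly upper part
        self.omega = nn.Parameter(torch.tensor(init_omega))

    def forward(self, b: torch.Tensor, num_itr: int = 25) -> torch.Tensor:
        w = self.omega
        M = self.D + w * self.L                 # D + wL
        N = (1 - w) * self.D - w * self.U       # (1 - w)D - wU
        x = torch.zeros_like(b)
        for _ in range(num_itr):
            # One SOR step in matrix form; gradients flow back to omega.
            x = torch.linalg.solve(M, w * b + N @ x)
        return x
```

With the iteration count fixed, the unrolled loop behaves like a $T$-layer network whose only weight is $\omega$, so it can be trained with the same loss/optimizer loop used in the quick start.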

## Reference

If you use this software, please cite the following reference: *available soon*

## License

[GPLv3 License](LICENSE)

            
