tensorly-torch


Name: tensorly-torch
Version: 0.4.0
Home page: https://github.com/tensorly/tensorly-torch
Summary: Deep Learning with Tensors in Python, using PyTorch and TensorLy.
Upload time: 2023-01-15 00:54:49
Author: Jean Kossaifi
License: Modified BSD
==============
TensorLy-Torch
==============

TensorLy-Torch is a Python library for deep tensor networks that
builds on top of `TensorLy <https://github.com/tensorly/tensorly/>`_
and `PyTorch <https://pytorch.org/>`_.
It makes it easy to leverage tensor methods in a deep learning setting and comes with all batteries included.

- **Website:** http://tensorly.org/torch/
- **Source-code:**  https://github.com/tensorly/torch


With TensorLy-Torch, you can easily:

- **Factorize tensors**: decomposing, manipulating, and initializing tensor decompositions can be tricky. We take care of it all, in a convenient, unified API.
- **Leverage structure in your data**: with tensor layers, you can easily exploit the structure in your data, through Tensor Regression Layers, Factorized Convolutions, etc.
- **Use built-in tensor layers**: all you have to do is import TensorLy-Torch and include the layers we provide directly within your PyTorch models (see the sketch below this list)!
- **Apply tensor hooks**: you can easily augment your architectures with our built-in Tensor Hooks. Robustify your network with Tensor Dropout and automatically select the rank end-to-end with L1 Regularization!
- **Compare all the methods available**: we are always adding more methods to make it easy to compare the performance of various deep tensor-based methods!
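
A minimal sketch of what this looks like in practice, assuming the ``tltorch`` import name
and the ``FactorizedTensor`` / ``FactorizedLinear`` / ``tensor_dropout`` API from the
TensorLy-Torch documentation (check the docs for the exact signatures in your version):

.. code:: python

   import torch
   import tltorch

   # A rank-controlled factorized tensor (Tucker here; CP and TT also work)
   weights = tltorch.FactorizedTensor.new((16, 16, 3, 3), rank=0.5, factorization='tucker')

   # Tensor hook: tensor dropout applied directly to the factorization
   weights = tltorch.tensor_hooks.tensor_dropout(weights, p=0.2)

   # A factorized (tensorized) linear layer, usable like any nn.Module
   layer = tltorch.FactorizedLinear(in_tensorized_features=(4, 4),
                                    out_tensorized_features=(4, 4),
                                    rank=0.5, factorization='tt')
   x = torch.randn(8, 16)
   print(layer(x).shape)  # torch.Size([8, 16])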

Deep Tensorized Learning
========================

Tensor methods generalize matrix algebraic operations to higher orders. Deep neural networks typically map between higher-order tensors.
In fact, it is the ability of deep convolutional neural networks to preserve and leverage local structure that, along with large datasets and efficient hardware, made the current levels of performance possible.
Tensor methods let you further leverage and preserve that structure, for individual layers or whole networks.

.. image:: ./doc/_static/tensorly-torch-pyramid.png

TensorLy is a Python library that aims to make tensor learning simple and accessible.
It provides a high-level API for tensor methods, including core tensor operations, tensor decomposition and regression. 
It has a flexible backend that allows running operations seamlessly using NumPy, PyTorch, TensorFlow, JAX, MXNet and CuPy.
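
As a short illustration of the flexible backend, the public TensorLy API (``set_backend``,
``parafac``, ``cp_to_tensor``) lets you decompose a PyTorch tensor directly; this is a
sketch of plain TensorLy, not yet a TensorLy-Torch layer:

.. code:: python

   import torch
   import tensorly as tl
   from tensorly.decomposition import parafac

   tl.set_backend('pytorch')  # TensorLy operations now run on PyTorch tensors

   x = torch.randn(8, 8, 8)
   cp_tensor = parafac(x, rank=4)           # CP decomposition: weights + factor matrices
   x_rec = tl.cp_to_tensor(cp_tensor)       # reconstruct the (approximated) full tensor
   print(tl.norm(x - x_rec) / tl.norm(x))   # relative reconstruction error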

**TensorLy-Torch** is a PyTorch-only library that builds on top of TensorLy and provides out-of-the-box tensor layers.

Improve your neural networks with tensor methods
------------------------------------------------


In TensorLy-Torch, we provide convenient layers that do all the heavy lifting for you,
delivering the benefits of tensor-based layers wrapped in a nice, well-documented, and tested API.

For instance, convolution layers of any order (2D, 3D, or more) can be efficiently parametrized
using tensor decomposition. Using a CP decomposition results in a separable convolution,
so you can replace your original convolution with a series of small, efficient ones:

.. image:: ./doc/_static/cp-conv.png 

These can easily be performed with FactorizedConv in TensorLy-Torch.
We also have Tucker convolutions and new tensor-train convolutions!
We also implement various other methods such as tensor regression and contraction layers,
tensorized linear layers, tensor dropout, and more!
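
For example, here is a minimal sketch of replacing a standard 2D convolution with a
CP-factorized one, assuming the ``FactorizedConv.from_conv`` helper described in the
TensorLy-Torch documentation (argument names may differ slightly between versions):

.. code:: python

   import torch
   from torch import nn
   import tltorch

   # A regular 2D convolution...
   conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)

   # ...replaced by a CP-factorized (separable) convolution; the pretrained
   # weights can optionally be decomposed to initialize the factors.
   fact_conv = tltorch.FactorizedConv.from_conv(
       conv, rank=0.1, factorization='cp', decompose_weights=True)

   x = torch.randn(2, 64, 32, 32)
   print(fact_conv(x).shape)  # same output shape as the original conv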


Installing TensorLy-Torch
=========================

Through pip
-----------

.. code:: bash

   pip install tensorly-torch


From source
-----------

.. code:: bash

  git clone https://github.com/tensorly/torch
  cd torch
  pip install -e .
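
Once installed, you can check that the package is importable (the import name is
``tltorch``, not ``tensorly_torch``):

.. code:: python

   import tltorch
   print(tltorch.__version__)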








            
