# PyTorch implementation of Maximum Manifold Capacity Representations Loss
> This is not an official implementation from the authors.
> [Official implementation from the authors](https://github.com/ThomasYerxa/mmcr).
Maximum Manifold Capacity Representation Loss (MMCR Loss) is a novel objective function for self-supervised learning (SSL) proposed by researchers at the Center for Neural Science, NYU.
This repository aims to offer a convenient MMCR loss module for PyTorch, which can be easily integrated into your projects using `git clone` or `pip install`.
## How to install
```sh
pip3 install mmcr
```
or
```sh
git clone https://github.com/skyil7/mmcr
cd mmcr
pip install -e .
```
## Usage
```python
import torch
from mmcr import MMCRLoss
loss = MMCRLoss()
input_tensor = torch.randn((8, 16, 128)) # batch_size, n_aug, feature_dim
loss_val = loss(input_tensor)
print(loss_val)
```
## How it works
```math
\mathcal{L} = \lambda\frac{\sum^{N}_{i=1}\lVert z_{i} \rVert_{*}}{N} - \lVert C\rVert_{*}
```
where $\lambda$ is a trade-off parameter, $`\lVert z_i\rVert_*`$ is the local nuclear norm of the $i$-th sample's matrix of augmented embeddings, and $`\lVert C\rVert_*`$ is the global nuclear norm of the centroid matrix $C$.
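
As a sanity check on the formula, here is a minimal sketch of how the two terms can be computed with `torch.linalg.matrix_norm`. It mirrors the equation above rather than the package's exact internals, and the unit-normalization of embeddings is an assumption carried over from the MMCR setup:

```python
import torch
import torch.nn.functional as F

def mmcr_loss_sketch(z: torch.Tensor, lmbda: float = 0.0) -> torch.Tensor:
    """Compute the MMCR objective for z of shape (N, k, d) = (batch, n_aug, feature_dim)."""
    # Project every embedding onto the unit sphere (assumed, as in the MMCR setup).
    z = F.normalize(z, dim=-1)
    # Local term: average nuclear norm of each sample's (k, d) augmentation matrix.
    local = torch.linalg.matrix_norm(z, ord="nuc").mean()
    # Global term: nuclear norm of the (N, d) matrix of per-sample centroids.
    centroids = z.mean(dim=1)
    global_term = torch.linalg.matrix_norm(centroids, ord="nuc")
    return lmbda * local - global_term
```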
### Arguments
- `lmbda`: Trade-off parameter $\lambda$. Defaults to 0.
- `n_aug`: Number of augmented views. If your input tensor is 3-dimensional $(N, k, d)$, it does not need to be specified; see the snippet below.
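
For instance, with the 3-dimensional input layout from the Usage section, only `lmbda` needs to be set explicitly. This is a small illustrative snippet; the exact layout expected alongside an explicit `n_aug` for non-3-D inputs depends on the implementation:

```python
import torch
from mmcr import MMCRLoss

# n_aug is inferred from the second dimension of an (N, k, d) input;
# it would only need to be passed explicitly for inputs in another layout.
loss = MMCRLoss(lmbda=0.1)
views = torch.randn((8, 16, 128))  # (batch_size, n_aug, feature_dim)
print(loss(views))
```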
## Original implementation from the authors
- This repository was developed with reference to the [official implementation](https://github.com/ThomasYerxa/mmcr) provided by the authors.