| Field | Value |
|-------|-------|
| Name | emonet-py |
| Version | 1.0.6 |
| home_page | None |
| Summary | A PyTorch port of the MatLab EmoNet network by Kragel et al., 2019. |
| upload_time | 2024-07-03 08:19:22 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | Copyright (c) 2024 Laurent Mertens Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
| keywords | ann, pytorch, neural networks, emotion recognition |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# EmoNet: A PyTorch port
This package contains a PyTorch port of the EmoNet network, originally developed in MatLab and described in the paper "[Emotion schemas are
embedded in the human visual system](https://www.science.org/doi/10.1126/sciadv.aaw4358)" by Kragel et al., 2019. It is distributed with explicit permission from the original (first) author.
The original model can be found at:
[https://github.com/ecco-laboratory/EmoNet](https://github.com/ecco-laboratory/EmoNet)\
A PyTorch port by the same lab is also available at: [https://github.com/ecco-laboratory/emonet-pytorch](https://github.com/ecco-laboratory/emonet-pytorch).
## Installation
To install this package, either clone the [GitLab project](https://gitlab.com/EAVISE/lme/emonet), or install it from PyPI using
```pip install emonet-py```.
| ℹ️ If you install through ```pip```, the ```data``` folder is not included, as it is too big for distribution through PyPI. <br/> Instead, the data will be downloaded automatically from the GitLab repository by the ```check_data_files.py``` script when running the code. <br/> The files will automatically be deleted when uninstalling the package. |
|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
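For reference, here is a minimal, optional sanity check (not part of the package API) that the expected data files are present locally; the location of the ```data``` folder (assumed here to sit one level above the installed ```emonet_py``` package) and the exact file names are assumptions based on the Contents section below.

```python
# Optional sanity check, not part of the package API: verify that the data
# files fetched by check_data_files.py are present. The assumed location of
# the data folder (one level above the installed emonet_py package) may
# differ on your system.
from pathlib import Path

import emonet_py

data_dir = Path(emonet_py.__file__).resolve().parent.parent / "data"
expected = ["img_mean.txt", "emonet.pth", "demo_big.jpg"]  # names taken from the Contents section
missing = [name for name in expected if not (data_dir / name).exists()]
print("All data files present." if not missing else f"Missing files: {missing}")
```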
## Contents
The ```data``` folder contains:
- The original model parameters, as exported from MatLab (```*.bz2``` files).
- The original mean pixel values used to preprocess images (```img_mean.txt```).
- Two demo images to verify the integrity of the port, i.e., that the outputs generated by the PyTorch model closely match the original
MatLab model outputs (```demo_*.jpg```).
- A PyTorch ```state_dict``` object containing the PyTorch translation of the original weights, to be used in conjunction
with an AlexNetBig instance to obtain the EmoNet model (```emonet.pth```); a sketch of this follows the list.
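For illustration only, this is roughly what loading that ```state_dict``` could look like, assuming ```alexnet_big.py``` exposes a class named ```AlexNetBig``` with a no-argument constructor; in practice the ```EmoNet``` class described below takes care of this step for you.

```python
# Conceptual sketch only; the EmoNet class in emonet.py wraps this step.
# Assumes alexnet_big.py exposes AlexNetBig() and that emonet.pth holds a
# plain PyTorch state_dict, as described above.
import torch

from emonet_py.alexnet_big import AlexNetBig  # assumed import path

model = AlexNetBig()  # constructor arguments, if any, are not documented here
state_dict = torch.load("data/emonet.pth", map_location="cpu")
model.load_state_dict(state_dict)
model.eval()
```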
The package ```emonet_py``` contains the following scripts:
- ```alexnet_big.py```: defines the original AlexNet architecture, as opposed to the updated version that ships with
```torchvision.models```.
- ```check_data_files.py```: a script that will check whether the ```data``` folder, and all expected files in it, are present. If not, they will be downloaded automatically from the GitLab repository.
- ```convert_emonet_matlab_weights.py```: this script can be used to translate the MatLab model parameters to PyTorch. See its internal
documentation for details on this process.
- ```demo.py```: a demonstration script showing how to use the model.
- ```emonet.py```: defines the EmoNet model, as well as a class, EmoNetPreProcess, to load and preprocess images
with the same image normalization as the original MatLab model.
- ```emonet_arousal.py```: an arousal prediction model, consisting of an extra linear layer following the EmoNet output layer (see paper).
- ```emonet_valence.py```: a valence prediction model with the same structure, i.e., an extra linear layer following the EmoNet output layer (see paper, and the sketch below this list).
- ```test_integrity.py```: a unit test to check the integrity of the ported model.
Note that the arousal and valence models are also ports of the original models.
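As a rough illustration of that structure (a sketch under assumptions, not the package's actual classes), an arousal or valence head amounts to a single linear layer on top of the EmoNet output, here assumed to have 20 emotion categories as in Kragel et al., 2019:

```python
# Conceptual sketch of the arousal/valence heads described above; the real
# implementations live in emonet_arousal.py and emonet_valence.py. The
# 20-category EmoNet output size is an assumption based on the original paper.
import torch
import torch.nn as nn


class AffectHead(nn.Module):  # hypothetical name, for illustration only
    def __init__(self, emonet: nn.Module, n_categories: int = 20):
        super().__init__()
        self.emonet = emonet
        self.linear = nn.Linear(n_categories, 1)  # the extra linear layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # One scalar (arousal or valence) per image in the batch.
        return self.linear(self.emonet(x))
```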
## Usage
To load and use EmoNet, simply do the following (see ```emonet_py/demo.py```):
```python
import os

from emonet_py.emonet import EmoNet, EmoNetPreProcess
from emonet_py.emonet_arousal import EmoNetArousal
from emonet_py.emonet_valence import EmoNetValence

if __name__ == '__main__':
    # Load the ported EmoNet model in evaluation mode, together with the
    # preprocessing pipeline that applies the original MatLab normalization.
    emonet = EmoNet(b_eval=True)
    emonet_pp = EmoNetPreProcess()

    # Load and preprocess one of the demo images from the data folder.
    img_big = os.path.join('..', 'data', 'demo_big.jpg')
    img_loaded = emonet_pp(img_big)

    # Run the network on a batch of one image and pretty-print the prediction.
    pred = emonet.emonet(img_loaded.unsqueeze(0))
    emonet.prettyprint(pred, b_pc=True)

    # Arousal and valence predictions reuse the same preprocessed input.
    emo_aro = EmoNetArousal()
    print(f"Arousal: {emo_aro(img_loaded.unsqueeze(0))}")

    emo_val = EmoNetValence()
    print(f"Valence: {emo_val(img_loaded.unsqueeze(0))}")
```
## Licensing
This repository is made available under an MIT license (see [LICENSE.md](./LICENSE.md)).
This is in agreement with the original repository, which also uses an MIT license.
Author: Laurent Mertens\
Mail: [laurent.mertens@kuleuven.be](mailto:laurent.mertens@kuleuven.be)
Raw data
{
"_id": null,
"home_page": null,
"name": "emonet-py",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": "Laurent Mertens <laurent.mertens@kuleuven.be>",
"keywords": "ANN, PyTorch, Neural Networks, Emotion Recognition",
"author": null,
"author_email": "Laurent Mertens <laurent.mertens@kuleuven.be>",
"download_url": "https://files.pythonhosted.org/packages/69/fe/3991c6273285dee5173cbd4ce0a5252be0b789f1870a606e12f737106421/emonet_py-1.0.6.tar.gz",
"platform": null,
"description": "# EmoNet: A PyTorch port\nThis package contains a PyTorch port of the EmoNet network originally developed in MatLab described in the paper \"[Emotion schemas are\nembedded in the human visual system](https://www.science.org/doi/10.1126/sciadv.aaw4358)\" by Krager et al., 2019. It is being distributed with explicit permission from the original (first) author.\n\nThe original model can be found at:\n[https://github.com/ecco-laboratory/EmoNet](https://github.com/ecco-laboratory/EmoNet)\\\nA PyTorch port by this lab is also available at: [https://github.com/ecco-laboratory/emonet-pytorch](https://github.com/ecco-laboratory/emonet-pytorch).\n## Installation\nTo install this repository, either clone the [GitLab project](https://gitlab.com/EAVISE/lme/emonet), or install using\n```pip install emonet-py```.\n\n| \u2139\ufe0f If you install through ```pip```, the ```data``` folder is not included, as it is too big for distribution through PyPI. </br> Instead, the data will be downloaded automatically from the GitLab repository by the ```check_data_files.py``` script when running the code. </br> The files will automatically be deleted when uninstalling the package. |\n|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n\n## Contents\nThe ```data``` folder contains:\n- The original model parameters, as exported from MatLab (```*.bz2``` files).\n- The original mean pixel values used to preprocess images (```img_mean.txt```).\n- Two demo images to verify the integrity of the port, i.e., that the outputs generated by the PyTorch model closely match the original\nMatLab model outputs (```demo_*.jpg```).\n- A PyTorch ```state_dict``` object containing the PyTorch translation of the original weights, to be used in conjunction\nwith an AlexNetBig instance to obtain the EmoNet model (```emonet.pth```).\n\nThe package ```emonet_py``` contains the following scripts:\n- ```alexnet_big.py```: defines the original AlexNet model, compared to the updated version that comes with\n```torchvision.models```.\n- ```check_data_files.py```: a script that will check whether the ```data``` folder, and all expected files in it, are present. If not, they will be downloaded automatically from the GitLab repository.\n- ```convert_emonet_matlab_weights.py```: this script can be used to translate the MatLab model parameters to PyTorch. 
See its internal\ndocumentation for details on this process.\n- A demonstration script showing how to use the model (```demo.py```).\n- ```emonet.py```: the script defining the EmoNet model, as well as a class, EmoNetPreProcess, to load and preprocess images\nusing the same image normalization used by the original MatLab model.\n- ```emonet_arousal.py```: an arousal prediction model, consisting of an extra linear layer following the EmoNet output layer (see paper).\n- ```emonet_valence.py```: a valence prediction model, consisting of an extra linear layer following the EmoNet output layer (see paper).\n- ```test_integrity.py```: a UnitTest to check the integrity of the ported model.\nNote that the arousal and valence models are also ports of the original models.\n\n## Usage\nTo load and use EmoNet, simply do (see ```emonet.py/demo.py```):\n```\nimport os\nfrom emonet_py.emonet import EmoNet, EmoNetPreProcess\nfrom emonet_py.emonet_arousal import EmoNetArousal\nfrom emonet_py.emonet_valence import EmoNetValence\n\nif __name__ == '__main__':\n emonet = EmoNet(b_eval=True)\n emonet_pp = EmoNetPreProcess()\n img_big = os.path.join('..', 'data', 'demo_big.jpg')\n img_loaded = emonet_pp(img_big)\n pred = emonet.emonet(img_loaded.unsqueeze(0))\n emonet.prettyprint(pred, b_pc=True)\n\n emo_aro = EmoNetArousal()\n print(f\"Arousal: {emo_aro(img_loaded.unsqueeze(0))}\")\n\n emo_val = EmoNetValence()\n print(f\"Valence: {emo_val(img_loaded.unsqueeze(0))}\")\n```\n\n## Licensing\nThis repository is made available under an MIT license (see [LICENSE.md](./LICENSE.md)).\nThis is in agreement with the original repository, which also uses an MIT license.\n\nAuthor: Laurent Mertens\\\nMail: [laurent.mertens@kuleuven.be](laurent.mertens@kuleuven.be)\n",
"bugtrack_url": null,
"license": "Copyright (c) 2024 Laurent Mertens Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ",
"summary": "A PyTorch port of the MatLab EmoNet network by Kragel et al., 2019.",
"version": "1.0.6",
"project_urls": {
"Changelog": "https://gitlab.com/EAVISE/lme/emonet/blob/master/CHANGELOG.md",
"Homepage": "https://gitlab.com/EAVISE/lme/emonet"
},
"split_keywords": [
"ann",
" pytorch",
" neural networks",
" emotion recognition"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "b0ea45c5cc58c223ec326855618630e9da797ff624dfa1f0e244db2e333b3f08",
"md5": "12fdd80ca8d178634c1b8caca3f8d0bb",
"sha256": "df30912f305c17c54e8d71400b31c3b0b932cb25e020829d3f4f339f8e89e7af"
},
"downloads": -1,
"filename": "emonet_py-1.0.6-py3-none-any.whl",
"has_sig": false,
"md5_digest": "12fdd80ca8d178634c1b8caca3f8d0bb",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 14949,
"upload_time": "2024-07-03T08:19:20",
"upload_time_iso_8601": "2024-07-03T08:19:20.854127Z",
"url": "https://files.pythonhosted.org/packages/b0/ea/45c5cc58c223ec326855618630e9da797ff624dfa1f0e244db2e333b3f08/emonet_py-1.0.6-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "69fe3991c6273285dee5173cbd4ce0a5252be0b789f1870a606e12f737106421",
"md5": "b341913ac0fe9476ad35533f4b237168",
"sha256": "62e877db38642e58379ce126ff87e9bbf1c9cefdf8e25f741691f887a2e8a312"
},
"downloads": -1,
"filename": "emonet_py-1.0.6.tar.gz",
"has_sig": false,
"md5_digest": "b341913ac0fe9476ad35533f4b237168",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 13332,
"upload_time": "2024-07-03T08:19:22",
"upload_time_iso_8601": "2024-07-03T08:19:22.702854Z",
"url": "https://files.pythonhosted.org/packages/69/fe/3991c6273285dee5173cbd4ce0a5252be0b789f1870a606e12f737106421/emonet_py-1.0.6.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-07-03 08:19:22",
"github": false,
"gitlab": true,
"bitbucket": false,
"codeberg": false,
"gitlab_user": "EAVISE",
"gitlab_project": "lme",
"lcname": "emonet-py"
}