# MGT-python
[![PyPi version](https://badgen.net/pypi/v/musicalgestures/)](https://pypi.org/project/musicalgestures)
[![GitHub license](https://img.shields.io/github/license/fourMs/MGT-python.svg)](https://github.com/fourMs/MGT-python/blob/master/LICENSE)
[![CI](https://github.com/fourMs/MGT-python/actions/workflows/ci.yml/badge.svg)](https://github.com/fourMs/MGT-python/actions/workflows/ci.yml)
[![Documentation Status](https://readthedocs.org/projects/mgt-python/badge/?version=latest)](https://mgt-python.readthedocs.io/en/latest/?badge=latest)
The Musical Gestures Toolbox for Python is a collection of tools for visualization and analysis of audio and video.
![MGT python](https://raw.githubusercontent.com/fourMs/MGT-python/master/musicalgestures/documentation/figures/promo/ipython_example.gif)
## Usage
The easiest way to get started is to take a look at the Jupyter notebook [MusicalGesturesToolbox](https://github.com/fourMs/MGT-python/blob/master/musicalgestures/MusicalGesturesToolbox.ipynb), which shows examples of the usage of the toolbox.
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/fourMs/MGT-python/blob/master/musicalgestures/MusicalGesturesToolbox.ipynb)
For a standard installation via `pip`, paste and run the following command in the Terminal (macOS, Linux) or PowerShell (Windows):
`pip install musicalgestures`
MGT is developed in Python 3 and relies on `FFmpeg` and `OpenCV`. See the [wiki documentation](https://github.com/fourMs/MGT-python/wiki#installation) for more details on the installation process.
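As a quick check that the installation works, here is a minimal sketch that loads a video and renders a motion video from it. The `MgVideo` class, the bundled `examples.dance` clip, and the method names follow the project notebook; treat them as assumptions and consult the wiki documentation for the full API.

```python
import musicalgestures

# Load a video file; musicalgestures.examples.dance is the sample clip
# shipped with the package (any local video path works as well).
video = musicalgestures.MgVideo(musicalgestures.examples.dance)

# Render a motion video (a frame-differenced version of the source).
video.motion()

# Render horizontal and vertical motiongrams of the same clip.
video.motiongrams()
```

Each call writes its output file next to the source video, so results can be inspected directly in a file browser or embedded in a notebook.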
## Description
Watch a 10-minute introduction to the toolbox:
[![Video](https://www.uio.no/ritmo/english/research/labs/fourms/software/musicalgesturestoolbox/mgt-python/video/nordicsmc2021-thumbnail.png)](https://youtu.be/tZVX_lDFrwc)
MGT can generate both dynamic and static visualizations of video files, including motion videos, history videos, average images, motiongrams, and videograms. It can also extract various features from video files, including the quantity, centroid, and area of motion. The toolbox also integrates well with other libraries, such as OpenPose for skeleton tracking and librosa for audio analysis. All the features are described in the [wiki documentation](https://github.com/fourMs/MGT-python/wiki).
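To illustrate the feature-extraction side, the sketch below pulls frame-by-frame motion features and a spectrogram from the same clip. The `motiondata()` and `audio.spectrogram()` names follow the project notebook; treat the exact signatures as assumptions and check the wiki for details.

```python
import musicalgestures

video = musicalgestures.MgVideo(musicalgestures.examples.dance)

# Export frame-by-frame motion features (quantity, centroid, and area
# of motion) to a data file next to the source video.
video.motiondata()

# The audio track can be analysed as well; this renders a spectrogram
# using the toolbox's librosa-based audio tools.
video.audio.spectrogram()
```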
## History
This toolbox builds on the [Musical Gestures Toolbox for Matlab](https://github.com/fourMs/MGT-matlab/), which in turn builds on the [Musical Gestures Toolbox for Max](https://www.uio.no/ritmo/english/research/labs/fourms/software/musicalgesturestoolbox/mgt-max/).
The software is currently maintained by the [fourMs lab](https://github.com/fourMs) at [RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion](https://www.uio.no/ritmo/english/) at the University of Oslo.
## Reference
If you use this toolbox in your research, please cite this article:
- Laczkó, B., & Jensenius, A. R. (2021). [Reflections on the Development of the Musical Gestures Toolbox for Python](https://www.duo.uio.no/bitstream/handle/10852/89331/Laczk%25C3%25B3_et_al_2021_Reflections_on_the_Development_of_the.pdf?sequence=2&isAllowed=y). *Proceedings of the Nordic Sound and Music Computing Conference*, Copenhagen.
## Credits
Developers: [Balint Laczko](https://github.com/balintlaczko), [Joachim Poutaraud](https://github.com/joachimpoutaraud), [Frida Furmyr](https://github.com/fridafu), [Marcus Widmer](https://github.com/marcuswidmer), [Alexander Refsum Jensenius](https://github.com/alexarje/)
## License
This toolbox is released under the [GNU General Public License 3.0 license](https://www.gnu.org/licenses/gpl-3.0.en.html).