| Field | Value |
| --- | --- |
| Name | mixentropy |
| Version | 0.0.1 |
| home_page | None |
| Summary | Tools for estimating entropy of mixture distribution and more. |
| upload_time | 2024-05-10 02:07:36 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.8 |
| license | MIT License Copyright (c) 2024 grizxlyzx Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
| keywords | machine learning, deep learning, entropy |
| VCS | https://github.com/grizxlyzx/MixEntroPY |
| bugtrack_url | None |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# MixEntroPY
NumPy & PyTorch vectorized implementations of various differential entropy estimators for ML & DL.
## About The Project
Information-theory-inspired machine learning methods are attracting increasing interest, with estimation of the entropy and mutual information of random variables serving as their cornerstone.
To estimate the Shannon entropy of a _discrete_ random variable, given its probability distribution, one can simply apply the definition of Shannon entropy $H(x)=-\sum_i p(x_i)\log p(x_i)$ to obtain an accurate result.
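For example, a direct NumPy implementation of this definition (in nats; a generic sketch, not the package's API) looks like:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(x) = -sum_i p(x_i) log p(x_i), in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log(0) contributes 0
    return -np.sum(p * np.log(p))

print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.0397... nats
```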
But when it comes to estimating differential entropy, $h(x)=-\int p(x)\log p(x)dx$, from data points sampled from datasets or models, there is often no prior knowledge about the underlying distribution.
In such cases, we can assume a form for the unknown distribution, choosing one whose entropy has a closed-form expression, e.g., a multivariate Gaussian.
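For a $d$-dimensional Gaussian $\mathcal{N}(\mu, \Sigma)$, that closed form is $h = \frac{1}{2}\log\left((2\pi e)^d \det\Sigma\right)$. A minimal NumPy sketch of it (again, not the package's API):

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of N(mu, cov) in nats:
    h = 0.5 * log((2*pi*e)^d * det(cov))."""
    d = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)  # numerically stable log-determinant
    assert sign > 0, "covariance must be positive definite"
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

print(gaussian_entropy(np.eye(2)))  # 2.8378... nats
```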
_Kernel Density Estimation (KDE)_ is one of the most commonly used methods to approximate the probability density of a distribution. However, while a single kernel may have a closed-form expression for its entropy, a mixture of such kernels typically does not.
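To make the gap concrete: the entropy of a mixture $p(x)=\sum_k w_k\, p_k(x)$ has no closed form even when every $p_k$ is Gaussian, so a common baseline is the Monte Carlo estimate $h \approx -\frac{1}{N}\sum_n \log p(x_n)$ with $x_n \sim p$. An illustrative NumPy sketch for a Gaussian mixture (not the package's estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

def mog_logpdf(x, w, means, covs):
    """Log density of a Gaussian mixture at points x of shape (n, d)."""
    terms = []
    for wk, mu, cov in zip(w, means, covs):
        d = mu.shape[0]
        diff = x - mu
        _, logdet = np.linalg.slogdet(cov)
        quad = np.einsum('ni,ij,nj->n', diff, np.linalg.inv(cov), diff)
        terms.append(np.log(wk) - 0.5 * (d * np.log(2 * np.pi) + logdet + quad))
    return np.logaddexp.reduce(terms, axis=0)  # log sum_k w_k N(x; mu_k, cov_k)

def mog_entropy_mc(w, means, covs, n=100_000):
    """Monte Carlo estimate h = -E[log p(x)], with x sampled from the mixture."""
    counts = rng.multinomial(n, w)
    x = np.concatenate([rng.multivariate_normal(mu, cov, size=c)
                        for mu, cov, c in zip(means, covs, counts)])
    return -mog_logpdf(x, w, means, covs).mean()

# Two well-separated 1-D components: h ~ log(2) + 0.5*log(2*pi*e) ~ 2.112 nats
w = np.array([0.5, 0.5])
means = [np.array([-5.0]), np.array([5.0])]
covs = [np.eye(1), np.eye(1)]
print(mog_entropy_mc(w, means, covs))
```

Such sampling-based estimates are consistent but noisy and awkward to backpropagate through, which is exactly what motivates dedicated, differentiable mixture-entropy estimators.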
_**This project offers several entropy estimators for such mixture distributions (mainly mixtures of Gaussians and mixtures of uniforms), implemented in both NumPy and PyTorch.**_
Most of the estimators are **differentiable**, making them suitable for optimization purposes.
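The listing does not show the package's API, but as an illustration of what a differentiable mixture-entropy estimator looks like, here is the classic pairwise (Jensen-inequality) lower bound for a Gaussian mixture, $h \ge -\sum_i w_i \log \sum_j w_j \mathcal{N}(\mu_i;\, \mu_j,\, \Sigma_i+\Sigma_j)$, sketched in PyTorch for isotropic components (a hypothetical example, not the package's code):

```python
import torch

def pairwise_mog_entropy_bound(mu, var, w):
    """Differentiable lower bound on the entropy of a Gaussian mixture:
    -sum_i w_i log sum_j w_j N(mu_i; mu_j, (var_i + var_j) * I).
    mu: (k, d) component means, var: (k,) isotropic variances, w: (k,) weights."""
    k, d = mu.shape
    diff = mu[:, None, :] - mu[None, :, :]    # (k, k, d) pairwise mean differences
    sq_dist = diff.pow(2).sum(-1)             # (k, k) squared distances
    pooled = var[:, None] + var[None, :]      # (k, k) pooled variances
    log_kernel = -0.5 * (d * torch.log(2 * torch.pi * pooled) + sq_dist / pooled)
    inner = torch.logsumexp(torch.log(w)[None, :] + log_kernel, dim=1)
    return -(w * inner).sum()

# Gradients flow back to the component parameters, so the bound can serve
# directly as a training objective.
mu = torch.randn(4, 2, requires_grad=True)
var = torch.ones(4)
w = torch.full((4,), 0.25)
h = pairwise_mog_entropy_bound(mu, var, w)
h.backward()
print(h.item(), mu.grad.shape)  # scalar bound, gradient w.r.t. the means
```

Because every operation here is smooth in the component parameters, the estimate can be maximized or minimized by gradient descent, which is the property the README highlights.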
_Please see the GitHub repository, https://github.com/grizxlyzx/MixEntroPY, for more information._
## Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "mixentropy",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": "Grizxly <grizxlyzx@gamil.com>",
    "keywords": "machine learning, deep learning, entropy",
    "author": null,
    "author_email": "Grizxly <grizxlyzx@gamil.com>",
    "download_url": "https://files.pythonhosted.org/packages/66/db/ae6858984c7c87af922b4a16d4bb706d8ff9f8654fa601180ea4c9862d65/mixentropy-0.0.1.tar.gz",
    "platform": null,
    "description": "# MixEntroPY\n\nNumpy & PyTorch vectorized implementation of various differential entropy estimators for ML & DL.\n\n## About The Project\n\nInformation theory-inspired machine learning methods are gaining increasing interest, with estimation of the entropy as well as mutual information of random variables serving as their cornerstone.\n\nTo estimate the Shannon entropy of a _discrete_ random variable, given its probability distribution, one can simply apply the definition of Shannon entropy $H(x)=-\\sum_i p(x_i)\\log p(x_i)$ to obtain an accurate result.\n\nBut when it comes to estimate differential entropy, $h(x)=-\\int p(x)\\log p(x)dx$, from data points sampled from datasets or models, often there is no prior knowledge about the underlying distribution.\n\nIn such cases, we could make an assumption of the unknown distribution and expect the assumed distribution to have a closed-form expression for its entropy calculation, e.g., multivariate Gaussian.\n\n_Kernel Density Estimation(KDE)_ is one of commonly used methods to approximate probability density of a distribution, However, while a single kernel may have a closed-form expression for its entropy, the mixture of these kernels typically does not.\n\n_**This project offers several entropy estimators for these mixture distributions, (mainly focus on mixture of Gaussian and mixture of uniform), implemented with both Numpy and PyTorch.**_\n\nMost of the estimators are **differentiable**, making them suitable for optimization purposes.\n\n_Please see Github https://github.com/grizxlyzx/MixEntroPY for more information._\n",
    "bugtrack_url": null,
    "license": "MIT License Copyright (c) 2024 grizxlyzx Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.",
    "summary": "Tools for estimating entropy of mixture distribution and more.",
    "version": "0.0.1",
    "project_urls": {
        "Repository": "https://github.com/grizxlyzx/MixEntroPY"
    },
    "split_keywords": [
        "machine learning",
        " deep learning",
        " entropy"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "aa08bae7868e5044a2443affde5541b2ee45b49252bf7eb34a9b45d49c2b21dc",
                "md5": "fded2659a38d4f4ae3a3369f32dfbd24",
                "sha256": "c68d590c72e0abdcf5a73e40ec61933caacc20561a02d9d9c26324fc5c479846"
            },
            "downloads": -1,
            "filename": "mixentropy-0.0.1-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "fded2659a38d4f4ae3a3369f32dfbd24",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 20557,
            "upload_time": "2024-05-10T02:07:34",
            "upload_time_iso_8601": "2024-05-10T02:07:34.328041Z",
            "url": "https://files.pythonhosted.org/packages/aa/08/bae7868e5044a2443affde5541b2ee45b49252bf7eb34a9b45d49c2b21dc/mixentropy-0.0.1-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "66dbae6858984c7c87af922b4a16d4bb706d8ff9f8654fa601180ea4c9862d65",
                "md5": "9eddedf2871956d7bd6942be16e0df18",
                "sha256": "3541fd5f8d04eebb5def3f87b6c0d5213fc52327c6c5ddcd9d7f50803842f154"
            },
            "downloads": -1,
            "filename": "mixentropy-0.0.1.tar.gz",
            "has_sig": false,
            "md5_digest": "9eddedf2871956d7bd6942be16e0df18",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 17595,
            "upload_time": "2024-05-10T02:07:36",
            "upload_time_iso_8601": "2024-05-10T02:07:36.450299Z",
            "url": "https://files.pythonhosted.org/packages/66/db/ae6858984c7c87af922b4a16d4bb706d8ff9f8654fa601180ea4c9862d65/mixentropy-0.0.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-05-10 02:07:36",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "grizxlyzx",
    "github_project": "MixEntroPY",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "mixentropy"
}
```