gradientcobra v1.1.4
====================
.. image:: https://raw.githubusercontent.com/hassothea/gradientcobra/main/gradientcobra_logo.svg
:width: 200
:alt: Gradient COBRA Logo
|Python39| |Python310|
Introduction
------------
``Gradientcobra`` is a ``python`` package implementing the Gradient COBRA method of `S. Has (2023) <https://jdssv.org/index.php/jdssv/article/view/70>`__, along with other aggregation and kernel methods.
When the loss function is smooth enough, a gradient descent algorithm can be used to efficiently estimate the bandwidth parameter of the model.
For more information, read the "**Documentation and Examples**" below.
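The idea behind kernel-based consensual aggregation can be sketched in a few lines of ``numpy``. This is an illustrative toy version only, not the package's implementation or API: the function name ``kernel_aggregate`` and the fixed Gaussian kernel are assumptions for demonstration.

```python
import numpy as np

def kernel_aggregate(base_train, y_train, base_query, h=0.5):
    """Toy kernel-based consensual aggregation (illustration only).

    base_train : (n, M) array, M base learners' predictions on training points
    y_train    : (n,) array of training responses
    base_query : (q, M) array, the same learners' predictions at query points
    h          : Gaussian kernel bandwidth

    The prediction at a query point is a weighted mean of the training
    responses, where the weight of training point i grows with the
    agreement between the base predictions at the query point and at x_i.
    """
    # Squared distances between prediction vectors, shape (q, n)
    d2 = ((base_query[:, None, :] - base_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * h ** 2))             # Gaussian kernel weights
    w /= w.sum(axis=1, keepdims=True) + 1e-12    # normalize each row
    return w @ y_train
```

With a small bandwidth, the aggregate trusts only training points where the base learners behave as they do at the query point; with a very large bandwidth, it approaches the plain average of the training responses.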
Installation
------------
In your terminal, run the following command to download and install from PyPI:
``pip install gradientcobra``
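The bandwidth estimation mentioned in the introduction can be illustrated with a toy gradient descent on a smooth leave-one-out loss. Everything below (the loss, the finite-difference gradient, and the names ``loo_loss`` and ``fit_bandwidth``) is a simplified assumption for illustration; the package's actual objective and optimizer may differ.

```python
import numpy as np

def loo_loss(h, base_train, y):
    """Smooth leave-one-out squared error of a Gaussian-kernel aggregate."""
    d2 = ((base_train[:, None, :] - base_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * h ** 2))
    np.fill_diagonal(w, 0.0)                     # leave each point out
    w /= w.sum(axis=1, keepdims=True) + 1e-12
    return float(((w @ y - y) ** 2).mean())

def fit_bandwidth(base_train, y, h0=1.0, lr=0.05, steps=100, eps=1e-4):
    """Pick h by gradient descent on loo_loss (finite-difference gradient)."""
    h, best_h, best_loss = h0, h0, loo_loss(h0, base_train, y)
    for _ in range(steps):
        grad = (loo_loss(h + eps, base_train, y)
                - loo_loss(h - eps, base_train, y)) / (2.0 * eps)
        h = max(h - lr * grad, 1e-3)             # keep the bandwidth positive
        loss = loo_loss(h, base_train, y)
        if loss < best_loss:
            best_h, best_loss = h, loss
    return best_h
```

Because the kernel is smooth in ``h``, the loss is differentiable and plain gradient descent is enough; this is what makes the gradient-based bandwidth selection efficient compared with a grid search.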
Citation
--------
If you find ``gradientcobra`` helpful, please consider citing the following papers:
- S. Has (2023), `Gradient COBRA: A kernel-based consensual aggregation for regression <https://jdssv.org/index.php/jdssv/article/view/70>`__.
- A. Fischer and M. Mougeot (2019), `Aggregation using input-output trade-off <https://www.sciencedirect.com/science/article/pii/S0378375818302349>`__.
- G. Biau, A. Fischer, B. Guedj and J. D. Malley (2016), `COBRA: A combined regression strategy <https://doi.org/10.1016/j.jmva.2015.04.007>`__.
Documentation and Examples
--------------------------
For more information about the library:
- read the `gradientcobra documentation <https://hassothea.github.io/files/CodesPhD/gradientcobra_doc.html>`__.
To read more about the individual aggregation and kernel methods, see:
- `GradientCOBRA documentation <https://hassothea.github.io/files/CodesPhD/gradientcobra.html>`__.
- `MixCOBRARegressor documentation <https://hassothea.github.io/files/CodesPhD/mixcobra.html>`__.
- `Kernel Smoother documentation <https://hassothea.github.io/files/CodesPhD/kernelsmoother.html>`__.
- `Super Learner documentation <https://hassothea.github.io/files/CodesPhD/superlearner.html>`__.
Dependencies
------------
- Python 3.9+
- numpy, scipy, scikit-learn, matplotlib, pandas, seaborn, plotly, tqdm
References
----------
- S. Has (2023). Gradient COBRA: A kernel-based consensual aggregation for regression.
  Journal of Data Science, Statistics, and Visualisation, 3(2).
- A. Fischer and M. Mougeot (2019). Aggregation using input-output trade-off.
  Journal of Statistical Planning and Inference, 200.
- G. Biau, A. Fischer, B. Guedj and J. D. Malley (2016). COBRA: A combined regression strategy.
  Journal of Multivariate Analysis.
- M. Mojirsheibani (1999). Combining Classifiers via Discretization.
  Journal of the American Statistical Association.
- M. J. van der Laan, E. C. Polley, and A. E. Hubbard (2007). Super Learner.
  Statistical Applications in Genetics and Molecular Biology, 6, article 25.
- T. Hastie, R. Tibshirani, J. Friedman (2009). Kernel Smoothing Methods.
  The Elements of Statistical Learning. Springer Series in Statistics. Springer, New York, NY.
.. |Travis Status| image:: https://img.shields.io/travis/hassothea/gradientcobra.svg?branch=master
:target: https://travis-ci.org/hassothea/gradientcobra
.. |Python39| image:: https://img.shields.io/badge/python-3.9-green.svg
:target: https://pypi.python.org/pypi/gradientcobra
.. |Python310| image:: https://img.shields.io/badge/python-3.10-blue.svg
:target: https://pypi.python.org/pypi/gradientcobra
.. |Coverage Status| image:: https://img.shields.io/codecov/c/github/hassothea/gradientcobra.svg
:target: https://codecov.io/gh/hassothea/gradientcobra