humpday

Name: humpday
Version: 0.7.1
Home page: https://github.com/microprediction/humpday
Summary: Taking the pain out of choosing a Python global optimizer
Upload time: 2023-03-05 02:37:28
Author: microprediction
License: MIT

            # humpday derivative-free optimizers ([docs](https://microprediction.github.io/humpday/) and  [Elo ratings](https://microprediction.github.io/optimizer-elo-ratings/html_leaderboards/overall.html)) ![tests](https://github.com/microprediction/humpday/workflows/tests/badge.svg) ![nlopt](https://github.com/microprediction/humpday/workflows/test-nlopt/badge.svg) ![ax-platform](https://github.com/microprediction/humpday/workflows/test-ax/badge.svg) ![py-bobyqa](https://github.com/microprediction/humpday/workflows/test-bobyqa/badge.svg) ![dlib](https://github.com/microprediction/humpday/workflows/test-dlib/badge.svg) ![hyperopt](https://github.com/microprediction/humpday/workflows/test-hyperopt/badge.svg) ![pySOT](https://github.com/microprediction/humpday/workflows/test-pySOT/badge.svg) ![skopt](https://github.com/microprediction/humpday/workflows/test-skopt/badge.svg)![hebo](https://github.com/microprediction/humpday/workflows/test-hebo/badge.svg) ![nevergrad](https://github.com/microprediction/humpday/workflows/test-nevergrad/badge.svg) ![nevergrad (GitHub)](https://github.com/microprediction/humpday/workflows/test-nevergrad-github/badge.svg) ![optuna](https://github.com/microprediction/humpday/workflows/test-optuna/badge.svg) ![bayesopt](https://github.com/microprediction/humpday/workflows/test-bayesopt/badge.svg) ![platypus](https://github.com/microprediction/humpday/workflows/test-platypus/badge.svg) ![pymoo](https://github.com/microprediction/humpday/workflows/test-pymoo/badge.svg) ![ultraopt](https://github.com/microprediction/humpday/workflows/test-ultraopt/badge.svg) ![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)

## Derivative-free optimizers from many packages in a common syntax, with evaluation

1. There's a [colab notebook](https://github.com/microprediction/humpday/blob/main/black_box_optimization_package_recommender.ipynb) that recommends a black-box derivative-free optimizer for your objective function. 

2. About fifty strategies drawn from various open-source packages are assigned [Elo ratings](https://microprediction.github.io/optimizer-elo-ratings/html_leaderboards/overall.html) depending on the dimension of the problem and the number of function evaluations allowed. 
 
Hello and welcome to HumpDay, a package that helps you choose a Python global optimizer package, and strategy therein, from [Ax-Platform](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/axcube.py), [bayesian-optimization](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/bayesoptcube.py), [DLib](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/dlibcube.py), [HyperOpt](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/hyperoptcube.py), [NeverGrad](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/nevergradcube.py), [Optuna](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/optunacube.py), [Platypus](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/platypuscube.py), [PyMoo](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/pymoocube.py), [PySOT](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/pysotcube.py), Scipy [classic](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/scipycube.py) and [shgo](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/shgocube.py), [Skopt](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/skoptcube.py),
[nlopt](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/nloptcube.py), [Py-BOBYQA](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/bobyqacube.py), 
[UltraOpt](https://github.com/microprediction/humpday/blob/main/humpday/optimizers/ultraoptcube.py) and maybe others by the time you read this. It also presents *some* of their functionality in a common calling syntax.  
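
For orientation, here is a minimal sketch of that common convention. It assumes, as described in the humpday docs, that an objective function takes a vector *u* with entries in the unit cube [0,1]^n_dim and returns a float to be minimized; the commented lines show a hypothetical call pattern only, since exact optimizer names and signatures vary by version and live in [optimizers](https://github.com/microprediction/humpday/tree/main/humpday/optimizers).

        import math

        # humpday convention: u is a list/array with entries in [0, 1] and the
        # returned float is minimized over the n_dim-dimensional unit cube.
        def pre_objective(u):
            return (u[0] - 0.25) ** 2 + 0.1 * math.sin(10 * u[1])

        # Hypothetical usage sketch (names and signatures may differ by version):
        # from humpday import OPTIMIZERS
        # for optimizer in OPTIMIZERS:
        #     best_val, best_x = optimizer(pre_objective, n_trials=50, n_dim=2)[:2]
        #     print(optimizer.__name__, best_val, best_x)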
 
### Cite or be cited
Pull requests at [CITE.md](https://github.com/microprediction/humpday/blob/main/CITE.md) are welcome. If your package is benchmarked here, I'd like to get this bit right.  
 
### Install

See [INSTALL.md](https://github.com/microprediction/humpday/blob/main/INSTALL.md)

Short version:

    pip install humpday
    pip install humpday[full]

## Recommendations

Pass the dimension of the problem, the function-evaluation budget and
 the time budget to receive [suggestions](https://github.com/microprediction/humpday/blob/main/humpday/comparison/suggestions.py) that do not depend on your particular objective function:
 
        from pprint import pprint
        from humpday import suggest
        pprint(suggest(n_dim=5, n_trials=130, n_seconds=5*60))
        
where *n_seconds* is the total computation budget for the optimizer (not the objective function) over all 130 function evaluations. Or simply pass your objective function, and it will time it and do something sensible:
     
        import math
        import time

        from humpday import recommend

        def my_objective(u):
            time.sleep(0.01)
            return u[0]*math.sin(u[1])

        recommendations = recommend(my_objective, n_dim=21, n_trials=130)
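
The return format of *recommendations* isn't spelled out above, so the safe first step is just to inspect it; the commented line sketches how you might run one entry under the purely illustrative assumption that each recommendation is, or contains, an optimizer callable following the (objective, n_trials, n_dim) convention.

        from pprint import pprint

        # Inspect whatever recommend() returned; its structure is version-dependent.
        pprint(recommendations)

        # Illustrative only, under the assumption stated above:
        # best_val, best_x = recommendations[0](my_objective, n_trials=130, n_dim=21)[:2]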

## Points race
        
If you have more time, call [points_race](https://github.com/microprediction/humpday/blob/main/humpday/comparison/odious.py) on a list of your own objective functions:

        from humpday import points_race
        points_race(objectives=[my_objective]*2, n_dim=5, n_trials=100)
        
See the [colab notebook](https://github.com/microprediction/humpday/blob/main/black_box_optimization_package_recommender.ipynb).

## How it works 

In the background, 50+ strategies are assigned [Elo ratings](https://github.com/microprediction/optimizer-elo-ratings/tree/main/results/leaderboards) by sister repo [optimizer-elo-ratings](https://github.com/microprediction/optimizer-elo-ratings). Oh I said that already. Never mind. 
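
The rating machinery lives in the sister repo, but for intuition here is the textbook Elo update (a sketch only, not the repo's actual code): after a head-to-head trial, the winner takes points from the loser in proportion to how surprising the result was.

        def elo_update(r_winner, r_loser, k=25):
            # Expected score of the winner under the standard Elo logistic model.
            expected_win = 1.0 / (1.0 + 10 ** ((r_loser - r_winner) / 400.0))
            delta = k * (1.0 - expected_win)
            return r_winner + delta, r_loser - delta

        # An upset (lower-rated optimizer wins) moves more points:
        print(elo_update(1500, 1600))   # roughly (1516.0, 1584.0)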

## Contribute

By all means contribute more optimizers to the [optimizers](https://github.com/microprediction/humpday/tree/main/humpday/optimizers) directory. 


![](https://i.imgur.com/FCiSrMQ.png)
 

    
## Articles 

- (most recent) [HumpDay: A Package to Take the Pain Out of Choosing a Python Optimizer](https://www.microprediction.com/blog/humpday). 
- [Comparing Python Global Optimizers](https://www.microprediction.com/blog/optimize).


            
