nevergrad

Name: nevergrad
Version: 1.0.1
Home page: https://github.com/facebookresearch/nevergrad
Summary: A Python toolbox for performing gradient-free optimization
Upload time: 2023-11-19 06:44:49
Author: Facebook AI Research
Requires Python: >=3.6
License: MIT
Requirements: none recorded

            [![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)

# Nevergrad - A gradient-free optimization platform

![Nevergrad](https://raw.githubusercontent.com/facebookresearch/nevergrad/1.0.1/docs/resources/Nevergrad-LogoMark.png)


`nevergrad` is a Python 3.8+ library. It can be installed with:

```
pip install nevergrad
```

More installation options, including Windows installation, as well as complete instructions, are available in the "Getting started" section of the [**documentation**](https://facebookresearch.github.io/nevergrad/).

You can join the Nevergrad users Facebook group [here](https://www.facebook.com/groups/nevergradusers/).

Minimizing a function using an optimizer (here `NGOpt`) is straightforward:

```python
import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # recommended value
# >>> [0.49971112 0.5002944]
```
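Conceptually, `minimize` alternates between asking the optimizer for a candidate and telling it the resulting loss. The same optimization can be written explicitly through nevergrad's ask-and-tell interface; here is a minimal sketch of that loop (printed values will vary from run to run):

```python
import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()        # draw a candidate point
    loss = square(*candidate.args)     # evaluate the objective on it
    optimizer.tell(candidate, loss)    # report the loss back
print(optimizer.provide_recommendation().value)
```

This explicit form is convenient when the evaluation loop is driven by external code, e.g. a training job that reports losses as they become available.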

`nevergrad` also supports bounded continuous variables, discrete variables, and mixtures of the two.
To do this, one can specify the input space:

```python
import nevergrad as ng

def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
    return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == "conv" else 10)

# the Instrumentation class is used for functions with multiple inputs
# (positional and/or keyword arguments)
parametrization = ng.p.Instrumentation(
    # a log-distributed scalar between 0.001 and 1.0
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # an integer from 1 to 12
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # either "conv" or "fc"
    architecture=ng.p.Choice(["conv", "fc"])
)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)

# show the recommended keyword arguments of the function
print(recommendation.kwargs)
# >>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}
```
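If the function to optimize is expensive, evaluations can run concurrently: `minimize` accepts an `executor`, and `num_workers` tells the optimizer how many candidates may be in flight at once. A minimal sketch reusing `fake_training` and the parametrization above, with a standard-library thread pool:

```python
from concurrent import futures

# num_workers is how many evaluations may run concurrently
optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100, num_workers=4)
with futures.ThreadPoolExecutor(max_workers=optimizer.num_workers) as executor:
    recommendation = optimizer.minimize(fake_training, executor=executor)
```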

Learn more about parametrization in the [**documentation**](https://facebookresearch.github.io/nevergrad/)!
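Beyond scalars and choices, whole arrays can be parametrized and bounded as well. A small sketch (the shape and bound values here are illustrative, not canonical):

```python
import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

# a 2-dimensional array, each coordinate bounded in [-1, 1]
param = ng.p.Array(shape=(2,)).set_bounds(lower=-1.0, upper=1.0)
optimizer = ng.optimizers.NGOpt(parametrization=param, budget=100)
print(optimizer.minimize(square).value)
```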

![Example of optimization](https://raw.githubusercontent.com/facebookresearch/nevergrad/1.0.1/docs/resources/TwoPointsDE.gif)

*Convergence of a population of points to the minimum with two-points DE.*


## Documentation

Check out our [**documentation**](https://facebookresearch.github.io/nevergrad/)! It's still a work in progress; don't hesitate to submit issues and/or PRs to update it and make it clearer!
The latest versions of our [**data**](https://drive.google.com/file/d/1p8d1bMCDlvWrDIMXP7fT9pJa1cgjH3NM/view?usp=sharing) and of our [**PDF report**](https://tinyurl.com/dagstuhloid) are also available.

## Citing

```bibtex
@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}
```

## License

`nevergrad` is released under the MIT license. See [LICENSE](https://github.com/facebookresearch/nevergrad/blob/1.0.1/LICENSE) for additional details.
See also our [Terms of Use](https://opensource.facebook.com/legal/terms) and [Privacy Policy](https://opensource.facebook.com/legal/privacy).

            
