# Neptune + Optuna integration
Neptune is a lightweight experiment tracker that offers a single place to track, compare, store, and collaborate on experiments and models.
This integration lets you use it as an Optuna visualization dashboard to log and monitor hyperparameter sweeps live.
## What will you get with this integration?
* Log and monitor the Optuna hyperparameter sweep live:
  * values and params for each trial
  * best values and params for the study
  * hardware consumption and console logs
  * interactive plots from the `optuna.visualization` module
  * parameter distributions for each trial
  * the Study object itself (for `InMemoryStorage`) or the database location (for studies with database storage)
* Load the Study directly from an existing Neptune run
![image](https://docs.neptune.ai/img/app/integrations/optuna.png)
## Resources
* [Documentation](https://docs.neptune.ai/integrations/optuna)
* [Code example on GitHub](https://github.com/neptune-ai/examples/blob/main/integrations-and-supported-tools/optuna/scripts)
* [Run logged in the Neptune app](https://app.neptune.ai/o/common/org/optuna-integration/runs/details?viewId=b6190a29-91be-4e64-880a-8f6085a6bb78&detailsTab=dashboard&dashboardId=Vizualizations-5ea92658-6a56-4656-b225-e81c6fbfc8ab&shortId=NEP1-18517&type=run)
* [Run example in Google Colab](https://colab.research.google.com/github/neptune-ai/examples/blob/master/integrations-and-supported-tools/optuna/notebooks/Neptune_Optuna_integration.ipynb)
## Example
On the command line:
```sh
pip install neptune-optuna
```
In Python:
```python
import optuna

import neptune
import neptune.integrations.optuna as npt_utils

# Start a run
run = neptune.init_run(
    api_token=neptune.ANONYMOUS_API_TOKEN,  # replace with your own token
    project="common/optuna-integration",  # replace with your own project
)

# Create a NeptuneCallback instance
neptune_callback = npt_utils.NeptuneCallback(run)

# Pass the callback to study.optimize()
# (objective is your Optuna objective function)
study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100, callbacks=[neptune_callback])

# Watch the optimization live in Neptune
```
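The snippet above assumes an `objective` function is already defined. As a minimal sketch, it could be the quadratic toy problem from the Optuna tutorials; the parameter name and search space below are illustrative, not part of the integration:

```python
# Illustrative objective for study.optimize() above.
# Optuna passes a Trial object; parameters sampled via suggest_* calls
# are recorded on the trial (and logged to Neptune by the callback).
def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    # With direction="maximize", the best trials approach x = 2.
    return -((x - 2) ** 2)
```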
## Support
If you get stuck or simply want to talk to us, here are your options:

* Check our [FAQ page](https://docs.neptune.ai/getting_help).
* Submit bug reports, feature requests, or contributions directly to the repository.
* Chat! In the Neptune app, click the blue message icon in the bottom-right corner and send a message. A real person will talk to you ASAP (typically very ASAP).
* Shoot us an email at support@neptune.ai.