| Field | Value |
| --- | --- |
| Name | bs-scheduler |
| Version | 0.5.1 |
| home_page | None |
| Summary | A PyTorch Dataloader compatible batch size scheduler library. |
| upload_time | 2024-09-16 09:03:39 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | BSD 3-Clause License, Copyright (c) 2023, ancestor-mithril (full text reproduced in the raw data below) |
| keywords | deep learning, pytorch, batch size scheduler, dynamic batch size |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# bs-scheduler
A Batch Size Scheduler library compatible with PyTorch DataLoaders.
***
## Documentation
* [API Reference](https://ancestor-mithril.github.io/bs-scheduler).
* [Examples](https://ancestor-mithril.github.io/bs-scheduler/tutorials).
<!--For Release Notes, see TODO. -->
***
## Why use a Batch Size Scheduler?
* Using a large batch size has several advantages:
* Better hardware utilization.
* Enhanced parallelism.
* Faster training.
* However, using a large batch size from the start of training may lead to a generalization gap.
* Therefore, one solution is to increase the batch size gradually during training, analogous to a learning rate decay policy (see the note after this list).
* See [Don't Decay the Learning Rate, Increase the Batch Size](https://arxiv.org/abs/1711.00489).
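
For intuition, the paper linked above models the scale of SGD's gradient noise as roughly

$$ g \approx \epsilon \frac{N}{B}, $$

where $\epsilon$ is the learning rate, $N$ the training set size, and $B$ the batch size. Under this model, multiplying $B$ by a factor $k$ shrinks the noise scale as much as dividing $\epsilon$ by $k$, which is why growing the batch size can stand in for decaying the learning rate. (This is a simplified restatement of the cited paper's argument, not notation or code from this library.)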
## Available Schedulers
### Batch Size Schedulers
1. `LambdaBS` - sets the batch size to the base batch size times a given lambda.
2. `MultiplicativeBS` - sets the batch size to the current batch size times a given lambda.
3. `StepBS` - multiplies the batch size by a given factor once every given number of steps (demonstrated in the sketch after this list).
4. `MultiStepBS` - multiplies the batch size by a given factor each time a milestone is reached.
5. `ConstantBS` - multiplies the batch size by a given factor once and decreases it again to its base value after a
given number of steps.
6. `LinearBS` - increases the batch size by a linearly changing multiplicative factor for a given number of steps.
7. `ExponentialBS` - increases the batch size by a given $\gamma$ each step.
8. `PolynomialBS` - increases the batch size using a polynomial function in a given number of steps.
9. `CosineAnnealingBS` - increases the batch size to a maximum batch size and decreases it again following a cyclic
cosine curve.
10. `IncreaseBSOnPlateau` - increases the batch size each time a given metric has stopped improving for a given number
of steps.
11. `CyclicBS` - cycles the batch size between two boundaries with a constant frequency, while also scaling the
distance between boundaries.
12. `CosineAnnealingBSWithWarmRestarts` - increases the batch size to a maximum batch size following a cosine curve,
then restarts while also scaling the number of iterations until the next restart.
13. `OneCycleBS` - decreases the batch size to a minimum batch size then increases it to a given maximum batch size,
following a linear or cosine annealing strategy.
14. `SequentialBS` - calls a list of schedulers sequentially, switching between them at given milestone points.
15. `ChainedBSScheduler` - chains a list of batch size schedulers and calls them together each step.
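
For a concrete picture of how these schedulers are driven, here is a minimal sketch using `StepBS` in an ordinary training loop. The constructor arguments (`step_size`, `gamma`) and the per-epoch `step()` call are assumptions modeled on the conventions of PyTorch's `torch.optim.lr_scheduler.StepLR`, which the scheduler names above mirror; consult the API Reference for the exact interface.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

from bs_scheduler import StepBS  # assumed import path

# Toy dataset: 1000 samples of 8 features with binary labels.
dataset = TensorDataset(torch.randn(1000, 8), torch.randint(0, 2, (1000,)))
dataloader = DataLoader(dataset, batch_size=16)

# Assumed to mirror StepLR: multiply the batch size by `gamma`
# every `step_size` scheduler steps.
scheduler = StepBS(dataloader, step_size=10, gamma=2.0)

for epoch in range(30):
    for inputs, targets in dataloader:
        ...  # forward/backward pass as usual
    scheduler.step()  # assumed: advance the schedule once per epoch
```

Under these assumed semantics, the batch size would grow 16 → 32 → 64 over the 30 epochs while the training loop itself stays unchanged.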
<!--
## Quick Start
TODO.
-->
## Installation
Please install [PyTorch](https://github.com/pytorch/pytorch) before installing this package.
```
pip install bs-scheduler
```
## Licensing
The library is licensed under the [BSD-3-Clause license](LICENSE).
## Citation
To be added...
<!--Citation: TODO. -->
## Raw data

```json
{
"_id": null,
"home_page": null,
"name": "bs-scheduler",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": "George Stoica <george.stoica@senticlab.com>",
"keywords": "deep learning, pytorch, batch size scheduler, dynamic batch size",
"author": null,
"author_email": "George Stoica <george.stoica@senticlab.com>",
"download_url": "https://files.pythonhosted.org/packages/62/38/7fceae7576ae6f651add3781e040e118ce569c6f3d9442872530d07fcda1/bs_scheduler-0.5.1.tar.gz",
"platform": null,
"description": "# bs-scheduler\n\nA Batch Size Scheduler library compatible with PyTorch DataLoaders.\n\n*** \n\n## Documentation\n\n* [API Reference](https://ancestor-mithril.github.io/bs-scheduler).\n\n* [Examples](https://ancestor-mithril.github.io/bs-scheduler/tutorials).\n\n<!--For Release Notes, see TODO. -->\n\n***\n\n## Why use a Batch Size Scheduler?\n\n* Using a big batch size has several advantages:\n * Better hardware utilization.\n * Enhanced parallelism.\n * Faster training.\n* However, using a big batch size from the start may lead to a generalization gap.\n* Therefore, the solution is to gradually increase the batch size, similar to a learning rate decay policy.\n* See [Don't Decay the Learning Rate, Increase the Batch Size](https://arxiv.org/abs/1711.00489).\n\n\n## Available Schedulers\n\n### Batch Size Schedulers\n\n1. `LambdaBS` - sets the batch size to the base batch size times a given lambda.\n2. `MultiplicativeBS` - sets the batch size to the current batch size times a given lambda.\n3. `StepBS` - multiplies the batch size with a given factor at a given number of steps.\n4. `MultiStepBS` - multiplies the batch size with a given factor each time a milestone is reached.\n5. `ConstantBS` - multiplies the batch size by a given factor once and decreases it again to its base value after a\n given number of steps.\n6. `LinearBS` - increases the batch size by a linearly changing multiplicative factor for a given number of steps.\n7. `ExponentialBS` - increases the batch size by a given $\\gamma$ each step.\n8. `PolynomialBS` - increases the batch size using a polynomial function in a given number of steps.\n9. `CosineAnnealingBS` - increases the batch size to a maximum batch size and decreases it again following a cyclic\n cosine curve.\n10. `IncreaseBSOnPlateau` - increases the batch size each time a given metric has stopped improving for a given number\n of steps.\n11. `CyclicBS` - cycles the batch size between two boundaries with a constant frequency, while also scaling the\n distance between boundaries.\n12. `CosineAnnealingBSWithWarmRestarts` - increases the batch size to a maximum batch size following a cosine curve,\n then restarts while also scaling the number of iterations until the next restart.\n13. `OneCycleBS` - decreases the batch size to a minimum batch size then increases it to a given maximum batch size,\n following a linear or cosine annealing strategy.\n14. `SequentialBS` - calls a list of schedulers sequentially given a list of milestone points which reflect which\n scheduler should be called when.\n15. `ChainedBSScheduler` - chains a list of batch size schedulers and calls them together each step.\n\n<!--\n\n## Quick Start\n\nTODO.\n\n-->\n\n## Installation\n\nPlease install [PyTorch](https://github.com/pytorch/pytorch) first before installing this repository.\n\n```\npip install bs-scheduler\n```\n\n## Licensing\n\nThe library is licensed under the [BSD-3-Clause license](LICENSE).\n\n## Citation\n\nTo be added...\n\n<!--Citation: TODO. -->\n",
"bugtrack_url": null,
"license": "BSD 3-Clause License Copyright (c) 2023, ancestor-mithril Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ",
"summary": "A PyTorch Dataloader compatible batch size scheduler library.",
"version": "0.5.1",
"project_urls": {
"Issues": "https://github.com/ancestor-mithril/bs-scheduler/issues",
"Repository": "https://github.com/ancestor-mithril/bs-scheduler"
},
"split_keywords": [
"deep learning",
" pytorch",
" batch size scheduler",
" dynamic batch size"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "ae2bdb0056e9786caa9e92d190dd116308bec616cc59d0c2fd029c7dfc2665a1",
"md5": "ea62efec83e4302004779fc49b97e823",
"sha256": "709cb74bf04ee7a2928951a5f241b40cd8ef24947e625fa0cfe2fd8d187464fb"
},
"downloads": -1,
"filename": "bs_scheduler-0.5.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "ea62efec83e4302004779fc49b97e823",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 17914,
"upload_time": "2024-09-16T09:03:38",
"upload_time_iso_8601": "2024-09-16T09:03:38.022194Z",
"url": "https://files.pythonhosted.org/packages/ae/2b/db0056e9786caa9e92d190dd116308bec616cc59d0c2fd029c7dfc2665a1/bs_scheduler-0.5.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "62387fceae7576ae6f651add3781e040e118ce569c6f3d9442872530d07fcda1",
"md5": "b58261a30550179e77ada0122371b5e0",
"sha256": "42c7ac2413108f3e79375effefa740c078caf642ad1fd42cb2170ca89451618f"
},
"downloads": -1,
"filename": "bs_scheduler-0.5.1.tar.gz",
"has_sig": false,
"md5_digest": "b58261a30550179e77ada0122371b5e0",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 26137,
"upload_time": "2024-09-16T09:03:39",
"upload_time_iso_8601": "2024-09-16T09:03:39.060159Z",
"url": "https://files.pythonhosted.org/packages/62/38/7fceae7576ae6f651add3781e040e118ce569c6f3d9442872530d07fcda1/bs_scheduler-0.5.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-09-16 09:03:39",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "ancestor-mithril",
"github_project": "bs-scheduler",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "bs-scheduler"
}
```