finetuning-scheduler

- Name: finetuning-scheduler
- Version: 2.5.0
- Summary: A PyTorch Lightning extension that enhances model experimentation with flexible fine-tuning schedules.
- Home page: https://github.com/speediedan/finetuning-scheduler
- Author: Dan Dale
- Requires Python: >=3.9
- License: Apache-2.0
- Keywords: deep learning, pytorch, AI, machine learning, pytorch-lightning, lightning, fine-tuning, finetuning
- Upload time: 2024-12-20 19:13:50

            <div align="center">

<img src="https://github.com/speediedan/finetuning-scheduler/raw/v2.5.0/docs/source/_static/images/logos/logo_fts.png" width="401px">

**A PyTorch Lightning extension that enhances model experimentation with flexible fine-tuning schedules.**

______________________________________________________________________

<p align="center">
  <a href="https://finetuning-scheduler.readthedocs.io/en/stable/">Docs</a> •
  <a href="#Setup">Setup</a> •
  <a href="#examples">Examples</a> •
  <a href="#community">Community</a>
</p>

[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/finetuning-scheduler)](https://pypi.org/project/finetuning-scheduler/)
[![PyPI Status](https://badge.fury.io/py/finetuning-scheduler.svg)](https://badge.fury.io/py/finetuning-scheduler)\
[![codecov](https://codecov.io/gh/speediedan/finetuning-scheduler/release/2.5.0/graph/badge.svg?flag=gpu)](https://codecov.io/gh/speediedan/finetuning-scheduler)
[![ReadTheDocs](https://readthedocs.org/projects/finetuning-scheduler/badge/?version=latest)](https://finetuning-scheduler.readthedocs.io/en/stable/)
[![DOI](https://zenodo.org/badge/455666112.svg)](https://zenodo.org/badge/latestdoi/455666112)
[![license](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://github.com/speediedan/finetuning-scheduler/blob/master/LICENSE)

</div>

______________________________________________________________________

<img width="300px" src="https://github.com/speediedan/finetuning-scheduler/raw/v2.5.0/docs/source/_static/images/fts/fts_explicit_loss_anim.gif" alt="FinetuningScheduler explicit loss animation" align="right"/>

[FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) is simple to use yet powerful, offering a number of features that facilitate model research and exploration:

- easy specification of flexible fine-tuning schedules with explicit or regex-based parameter selection (see the sketch below)
  - implicit schedules for initial/naive model exploration
  - explicit schedules for performance tuning, fine-grained behavioral experimentation and computational efficiency
- automatic restoration of the best per-phase checkpoints, driven by iterative application of early-stopping criteria to each fine-tuning phase
- composition of early-stopping and manually-set, epoch-driven fine-tuning phase transitions
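
For example, a two-phase explicit schedule can be passed to the callback's `ft_schedule` parameter, which per the API docs accepts either a path to a YAML schedule file or an equivalently structured dict. A minimal sketch, assuming a hypothetical model whose classifier head is thawed first, followed by its final encoder layer (the parameter names here are placeholders):

```python
import lightning as L
from finetuning_scheduler import FinetuningScheduler

# Hypothetical two-phase schedule: phase 0 trains only the classifier head;
# phase 1 additionally thaws the final encoder layer (regex-style selection)
# and forces a phase transition by epoch 4 if early stopping hasn't already.
ft_schedule = {
    0: {"params": ["model.classifier.weight", "model.classifier.bias"]},
    1: {"params": ["model.encoder.layer.11.*"], "max_transition_epoch": 4},
}

trainer = L.Trainer(callbacks=[FinetuningScheduler(ft_schedule=ft_schedule)])
```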

______________________________________________________________________

## Setup

### Step 0: Install from PyPI

```bash
pip install finetuning-scheduler
```


### Step 1: Import the FinetuningScheduler callback and start fine-tuning!

```python
import lightning as L
from finetuning_scheduler import FinetuningScheduler

trainer = L.Trainer(callbacks=[FinetuningScheduler()])
```
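
`FinetuningScheduler` works with schedule-aware variants of Lightning's early-stopping and checkpointing callbacks (`FTSEarlyStopping` and `FTSCheckpoint`); adding them to the `Trainer` yourself lets you customize their behavior. A minimal sketch (the monitored metric name is a placeholder for whatever your `LightningModule` logs):

```python
import lightning as L
from finetuning_scheduler import FinetuningScheduler, FTSCheckpoint, FTSEarlyStopping

trainer = L.Trainer(
    callbacks=[
        FinetuningScheduler(),  # drives the fine-tuning phase schedule
        FTSEarlyStopping(monitor="val_loss", patience=2),  # per-phase early stopping
        FTSCheckpoint(monitor="val_loss", save_top_k=1),  # tracks best per-phase checkpoints
    ]
)
```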

Get started by following [the Fine-Tuning Scheduler introduction](https://finetuning-scheduler.readthedocs.io/en/stable/index.html), which includes a [CLI-based example](https://finetuning-scheduler.readthedocs.io/en/stable/index.html#example-scheduled-fine-tuning-for-superglue), or by following the [notebook-based Fine-Tuning Scheduler tutorial](https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/finetuning-scheduler.html).

______________________________________________________________________

### Installation Using the Standalone `pytorch-lightning` Package

*applicable to versions >= `2.0.0`*

Because the core Lightning package is now `lightning` rather than `pytorch-lightning`, Fine-Tuning Scheduler (FTS) depends upon the `lightning` package by default. If you would like to use FTS with the standalone `pytorch-lightning` package instead, you can do so as follows:

Install a given FTS release (for example v2.0.0) using standalone `pytorch-lightning`:

```bash
# choose the desired FTS release
export FTS_VERSION=2.0.0
# PACKAGE_NAME=pytorch directs the sdist build to depend on the standalone
# pytorch-lightning package instead of lightning
export PACKAGE_NAME=pytorch
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}/finetuning-scheduler-${FTS_VERSION}.tar.gz
pip install finetuning-scheduler-${FTS_VERSION}.tar.gz
```

______________________________________________________________________

## Examples

### Scheduled Fine-Tuning For SuperGLUE

- [Notebook-based Tutorial](https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/finetuning-scheduler.html)
- [CLI-based Tutorial](https://finetuning-scheduler.readthedocs.io/en/stable/#example-scheduled-fine-tuning-for-superglue)
- [FSDP Scheduled Fine-Tuning](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/fsdp_scheduled_fine_tuning.html)
- [LR Scheduler Reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/lr_scheduler_reinitialization.html) (advanced)
- [Optimizer Reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/optimizer_reinitialization.html) (advanced)

______________________________________________________________________

## Continuous Integration

Fine-Tuning Scheduler is rigorously tested across multiple CPUs and GPUs and against major Python and PyTorch versions. Each Fine-Tuning Scheduler minor release (the `major.minor` of `major.minor.patch`) is paired with a corresponding Lightning minor release (e.g. Fine-Tuning Scheduler 2.0 depends upon Lightning 2.0).

To ensure maximum stability, the latest Lightning patch release fully tested with Fine-Tuning Scheduler is set as a maximum dependency in Fine-Tuning Scheduler's `requirements.txt` (e.g. `<= 1.7.1`). If you'd like to test a more recent Lightning patch version than the one currently pinned, it will likely work, but you should install Fine-Tuning Scheduler from source and update `requirements.txt` as desired.

<details>
  <summary>Current build statuses for Fine-Tuning Scheduler </summary>

| System / (PyTorch/Python ver) |                                                                                                        2.2.2/3.9                                                                                                         |                                                                                                              2.5.1/3.9, 2.5.1/3.12                                                                                                               |
| :---------------------------: | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: |
|      Linux \[GPUs\*\*\]       |                                                                                                            -                                                                                                             | [![Build Status](https://dev.azure.com/speediedan/finetuning-scheduler/_apis/build/status/Multi-GPU%20&%20Example%20Tests?branchName=refs%2Ftags%2F2.5.0)](https://dev.azure.com/speediedan/finetuning-scheduler/_build/latest?definitionId=2&branchName=refs%2Ftags%2F2.5.0) |
|     Linux (Ubuntu 22.04)      | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.5.0)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |             [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.5.0)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml)             |
|           OSX (14)            | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.5.0)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |             [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.5.0)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml)             |
|        Windows (2022)         | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.5.0)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |             [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.5.0)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml)             |

- \*\* tests run on one RTX 4090 and one RTX 2070

</details>

## Community

Fine-Tuning Scheduler is developed and maintained by the community in close communication with the [Lightning team](https://pytorch-lightning.readthedocs.io/en/stable/governance.html). Thanks to everyone in the community for their tireless effort building and improving the immensely useful core Lightning project.

PRs are welcome! Please see the [contributing guidelines](https://finetuning-scheduler.readthedocs.io/en/stable/generated/CONTRIBUTING.html) (which are essentially the same as Lightning's).

______________________________________________________________________

## Citing Fine-Tuning Scheduler

Please cite:

```tex
@misc{Dan_Dale_2022_6463952,
    author    = {Dan Dale},
    title     = {{Fine-Tuning Scheduler}},
    month     = feb,
    year      = 2022,
    doi       = {10.5281/zenodo.6463952},
    publisher = {Zenodo},
    url       = {https://zenodo.org/record/6463952}
}
```

Feel free to star the repo as well if you find it useful or interesting. Thanks 😊!

            
