letstune


Name: letstune
Version: 0.3.0
Home page: https://www.letstune.org/
Summary: Hyper-parameter tuning for the masses!
Upload time: 2023-11-25 14:59:01
Author: Michał Słapek
Requires Python: >=3.10,<3.11
License: MIT
Keywords: machine-learning, hyperparameter-tuning, deep-learning
Requirements: No requirements were recorded.
<div align="center">
  <img src="https://raw.githubusercontent.com/mslapek/letstune/main/img/logo.svg"><br>
</div>

-----------------

# letstune

*Hyper-parameter tuning for the masses!*

![License: MIT](https://img.shields.io/badge/license-MIT-purple.svg?style=flat-square)
[![Documentation Status](https://readthedocs.org/projects/letstune/badge/?version=latest&style=flat-square)](https://letstune.readthedocs.io/en/latest/?badge=latest)
[![PyPI wheel](https://img.shields.io/pypi/wheel/letstune?color=orange&label=pip&style=flat-square)](https://pypi.org/project/letstune/)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=flat-square)](https://github.com/psf/black)
[![Imports: isort](https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat-square)](https://pycqa.github.io/isort/)

[![Lint and test workflow](https://github.com/mslapek/letstune/actions/workflows/linttest.yml/badge.svg)](https://github.com/mslapek/letstune/actions/workflows/linttest.yml)

* [Documentation](https://letstune.readthedocs.io/en/latest/)
* [PyPI Package](https://pypi.org/project/letstune/)
* [Examples](examples)

Machine learning algorithms have **many parameters** that the user is expected to
choose - like the number of layers or the learning rate.

Choosing them well requires **a lot of trial and error**.

_letstune_ **automatically tries** various parameter configurations and
gives you back the best model.

## How does it differ from `GridSearchCV`?

_letstune_ gives you **a better model** in **a shorter time** than classical
hyperparameter-tuning algorithms. It works in three steps:

1. Generate random parameters
2. Evaluate each parameter set with **a small time budget**
3. **Drop low performers** automatically; only good performers stay in the pool

The third point is the distinguishing feature of _letstune_ - other algorithms
dutifully keep training weak models, without a good reason.
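
To make the idea concrete, here is a minimal, library-agnostic sketch of such a
successive-halving-style loop (plain Python, **not** the letstune API;
`evaluate(params, budget)` is a placeholder for "train under `params` for
`budget` units and return a validation score"):

```python
import random


def tune_sketch(evaluate, n_candidates=16, rounds=3):
    """Illustrative successive halving: train everyone briefly,
    keep the best half, give the survivors a bigger budget."""
    candidates = [{"lr": 10 ** random.uniform(-4, -1)} for _ in range(n_candidates)]
    budget = 1  # e.g. one epoch or a few seconds of training
    for _ in range(rounds):
        scored = sorted(
            ((evaluate(params, budget), params) for params in candidates),
            key=lambda pair: pair[0],
            reverse=True,
        )
        candidates = [params for _, params in scored[: max(1, len(scored) // 2)]]
        budget *= 2  # survivors get more time in the next round
    return candidates[0]
```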

## Ergonomics

Common tasks in _letstune_ are Python one-liners:

The best model:

```python
model = tuning[0].best_epoch.checkpoint.load_pickle()
```

Pandas summary dataframe with **parameters** and **metric values**:

```python
df = tuning.to_df()
```
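
For instance, you can sort the summary to see the best configurations first
(the `"accuracy"` column name below is an assumption matching the trainer
defined later; inspect `df.columns` for the actual schema):

```python
# "accuracy" is an assumed column name - check df.columns for the real one
print(df.sort_values("accuracy", ascending=False).head())
```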

## Great! How to use it?

Install with **pip**:

```
pip install letstune
```

First, define your **parameters**:

```python
import letstune
from letstune import Params, rand
from sklearn.linear_model import SGDClassifier

class SGDClassifierParams(Params):
    model_cls = SGDClassifier

    average: bool
    l1_ratio: float = rand.uniform(0, 1)
    alpha: float = rand.uniform(1e-2, 1e0, log=True)
```
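
Here `log=True` presumably means log-uniform sampling, i.e. the exponent is
drawn uniformly, so each order of magnitude between `1e-2` and `1e0` is equally
likely. A plain NumPy illustration of that idea (not letstune code):

```python
import numpy as np

rng = np.random.default_rng(42)
# log-uniform draw between 1e-2 and 1e0: sample the base-10 exponent uniformly
alpha = 10 ** rng.uniform(np.log10(1e-2), np.log10(1e0))
```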

Then define a **trainer**.
A **trainer** is an object that knows how to **train** a model!

```python
class DigitsTrainer(letstune.SimpleTrainer):
    params_cls = SGDClassifierParams
    metric = "accuracy"

    def load_dataset(self, dataset):
        self.X_train, self.X_test, self.y_train, self.y_test = dataset

    def train(self, params):
        # params has type SGDClassifierParams

        # letstune provides the create_model method,
        # which returns an SGDClassifier
        model = params.create_model(
            loss="hinge",
            penalty="elasticnet",
            fit_intercept=True,
            random_state=42,
        )
        model.fit(self.X_train, self.y_train)

        accuracy = model.score(self.X_test, self.y_test)

        return model, {"accuracy": accuracy}


trainer = DigitsTrainer()  # new instance!
```

Neural-network and gradient-boosting trainings
can be based on `letstune.EpochTrainer`,
which has a `train_epoch` method.
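
A rough sketch of what such a trainer could look like (illustrative only: apart
from `train_epoch`, the hooks below simply mirror the `SimpleTrainer` example
above and are assumptions - consult the documentation for the exact
`EpochTrainer` interface):

```python
import numpy as np


class DigitsEpochTrainer(letstune.EpochTrainer):
    params_cls = SGDClassifierParams  # reusing the params class defined above
    metric = "accuracy"

    def load_dataset(self, dataset):
        self.X_train, self.X_test, self.y_train, self.y_test = dataset

    def create_model(self, params):
        # assumed hook: build the model once, before the epoch loop starts
        self.model = params.create_model(
            loss="hinge", penalty="elasticnet", random_state=42
        )

    def train_epoch(self, epoch):
        # one pass over the training data per call; return this epoch's metrics
        self.model.partial_fit(
            self.X_train, self.y_train, classes=np.unique(self.y_train)
        )
        return {"accuracy": self.model.score(self.X_test, self.y_test)}
```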

Finally, **let's tune**!

```python
tuning = letstune.tune(
    trainer,
    16,  # number of tested random parameters
    dataset=(X_train, X_test, y_train, y_test),
    results_dir="digits_tuning",
)
```

**Our model** is ready to use:

```python
model = tuning[0].checkpoint.load_pickle()
```
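
The loaded object is a regular fitted scikit-learn estimator, so it's used as
usual - for example, with the test split from the snippet above:

```python
y_pred = model.predict(X_test)
```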

Don't forget to check out the [examples directory](examples)! 👀

The documentation is [here](https://letstune.readthedocs.io/en/latest/)!

## Additionally

Works with your favourite ML library 🐍 - it's **library agnostic**!

**Resumes work** from the point where the program was stopped.

Permissive, **business-friendly** MIT license.

## References

*A System for Massively Parallel Hyperparameter Tuning* by Li et al.;
[arXiv:1810.05934](https://arxiv.org/abs/1810.05934)

The paper gives an overview of various hyperparameter-tuning algorithms;
_letstune_ implements a variant of Successive Halving.

## Contributing

Issues are tracked on [GitHub](https://github.com/mslapek/letstune/issues).

## Changelog

Please see [CHANGELOG.md](CHANGELOG.md).

            

Raw data

            {
    "_id": null,
    "home_page": "https://www.letstune.org/",
    "name": "letstune",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.10,<3.11",
    "maintainer_email": "",
    "keywords": "machine-learning,hyperparameter-tuning,deep-learning",
    "author": "Micha\u0142 S\u0142apek",
    "author_email": "28485371+mslapek@users.noreply.github.com",
    "download_url": "https://files.pythonhosted.org/packages/f4/f7/651eb8f373c059dc80ee3b577f1ac09cacdf6f6d8aefb92ec59830908ea0/letstune-0.3.0.tar.gz",
    "platform": null,
    "description": "<div align=\"center\">\n  <img src=\"https://raw.githubusercontent.com/mslapek/letstune/main/img/logo.svg\"><br>\n</div>\n\n-----------------\n\n# letstune\n\n*Hyper-parameter tuning for the masses!*\n\n![License: MIT](https://img.shields.io/badge/license-MIT-purple.svg?style=flat-square)\n[![Documentation Status](https://readthedocs.org/projects/letstune/badge/?version=latest&style=flat-square)](https://letstune.readthedocs.io/en/latest/?badge=latest)\n[![PyPI wheel](https://img.shields.io/pypi/wheel/letstune?color=orange&label=pip&style=flat-square)](https://pypi.org/project/letstune/)\n[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=flat-square)](https://github.com/psf/black)\n[![Imports: isort](https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat-square)](https://pycqa.github.io/isort/)\n\n[![Lint and test workflow](https://github.com/mslapek/letstune/actions/workflows/linttest.yml/badge.svg)](https://github.com/mslapek/letstune/actions/workflows/linttest.yml)\n\n* [Documentation](https://letstune.readthedocs.io/en/latest/)\n* [PyPI Package](https://pypi.org/project/letstune/)\n* [Examples](examples)\n\nMachine Learning algorithms have **many parameters**, which are expected to be\nchosen by a user - like the number of layers or learning rate.\n\nIt requires **a lot of trial and error**.\n\n_letstune_ **automatically tries** various parameter configurations and\ngives you back the best model.\n\n## How it differs from `GridSearchCV`?\n\n_letstune_ will give you **a better model** in **a shorter time** in comparison\nto the classical hyperparameter tuning algorithms.\n\n1. Generate random parameters\n2. Evaluate each parameter with **a small time budget**\n3. **Drop low-performers** automatically, only good-performers will stay in the pool\n\nThe 3rd point is the distinguishing feature of _letstune_ - other algorithms\ndutifully train weak models - without a good reason.\n\n## Ergonomics\n\nCommon tasks in _letstune_ are realized with Python one-liners:\n\nThe best model:\n\n```python\nmodel = tuning[0].best_epoch.checkpoint.load_pickle()\n```\n\nPandas summary dataframe with **parameters** and **metric values**:\n\n```python\ndf = tuning.to_df()\n```\n\n## Great! 
How to use it?\n\nInstall with **pip**:\n\n```\npip install letstune\n```\n\nFirst, define your **parameters**:\n\n```python\nimport letstune\nfrom letstune import Params, rand\n\nclass SGDClassifierParams(Params):\n    model_cls = SGDClassifier\n\n    average: bool\n    l1_ratio: float = rand.uniform(0, 1)\n    alpha: float = rand.uniform(1e-2, 1e0, log=True)\n```\n\nThen define a **trainer**.\n**Trainer** is an object, which knows how to **train** a model!\n\n```python\nclass DigitsTrainer(letstune.SimpleTrainer):\n    params_cls = SGDClassifierParams\n    metric = \"accuracy\"\n\n    def load_dataset(self, dataset):\n        self.X_train, self.X_test, self.y_train, self.y_test = dataset\n\n    def train(self, params):\n        # params has type SGDClassifierParams\n\n        # letstune provides method create_model\n        # returning SGDClassifier\n        model = params.create_model(\n            loss=\"hinge\",\n            penalty=\"elasticnet\",\n            fit_intercept=True,\n            random_state=42,\n        )\n        model.fit(self.X_train, self.y_train)\n\n        accuracy = model.score(self.X_test, self.y_test)\n\n        return model, {\"accuracy\": accuracy}\n\n\ntrainer = DigitsTrainer()  # new instance!\n```\n\nNeural networks and gradient boosting trainings\ncan be based on `letstune.EpochTrainer`,\nwhich has `train_epoch` method.\n\nFinally, **let's tune**!\n\n```python\ntuning = letstune.tune(\n    trainer,\n    16,  # number of tested random parameters\n    dataset=(X_train, X_test, y_train, y_test),\n    results_dir=\"digits_tuning\",\n)\n```\n\n**Our model** is ready to use:\n\n```python\nmodel = tuning[0].checkpoint.load_pickle()\n```\n\nDon't forget to check out [examples directory](examples)! \ud83d\udc40\n\nDocumentation is [here](https://letstune.readthedocs.io/en/latest/)!\n\n## Additionally\n\nWorks with your favourite ML library \ud83d\udc0d - it's **library agnostic**!\n\n**Resumes work** from the point, where program was stopped.\n\nPermissive **business-friendly** MIT license.\n\n## References\n\n*A System for Massively Parallel Hyperparameter Tuning* by Li et al.;\n[arXiv:1810.05934](https://arxiv.org/abs/1810.05934)\n\nOverview of various hyperparameter-tuning algorithms.\n_letstune_ implements a variant of Successive Halving.\n\n## Contributing\n\nIssues are tracked on [GitHub](https://github.com/mslapek/letstune/issues).\n\n## Changelog\n\nPlease see [CHANGELOG.md](CHANGELOG.md).\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Hyper-parameter tuning for the masses!",
    "version": "0.3.0",
    "project_urls": {
        "Documentation": "https://letstune.readthedocs.io/",
        "Homepage": "https://www.letstune.org/",
        "Repository": "https://github.com/mslapek/letstune"
    },
    "split_keywords": [
        "machine-learning",
        "hyperparameter-tuning",
        "deep-learning"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "84e3e2059bcf766ff7976cdb6a11bf4e2c971014b68c2bc98f5dc63eb0def524",
                "md5": "58f3c8231fb2cb2301158c77f6bb0760",
                "sha256": "e5ac649f867e9d99f3aad97b9ee862f0167f2e76e6fae5bfc745e4ab7bc806b2"
            },
            "downloads": -1,
            "filename": "letstune-0.3.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "58f3c8231fb2cb2301158c77f6bb0760",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10,<3.11",
            "size": 30301,
            "upload_time": "2023-11-25T14:58:59",
            "upload_time_iso_8601": "2023-11-25T14:58:59.232323Z",
            "url": "https://files.pythonhosted.org/packages/84/e3/e2059bcf766ff7976cdb6a11bf4e2c971014b68c2bc98f5dc63eb0def524/letstune-0.3.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "f4f7651eb8f373c059dc80ee3b577f1ac09cacdf6f6d8aefb92ec59830908ea0",
                "md5": "585eebce28c3ed808db2485b16947217",
                "sha256": "f5671f076f3df249dfa9d653443d265ef9a07d6563b84bfc1c0b3c263941d408"
            },
            "downloads": -1,
            "filename": "letstune-0.3.0.tar.gz",
            "has_sig": false,
            "md5_digest": "585eebce28c3ed808db2485b16947217",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10,<3.11",
            "size": 23878,
            "upload_time": "2023-11-25T14:59:01",
            "upload_time_iso_8601": "2023-11-25T14:59:01.970582Z",
            "url": "https://files.pythonhosted.org/packages/f4/f7/651eb8f373c059dc80ee3b577f1ac09cacdf6f6d8aefb92ec59830908ea0/letstune-0.3.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-11-25 14:59:01",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "mslapek",
    "github_project": "letstune",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "letstune"
}
        