# hyperparameter-tuning

- Name: hyperparameter-tuning
- Version: 0.3.1
- Home page: https://github.com/AndreFCruz/hpt
- Summary: A minimal framework for running hyperparameter tuning
- Upload time: 2023-09-05 09:38:53
- Author: AndreFCruz
- Requires Python: >=3.8
- License: MIT
- Keywords: ml, optimization, hyperparameter, tuning, fairness
# hpt

> This repository is under construction :construction:

![badge for tests status](https://github.com/AndreFCruz/hpt/actions/workflows/python-package.yml/badge.svg)
![badge for PyPI publishing status](https://github.com/AndreFCruz/hpt/actions/workflows/python-publish.yml/badge.svg)

A minimal hyperparameter tuning framework to help you train hundreds of models.

It's essentially a set of helpful wrappers over [Optuna](https://optuna.org/).


## Install

Install the package from [PyPI](https://pypi.org/project/hyperparameter-tuning/):


```sh
pip install hyperparameter-tuning
```

## Getting started

```py
from hpt.tuner import ObjectiveFunction, OptunaTuner

# Assumes train/test splits (X_train, y_train, X_test, y_test) and the
# corresponding sensitive-attribute columns (s_train, s_test) are already loaded.
HYPERPARAM_SPACE_PATH = "examples/sklearn.small_hyperparam_space.yaml"

obj_func = ObjectiveFunction(
    X_train, y_train, X_test, y_test,
    hyperparameter_space=HYPERPARAM_SPACE_PATH,    # path to a YAML file (or an equivalent dict)
    eval_metric="accuracy",
    s_train=s_train,
    s_val=s_test,
    threshold=0.50,
)

tuner = OptunaTuner(
    objective_function=obj_func,
    direction="maximize",    # NOTE: can pass other useful study kwargs here (e.g. storage)
)

# Then just run optimize as you would for an optuna.Study object
tuner.optimize(n_trials=20, n_jobs=4)

# Results are stored in tuner.results
tuner.results

# You can reconstruct the best predictor with:
clf = obj_func.reconstruct_model(obj_func.best_trial)
```
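
As noted in the snippet above, `OptunaTuner` accepts additional study kwargs (e.g. `storage`). A minimal sketch of a persistent, resumable study, assuming the standard Optuna `storage` and `study_name` keyword arguments are forwarded to the study as-is (the study name and SQLite path below are illustrative, not part of the API):

```py
tuner = OptunaTuner(
    objective_function=obj_func,
    direction="maximize",
    study_name="hpt-example",              # hypothetical study name
    storage="sqlite:///hpt_example.db",    # standard Optuna storage URL
)
tuner.optimize(n_trials=20, n_jobs=4)
```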

## Defining a hyperparameter space

The hyperparameter space is provided either as a path to a YAML file or as a `dict` 
with the same structure.
Example hyperparameter spaces [here](examples/sklearn.small_hyperparam_space.yaml) and 
[here](examples/sklearn.large_hyperparam_space.yaml).
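
For reference, here is a minimal sketch of the `dict` form, mirroring the YAML schema shown below (passing it to `ObjectiveFunction` in place of the file path is assumed to behave identically):

```py
hyperparam_space = {
    "DT": {
        "classpath": "sklearn.tree.DecisionTreeClassifier",
        "kwargs": {
            # sampled from a log-scaled integer range
            "max_depth": {"type": "int", "range": [10, 100], "log": True},
            # sampled from a fixed set of categories
            "criterion": ["gini", "entropy"],
            # pre-defined value
            "min_samples_split": 4,
        },
    },
}
```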

The YAML file must follow this structure:
```yaml
# One or more top-level algorithms
DT:  
    # Full classpath of algorithm's constructor
    classpath: sklearn.tree.DecisionTreeClassifier
    
    # One or more keyword arguments to be passed to the constructor
    kwargs:
        
        # Kwargs may be sampled from a distribution
        max_depth:
            type: int           # either 'int' or 'float'
            range: [ 10, 100 ]  # minimum and maximum values
            log: True           # (optional) whether to sample on a logarithmic scale
        
        # Kwargs may be sampled from a fixed set of categories
        criterion:
            - 'gini'
            - 'entropy'
        
        # Kwargs may be a pre-defined value
        min_samples_split: 4


# You may explore multiple algorithms at once
LR:
    classpath: sklearn.linear_model.LogisticRegression
    kwargs:
        # An example of a float hyperparameter
        C:
            type: float
            range: [ 0.01, 1.0 ]
            log: True

```
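
Each spec maps naturally onto Optuna's sampling primitives. As a hedged sketch (hpt's internal translation is not shown here, so the exact mapping is an assumption), the `DT` entry above corresponds to roughly:

```py
import optuna

def sample_dt_kwargs(trial: optuna.Trial) -> dict:
    """Illustrative manual equivalent of the DT spec above."""
    return {
        # type: int, range: [10, 100], log: True
        "max_depth": trial.suggest_int("max_depth", 10, 100, log=True),
        # fixed set of categories
        "criterion": trial.suggest_categorical("criterion", ["gini", "entropy"]),
        # pre-defined value, passed through unchanged
        "min_samples_split": 4,
    }
```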

            
