hypered

Name: hypered
Version: 1.0.8
Summary: Simple hyper parameter tuning model.
Home page: https://github.com/vahidk/hypered
Author: Vahid Kazemi
License: MIT
Upload time: 2024-07-13 07:12:48

            <h1 align="center">
<img src="https://raw.githubusercontent.com/vahidk/hypered/main/media/hypered.png" alt="SafeConfig Library" width="256">
</h1><br>


[![PyPI - Downloads](https://img.shields.io/pypi/dm/hypered)](https://pypi.org/project/hypered/)

Hypered provides a flexible interface for optimizing the hyperparameters of any black-box system. It implements Bayesian optimization and supports hyperparameter search spaces built from real, integer, and categorical variables.

## Features

- **Hyperparameter Spaces**: Define real, integer, and categorical variables.
- **Objective Functions**: Easily create minimization and maximization objectives.
- **Experiment Management**: Automatically handles experiment directories and parameter/result files.
- **Web-based Dashboard**: Visualize the experiment results for better insight.

## Installation

To install hypered, simply run:

```bash
pip install hypered
```

## Usage

### Step 1: Model Script

To use hypered, first define a model script that reads the hyperparameters from an input JSON file, computes the loss/objective value, and writes it to an output JSON file. Both the input and output file paths are passed as command-line arguments. The following is an example model script:

**example.py**
```python
import argparse
import json
import sys

import numpy as np


def eval_objective(params: dict) -> dict:
    """Compute the objective value for a given set of hyperparameters."""
    op = params["option"]
    x = params["x"]

    if op == "first":
        loss = np.square(x - 5)
    elif op == "second":
        loss = np.abs(x - 3) - 2
    else:
        sys.exit(f"Invalid option: {op}")

    # Cast to a plain float so the result is JSON serializable.
    return {"loss": float(loss)}


def main():
    parser = argparse.ArgumentParser(description="Simple model.")
    parser.add_argument("params", type=str, help="Input params JSON file.")
    parser.add_argument("results", type=str, help="Output results JSON file.")
    args = parser.parse_args()

    # Read the hyperparameters chosen by the optimizer.
    with open(args.params) as f:
        params = json.load(f)

    # Evaluate the model and write the objective value back for hypered.
    results = eval_objective(params)
    with open(args.results, "w") as f:
        json.dump(results, f)


if __name__ == "__main__":
    main()
```
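
Before wiring the script into hypered, you can sanity-check it on its own. The snippet below is a minimal sketch (the file names `params.json` and `results.json` are arbitrary): it writes a parameter file by hand, runs `example.py` as a subprocess the same way hypered invokes the binary, and prints the reported loss.

```python
import json
import subprocess

# Hand-written hyperparameters, matching the keys example.py expects.
with open("params.json", "w") as f:
    json.dump({"option": "first", "x": 2.0}, f)

# Invoke the model script as hypered would: params path, then results path.
subprocess.run(["python3", "example.py", "params.json", "results.json"], check=True)

# Read back the objective value written by the script.
with open("results.json") as f:
    print(json.load(f))  # {'loss': 9.0}
```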

### Step 2: Configuration File

Next, define a configuration file that specifies the hyperparameters as well as the objective function. Below is an example configuration file:

**example.conf**
```python
optimize(
    name="learning_params",
    objective=minimize("loss"),
    binary="python3 example.py {params_path} {results_path}",
    random_starts=10,
    iterations=30,
    params={
        "option": categorical(["first", "second"]),
        "x": real(-10, 10)
    }
)
```

### Step 3: Running the Hyperparameter Optimizer

To run the hyperparameter optimizer, use the `hypered` script with the path to your configuration file:

```bash
hypered example.conf
```

This will start the optimization process as defined in your configuration file and output the best parameters.

### Step 4: Visualize the Results on Dashboard

Finally, you can visualize the hyperparameter optimization results in the web-based dashboard with the following command:

```bash
hypered-dash
```

## Reference

### Optimizers

#### `optimize`

This function performs hyperparameter optimization using Gaussian Processes.

**Arguments:**
- `name` (str): The name of the parameter group.
- `objective` (function): The objective function to minimize or maximize.
- `binary` (str): The command line binary to execute the experiment.
- `params` (dict): The dictionary of parameters to optimize.
- `random_starts` (int, optional): The number of random initialization points.
- `iterations` (int, optional): The number of iterations to run the optimization.
- `kernel` (str, optional): The type of kernel to use in the Gaussian process model. Defaults to "RBF".
- `kernel_scale` (float, optional): The scale of the kernel. Defaults to 1.0.
- `acquisition_fn` (str, optional): The type of acquisition function to use. Defaults to "UCB".
- `optimizer_restarts` (int, optional): The number of restarts for the optimizer. Defaults to 5.
- `seed` (int, optional): The random seed for reproducibility.
- `cwd` (str, optional): The current working directory for the subprocess.

Note that you can use the predefined variables `{params_path}` and `{results_path}` in your binary string; they are substituted with the paths to the parameters and results JSON files, respectively.
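
For a fuller picture of these arguments, the sketch below extends the earlier configuration with the optional settings. The values shown are only illustrative: the defaults listed above for `kernel`, `kernel_scale`, `acquisition_fn`, and `optimizer_restarts`, plus an arbitrary seed and working directory.

```python
optimize(
    name="learning_params",
    objective=minimize("loss"),
    binary="python3 example.py {params_path} {results_path}",
    params={
        "option": categorical(["first", "second"]),
        "x": real(-10, 10)
    },
    random_starts=10,       # number of random initialization points
    iterations=30,          # number of optimization iterations
    kernel="RBF",           # Gaussian process kernel (default)
    kernel_scale=1.0,       # kernel scale (default)
    acquisition_fn="UCB",   # acquisition function (default)
    optimizer_restarts=5,   # optimizer restarts (default)
    seed=42,                # arbitrary seed for reproducibility
    cwd="."                 # working directory for the subprocess
)
```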

### Objectives

#### `minimize`

Minimize a given variable.

#### `maximize`

Maximize a given variable.
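
Both objectives take the name of a key in the results JSON written by the model script, as in `minimize("loss")` from the configuration above. A minimal sketch; the `accuracy` key is hypothetical, for illustration only:

```python
minimize("loss")       # smaller reported "loss" is better
maximize("accuracy")   # larger reported "accuracy" is better (hypothetical key)
```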

### Variables

#### `real`

Class for defining a real-valued hyperparameter.

#### `integer`

Class for defining an integer-valued hyperparameter.

#### `categorical`

Class for defining a categorical hyperparameter.
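
A `params` dictionary can mix all three variable types. The sketch below assumes `integer` takes lower and upper bounds analogous to `real` (only `real` and `categorical` appear in the example configuration above), and the `batch_size` key is purely illustrative:

```python
params={
    "option": categorical(["first", "second"]),  # pick one of the listed values
    "x": real(-10, 10),                          # real value between -10 and 10
    "batch_size": integer(16, 256)               # assumed signature: integer(low, high)
}
```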

### Utilities

#### `experiment_dir`

Retrieves the experiment directory from the context.

#### `params_path`

Retrieves the parameters path from the context.

#### `results_path`

Retrieves the results path from the context.

#### `device_id`

Returns a device ID in a round-robin fashion.
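
These utilities correspond to the `{...}` placeholders available in the `binary` string. The sketch below is speculative: it assumes `{experiment_dir}` and `{device_id}` can be interpolated the same way as the documented `{params_path}` and `{results_path}`, and the script name and flags are illustrative.

```python
# Assumed placeholder substitution; only {params_path} and {results_path}
# are explicitly documented above.
binary="python3 train.py {params_path} {results_path} --logdir {experiment_dir} --gpu {device_id}"
```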

## License

This library is licensed under the MIT License. See the `LICENSE` file for more details.

## Contributing

Contributions are welcome! Please open an issue or submit a pull request on GitHub.

## Contact

For any questions or issues, please open an issue on the GitHub repository.

            
