- **Name:** neural-pipeline-search
- **Version:** 0.12.1
- **Homepage:** https://github.com/automl/neps
- **Summary:** Neural Pipeline Search helps deep learning experts find the best neural pipeline.
- **Author:** Danny Stoll
- **Requires Python:** >=3.8, <3.12
- **License:** Apache-2.0
- **Keywords:** neural pipeline search, neural architecture search, hyperparameter optimization, automl
- **Uploaded:** 2024-07-03 05:40:55

# Neural Pipeline Search (NePS)

[![PyPI version](https://img.shields.io/pypi/v/neural-pipeline-search?color=informational)](https://pypi.org/project/neural-pipeline-search/)
[![Python versions](https://img.shields.io/pypi/pyversions/neural-pipeline-search)](https://pypi.org/project/neural-pipeline-search/)
[![License](https://img.shields.io/pypi/l/neural-pipeline-search?color=informational)](LICENSE)
[![Tests](https://github.com/automl/neps/actions/workflows/tests.yaml/badge.svg)](https://github.com/automl/neps/actions)

Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS), with one primary goal: enabling HPO and NAS for deep learners!

NePS houses both recently published and well-established algorithms that can all be run massively in parallel on distributed setups, with tools for analyzing runs, restarting runs, and more, all tailored to the needs of deep learning experts.

Take a look at our [documentation](https://automl.github.io/neps/latest/) for all the details on how to use NePS!

## Key Features

In addition to the features offered by traditional HPO and NAS libraries, NePS stands out with:

1. [**Hyperparameter Optimization (HPO) With Prior Knowledge:**](neps_examples/template/priorband_template.py)

   - NePS excels at efficiently tuning hyperparameters using algorithms that let users exploit their prior knowledge about the search space (see the sketch after this list). It leverages the insights presented in:
     - [PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning](https://arxiv.org/abs/2306.12370)
     - [πBO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization](https://arxiv.org/abs/2204.11051)

1. [**Neural Architecture Search (NAS) With Context-free Grammar Search Spaces:**](neps_examples/basic_usage/architecture.py)

   - NePS is equipped to handle context-free grammar search spaces, providing advanced capabilities for designing and optimizing architectures. This builds on the insights presented in:
     - [Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars](https://arxiv.org/abs/2211.01842)

1. [**Easy Parallelization and Resumption of Runs:**](https://automl.github.io/neps/latest/examples/efficiency/)

   - NePS simplifies the process of parallelizing optimization tasks, both on individual computers and in distributed
     computing environments. It also allows users to conveniently resume these optimization tasks later, ensuring a
     seamless and efficient workflow for long-running experiments (a minimal multi-worker sketch follows the Basic
     Usage example below).

1. [**Seamless User Code Integration:**](neps_examples/template/)

   - NePS's modular design ensures flexibility and extensibility. Integrate NePS effortlessly into existing machine learning workflows.
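
As a taste of the prior-knowledge feature above, here is a minimal sketch of encoding a user belief in a search space. It assumes the `default` and `default_confidence` arguments shown in the PriorBand template linked above; treat it as illustrative rather than as the definitive API:

```python
import neps

# Hedged sketch (assumed API, based on the PriorBand template linked above):
# encode the belief that a learning rate near 1e-2 tends to work well.
pipeline_space = dict(
    learning_rate=neps.FloatParameter(
        lower=1e-4,
        upper=1e-1,
        log=True,                     # sample on a log scale
        default=1e-2,                 # the user's prior belief about a good value
        default_confidence="medium",  # how strongly the optimizer should trust it
    ),
)
```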

## Installation

To install the latest release from PyPI, run

```bash
pip install neural-pipeline-search
```

To get the latest version from GitHub, run

```bash
pip install git+https://github.com/automl/neps.git
```

> Note: As indicated by the `v0.x.x` version number, APIs will change in the future.


## Basic Usage

Using `neps` always follows the same pattern:

1. Define a `run_pipeline` function capable of evaluating different architectural and/or hyperparameter configurations
   for your problem.
1. Define a search space named `pipeline_space` for those parameters, e.g., via a dictionary.
1. Call `neps.run` to optimize `run_pipeline` over `pipeline_space`.

In code, the usage pattern can look like this:

```python
import neps
import logging


# 1. Define a function that accepts hyperparameters and computes the validation error
def run_pipeline(
    hyperparameter_a: float, hyperparameter_b: int, architecture_parameter: str
) -> dict:
    # Create your model; MyModel is a placeholder for your own model class
    model = MyModel(architecture_parameter)

    # Train and evaluate the model; train_and_eval stands in for your own training pipeline
    validation_error, training_error = train_and_eval(
        model, hyperparameter_a, hyperparameter_b
    )

    return {  # Return a dict, or just a float with the validation error
        "loss": validation_error,
        "info_dict": {
            "training_error": training_error
            # + Other metrics
        },
    }


# 2. Define a search space of parameters; use the same parameter names as in run_pipeline
pipeline_space = dict(
    hyperparameter_a=neps.FloatParameter(
        lower=0.001, upper=0.1, log=True  # The search space is sampled in log space
    ),
    hyperparameter_b=neps.IntegerParameter(lower=1, upper=42),
    architecture_parameter=neps.CategoricalParameter(["option_a", "option_b"]),
)


# 3. Run the NePS optimization
logging.basicConfig(level=logging.INFO)
neps.run(
    run_pipeline=run_pipeline,
    pipeline_space=pipeline_space,
    root_directory="path/to/save/results",  # Replace with the actual path.
    max_evaluations_total=100,
)
```
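
Because NePS coordinates workers through the shared `root_directory`, parallelizing a run (Key Feature 3) can be as simple as starting the same optimization script several times. The sketch below reuses `run_pipeline` and `pipeline_space` from the example above and spawns workers with Python's `multiprocessing`; launching separate shell processes works the same way. Treat it as a hedged illustration of the pattern, not the library's prescribed launcher:

```python
import multiprocessing

# Hedged sketch: several workers share one root_directory. Each calls
# neps.run with identical arguments; NePS's file-based coordination lets
# them evaluate different configurations until max_evaluations_total is
# reached across all workers. run_pipeline and pipeline_space are the
# objects defined in the example above.
def worker() -> None:
    neps.run(
        run_pipeline=run_pipeline,
        pipeline_space=pipeline_space,
        root_directory="path/to/save/results",  # same path for every worker
        max_evaluations_total=100,
    )


if __name__ == "__main__":
    workers = [multiprocessing.Process(target=worker) for _ in range(4)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()
```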

## Examples

Discover how NePS works through these practical examples:

- **[Hyperparameter Optimization (HPO)](neps_examples/basic_usage/hyperparameters.py)**: Learn the essentials of hyperparameter optimization with NePS.

- **[Architecture Search with Primitives](neps_examples/basic_usage/architecture.py)**: Dive into architecture search using primitives in NePS.

- **[Multi-Fidelity Optimization](neps_examples/efficiency/multi_fidelity.py)**: Understand how to leverage multi-fidelity optimization for efficient model tuning (a minimal sketch follows this list).

- **[Utilizing Expert Priors for Hyperparameters](neps_examples/efficiency/expert_priors_for_hyperparameters.py)**: Learn how to incorporate expert priors for more efficient hyperparameter selection.

- **[Additional NePS Examples](neps_examples/)**: Explore more examples, including various use cases and advanced configurations in NePS.
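
For a flavour of the multi-fidelity example above, the sketch below treats the number of training epochs as a fidelity. The `is_fidelity=True` flag is an assumption based on that example, and `train_for` is a hypothetical user-defined helper; see the linked example file for the exact usage:

```python
import neps

# Hedged sketch: epochs act as the fidelity, so the optimizer can score
# configurations cheaply at few epochs and spend the full budget only on
# promising ones. train_for is a hypothetical user-defined training helper.
def run_pipeline(learning_rate: float, epochs: int) -> float:
    validation_error = train_for(epochs, learning_rate)
    return validation_error


pipeline_space = dict(
    learning_rate=neps.FloatParameter(lower=1e-4, upper=1e-1, log=True),
    epochs=neps.IntegerParameter(lower=1, upper=50, is_fidelity=True),
)
```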

## Contributing

Please see the [documentation for contributors](https://automl.github.io/neps/latest/dev_docs/contributing/).

## Citations

For pointers on citing the NePS package and papers, refer to our [documentation on citations](https://automl.github.io/neps/latest/citations/).

            
