hypershap


Name: hypershap
Version: 0.0.3
Summary: HyperSHAP is a post-hoc explanation method for hyperparameter optimization.
Author email: Marcel Wever <m.wever@ai.uni-hannover.de>
Upload time: 2025-09-15 08:53:13
Requires Python: <4.0,>=3.10
Keywords: python
# HyperSHAP <img src="https://raw.githubusercontent.com/automl/hypershap/main/docs/source/_static/logo/hypershap-logo.png" alt="HyperSHAP Logo" align="right" height="200px"/>

[![Release](https://img.shields.io/github/v/release/automl/HyperSHAP)](https://img.shields.io/github/v/release/automl/hypershap)
[![Build status](https://img.shields.io/github/actions/workflow/status/automl/hypershap/main.yml?branch=main)](https://github.com/automl/hypershap/actions/workflows/main.yml?query=branch%3Amain)
[![Coverage Status](https://coveralls.io/repos/github/automl/HyperSHAP/badge.svg?branch=dev)](https://coveralls.io/github/automl/HyperSHAP?branch=dev)
[![Commit activity](https://img.shields.io/github/commit-activity/m/automl/hypershap)](https://img.shields.io/github/commit-activity/m/automl/hypershap)
[![License](https://img.shields.io/github/license/automl/hypershap)](https://img.shields.io/github/license/automl/hypershap)

HyperSHAP is a game-theoretic Python library for explaining hyperparameter optimization (HPO). It uses Shapley values and interaction indices to provide both local and global insights into how individual hyperparameters (and their interactions) affect a model's performance.

- **GitHub repository**: <https://github.com/automl/hypershap/>
- **Documentation**: <https://automl.github.io/hypershap/>


## Table of Contents
- [Features](#features)
- [Installation](#installation)
- [Getting Started](#getting-started)
- [API Overview](#api-overview)
- [Example Notebooks](#example-notebooks)
- [Citation](#citation)
- [Contributing](#contributing)
- [License](#license)

---

## Features
- **Additive Shapley decomposition** of any performance metric across hyperparameters.
- **Interaction analysis** via the Faithful Shapley Interaction Index (FSII).
- Ready-made explanation tasks for **Ablation**, **Tunability**, and **Optimizer Bias** studies.
- Integrated **visualisation** (SI-graph) for interaction effects.
- Works with any surrogate model that follows the `ExplanationTask` interface.
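For intuition, the additive Shapley decomposition in the first bullet can be sketched with a brute-force computation over a toy game. This is a stdlib-only illustration of the Shapley formula itself; the game and the hyperparameter names below are made up and are not the hypershap API:

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values of a set function `value` by enumerating coalitions."""
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value(set(coalition) | {i}) - value(set(coalition)))
        phi[i] = total
    return phi

# Toy "tunability" game: value of a subset = gain from tuning those hyperparameters.
gains = {"lr": 0.10, "depth": 0.04, "l2": 0.01}

def v(subset):
    # Purely additive game, chosen so the result is easy to check by hand.
    return sum(gains[h] for h in subset)

phi = shapley_values(list(gains), v)
# For an additive game, each parameter's Shapley value equals its own gain,
# and the values sum to v(all players) (the efficiency axiom).
```

HyperSHAP computes attributions of this flavour, but over performance games induced by HPO rather than a hand-written additive function.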

---

## Installation

```sh
pip install hypershap
```

For a development checkout, clone the repository and run `make install`.

## Getting Started
Given an existing setup with a `ConfigurationSpace` from the [ConfigSpace package](https://github.com/automl/ConfigSpace) and a black-box function as follows:
```python
from ConfigSpace import ConfigurationSpace, Configuration

# ConfigurationSpace describing the hyperparameter space
cs = ConfigurationSpace()
...

# A black-box function evaluating ConfigSpace.Configuration objects
def blackbox_function(cfg: Configuration) -> float:
    ...
```

You can use HyperSHAP as follows:
```python
from hypershap import ExplanationTask, HyperSHAP

# Instantiate HyperSHAP with an explanation task built from the black-box function
hypershap = HyperSHAP(ExplanationTask.from_function(config_space=cs, function=blackbox_function))
# Conduct a tunability analysis
hypershap.tunability(baseline_config=cs.get_default_configuration())
# Plot the results as a Shapley Interaction graph
hypershap.plot_si_graph()
```

The example demonstrates how to:
1. Wrap a black-box function in an explanation task.
2. Use `HyperSHAP` to obtain interaction values for the **tunability** game.
3. Plot the corresponding SI-graph.
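Conceptually, the tunability game asks: what is the best performance reachable when only a given subset of hyperparameters may deviate from the baseline? The following stdlib-only sketch of such a value function uses a hypothetical discrete search space and a toy performance function; it mirrors the idea, not hypershap's internals:

```python
from itertools import product

# Hypothetical discrete search space and baseline configuration.
space = {"lr": [0.001, 0.01, 0.1], "depth": [2, 4], "l2": [0.0, 0.1]}
baseline = {"lr": 0.001, "depth": 2, "l2": 0.1}

def performance(cfg):
    """Toy black-box: peaks at lr=0.01, depth=4, l2=0.0."""
    score = 0.8
    score += 0.1 if cfg["lr"] == 0.01 else 0.0
    score += 0.05 if cfg["depth"] == 4 else 0.0
    score -= 0.02 * cfg["l2"]
    return score

def tunability_value(tuned):
    """Best achievable performance when only `tuned` hyperparameters may move."""
    names = sorted(tuned)
    best = float("-inf")
    for values in product(*(space[h] for h in names)):
        cfg = dict(baseline, **dict(zip(names, values)))
        best = max(best, performance(cfg))
    return best

# Tuning nothing returns the baseline score; tuning everything recovers the optimum.
```

Shapley (interaction) values of this set function then attribute the tuning gain to individual hyperparameters and their combinations.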

---

## API Overview

| Method | Purpose |
|--------|---------|
| `HyperSHAP(explanation_task)` | Initialize the explainer with a generic `ExplanationTask`. |
| `ablation(config_of_interest, baseline_config, index="FSII", order=2)` | Explain the contribution of each hyperparameter value (and interactions) when moving from a baseline to a specific configuration. |
| `tunability(baseline_config=None, index="FSII", order=2, n_samples=10_000)` | Quantify how much performance can be gained by tuning subsets of hyperparameters. |
| `optimizer_bias(optimizer_of_interest, optimizer_ensemble, index="FSII", order=2)` | Attribute performance differences to a particular optimizer vs. an ensemble of optimizers. |
| `plot_si_graph(interaction_values=None, save_path=None)` | Plot the Shapley Interaction (SI) graph; uses the most recent interaction values if none are supplied. |
| `ExplanationTask.get_hyperparameter_names()` | Helper to retrieve ordered hyperparameter names (used for visualisation). |

All methods return an `InteractionValues` object (from **shapiq**) that can be inspected, saved, or passed to the visualisation routine.
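To see what an order-2 interaction score measures, consider the classical pairwise Shapley interaction index, which averages the discrete second difference of the game over coalitions. This stdlib-only sketch is an illustration of the idea, not shapiq's FSII implementation, and the game below is made up:

```python
from itertools import combinations
from math import factorial

def pairwise_sii(players, value, i, j):
    """Classical Shapley interaction index for the pair {i, j}."""
    others = [p for p in players if p not in (i, j)]
    n = len(players)
    total = 0.0
    for k in range(len(others) + 1):
        for coalition in combinations(others, k):
            S = set(coalition)
            weight = factorial(k) * factorial(n - k - 2) / factorial(n - 1)
            # Discrete second difference: synergy of adding i and j together
            # beyond the sum of adding each alone.
            delta = (value(S | {i, j}) - value(S | {i})
                     - value(S | {j}) + value(S))
            total += weight * delta
    return total

players = ["lr", "depth", "l2"]

def v(S):
    # Pure-synergy game: lr and depth only pay off when tuned together.
    return 1.0 if {"lr", "depth"} <= S else 0.0

# Here lr and depth show a positive interaction, while lr and l2 show none.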

---

## Example Notebooks
Full Jupyter notebooks illustrating all three explanation tasks (ablation, tunability, optimizer bias) are included in the repository under `examples/`. The notebooks walk through:

- Building a mockup environment
- Creating the corresponding explanation task
- Loading explanation tasks from different setups: data, a black-box function, or an existing surrogate model
- Computing interaction values with HyperSHAP
- Visualizing results with `plot_si_graph`

---

## Citation
If you use HyperSHAP in your research, please cite the original paper:

```bibtex
@article{wever-arxiv25,
  author       = {Marcel Wever and
                  Maximilian Muschalik and
                  Fabian Fumagalli and
                  Marius Lindauer},
  title        = {HyperSHAP: Shapley Values and Interactions for Hyperparameter Importance},
  journal      = {CoRR},
  volume       = {abs/2502.01276},
  year         = {2025},
  doi          = {10.48550/ARXIV.2502.01276},
}
```

The paper introduces the underlying game-theoretic framework and demonstrates its usefulness for HPO explainability.

## Contributing

Contributions are welcome! Please follow these steps:

1. Fork the repo and create a feature branch (`git checkout -b feat/your-feature`).
2. Write tests (the project uses `pytest`).
3. Ensure all tests pass (`pytest`).
4. Update the documentation if you add new functionality.
5. Submit a pull request with a clear description of the changes.


See `CONTRIBUTING.md` for detailed guidelines.

---

## License
HyperSHAP is released under the BSD 3-Clause License. See the `LICENSE` file for full terms.

---

**Enjoy exploring your HPO pipelines with HyperSHAP!** 🎉

---
Repository initiated with [fpgmaas/cookiecutter-uv](https://github.com/fpgmaas/cookiecutter-uv).

            
