approx-cp

Name: approx-cp
Version: 0.0.2
Home page: https://github.com/cambridge-mlg/acp
Summary: Python implementation of Approximate full Conformal Prediction (ACP)
Upload time: 2023-04-12 18:54:13
Author: Javier Abad
Requires Python: >=3
Keywords: machine learning, AAAI, conformal prediction
# Approximate full Conformal Prediction

This repository contains the Python implementation of [Approximating Full Conformal Prediction at Scale via Influence Functions](https://arxiv.org/abs/2202.01315).

* [Overview](#overview)
* [Contents](#contents)
* [Third-party software](#third-party-software)
* [Usage](#usage)
* [Tutorial Notebook](#tutorial-notebook)
* [Experiments](#experiments)
* [Reference](#reference)

## Overview

Approximate full Conformal Prediction (ACP) outputs a prediction set that contains the true label with at least a probability specified by the practitioner (e.g., with `epsilon = 0.1`, the set covers the true label with probability at least 90%). On large datasets, ACP retains the statistical power of full Conformal Prediction while remaining computationally tractable. The method works as a wrapper for any differentiable ML model.
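
For reference, the guarantee targeted here is the standard marginal coverage property of conformal prediction (a general property of conformal methods, not specific to this package): for a user-chosen significance level ε,

```
P( true label ∈ prediction set ) ≥ 1 − ε
```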

## Contents

This repository is organized as follows. In the folder `src/acp` you can find the following modules:

 - `methods.py` Python implementation of the ACP algorithms.
 - `others.py` Python implementation of the baseline methods used for comparison (SCP, APS, RAPS, CV+, JK+).
 - `wrapper.py` Python implementation of ACP as a wrapper for any differentiable PyTorch model. See `models.py` for examples.
 - `models.py` Examples of models compatible with `wrapper.py` (e.g., logistic regression, neural network, convolutional neural network).
 - `experiments.py` Python file to run the experiments from the command line.
 - `models/` Saved models.

The folder `src/third_party/` contains additional third-party software.
 
## Third-party software

We include the following third-party packages for comparison with ACP:

- [RAPS](https://github.com/aangelopoulos/conformal_classification)
- [APS, CV+, JK+](https://github.com/msesia/arc)
 

## Usage

### Installation
ACP is available as a standalone `pip` package. You can install it by running the following command in the terminal:

```bash
pip install approx-cp
```
To use ACP with your own models, add the following import to your file:

```python
from acp.wrapper import ACP_D, ACP_O  # deleted scheme (ACP_D) and ordinary scheme (ACP_O)
```
Alternatively, you can clone this repo by running:

```bash
git clone https://github.com/cambridge-mlg/acp
cd acp
```
Then install the ACP Python package in a dedicated conda environment:

```bash
conda create -n myenv python=3.9
conda activate myenv
pip install --upgrade pip
pip install -e .
```
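To check that the editable install worked, you can try the import from the command line (a quick sanity check, not part of the package's documented workflow):

```bash
# Should print without raising ImportError if the package is installed
python -c "from acp.wrapper import ACP_D, ACP_O; print('ACP import OK')"
```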
Now, just include the import:

```python
from acp.wrapper import ACP_D, ACP_O
```

### Constructing prediction sets with ACP

ACP works as a wrapper for any PyTorch model with `.fit()` and `.predict()` methods. Once you instantiate your model, you can generate tight prediction sets that contain the true label with a specified probability. Here is an example with synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from acp.models import NeuralNetwork
from acp.wrapper import ACP_D

# Synthetic binary classification data: 1000 training points, 100 test points
X, Y = make_classification(n_samples=1100, n_features=10, n_classes=2, n_clusters_per_class=1, n_informative=3, random_state=42)
Xtrain, Xtest, Ytrain, Ytest = train_test_split(X, Y, test_size=100, random_state=42)

# Any differentiable PyTorch model exposing .fit() and .predict() works here
model = NeuralNetwork(input_size=10, num_neurons=[20, 10], out_size=2, seed=42, l2_reg=0.01)

# Wrap the model with the deleted scheme and build 90% prediction sets
ACP = ACP_D(Xtrain, Ytrain, model, seed=42, verbose=True)
sets = ACP.predict(Xtest, epsilon=0.1, out_file="results/test")
```
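
The exact return type of `predict` is not documented here; assuming `sets` is a sequence with one collection of candidate labels per test point (a hypothetical but common convention), a quick check of empirical coverage and average set size might look like this:

```python
import numpy as np

# Hypothetical post-processing: assumes `sets` holds one iterable of candidate
# labels per test point (check the ACP source for the actual return type).
coverage = np.mean([y in s for s, y in zip(sets, Ytest)])
avg_size = np.mean([len(s) for s in sets])

print(f"Empirical coverage: {coverage:.3f} (target >= 0.90 for epsilon = 0.1)")
print(f"Average prediction-set size: {avg_size:.2f}")
```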

## Reference

J. Abad Martinez, U. Bhatt, A. Weller and G. Cherubin. Approximating Full Conformal Prediction at Scale via Influence Functions. Association for the Advancement of Artificial Intelligence Conference on Artificial Intelligence (AAAI), 2023.

BibTeX:

```bibtex
@inproceedings{Abad2023ApproximatingFC,
  title={Approximating Full Conformal Prediction at Scale via Influence Functions},
  author={Javier Abad and Umang Bhatt and Adrian Weller and Giovanni Cherubin},
  booktitle={AAAI Conference on Artificial Intelligence},
  year={2023}
}
```




            
