oracle-guardian-ai

- Name: oracle-guardian-ai
- Version: 1.1.0
- Summary: Oracle Guardian AI Open Source Project
- Author: Oracle Data Science
- Requires Python: >=3.8
- Upload time: 2024-04-22 22:02:56
- Keywords: oracle cloud infrastructure, oci, fairness, bias, privacy, ai
- Home page, maintainer, docs URL, license: not recorded
- Requirements: none recorded

# Oracle Guardian AI Open Source Project

[![PyPI](https://img.shields.io/pypi/v/oracle-guardian-ai.svg?style=for-the-badge&logo=pypi&logoColor=white)](https://pypi.org/project/oracle-guardian-ai/) [![Python](https://img.shields.io/pypi/pyversions/oracle-guardian-ai.svg?style=for-the-badge&logo=pypi&logoColor=white)](https://pypi.org/project/oracle-guardian-ai/) [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=for-the-badge&logo=pypi&logoColor=white)](https://github.com/ambv/black)

Oracle Guardian AI Open Source Project is a library consisting of tools to assess fairness/bias and privacy of machine learning models and data sets. This package contains `fairness` and `privacy_estimation` modules.

The `Fairness` module offers tools to help you diagnose and understand the unintended bias present in your dataset and model so that you can take steps toward more inclusive and fair applications of machine learning.

The `Privacy Estimation` module helps estimate potential leakage of sensitive information in the training data through attacks on machine learning (ML) models. The main idea is to carry out membership inference attacks on a target model trained on a sensitive dataset and measure how often they succeed, which gives an estimate of the risk of leakage.
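
For intuition, here is a minimal, self-contained sketch of the kind of attack being estimated. It implements a simple confidence-thresholding membership inference attack from scratch rather than through the `privacy_estimation` module; the dataset, target model, and threshold below are all illustrative assumptions, not the library's API.

```python
# Illustrative only: a toy confidence-thresholding membership inference attack,
# not the guardian_ai.privacy_estimation API.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=4000, n_features=20, random_state=0)
X_member, X_nonmember, y_member, y_nonmember = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# The target model only ever sees the "member" half.
target = RandomForestClassifier(random_state=0).fit(X_member, y_member)

def true_label_confidence(model, X, y):
    """Probability the model assigns to each example's true label."""
    proba = model.predict_proba(X)
    return proba[np.arange(len(y)), y]

# Attack heuristic: guess "member" when the model is very confident about the
# true label; overfit models tend to be more confident on their training data.
threshold = 0.9
member_guess = true_label_confidence(target, X_member, y_member) > threshold
nonmember_guess = true_label_confidence(target, X_nonmember, y_nonmember) > threshold

# 0.5 means the attack does no better than chance (little measurable leakage).
attack_accuracy = 0.5 * (member_guess.mean() + (1.0 - nonmember_guess.mean()))
print(f"attack accuracy: {attack_accuracy:.2f}")
```

The `privacy_estimation` module automates this kind of experiment with a range of stronger attacks; see the documentation linked below for the supported attack types.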

## Installation

You have various options when installing `oracle-guardian-ai`.

### Installing the oracle-guardian-ai base package

```bash
python3 -m pip install oracle-guardian-ai
```

### Installing extra libraries

The `all-optional` extra installs all optional dependencies. Note the single quotes around the package name and extra, which keep your shell from interpreting the square brackets.

```bash
python3 -m pip install 'oracle-guardian-ai[all-optional]'
```

To work with fairness/bias, install the `fairness` extra. Its additional dependencies are listed in [requirements-fairness.txt](https://github.com/oracle/guardian-ai/blob/main/guardian_ai/requirements-fairness.txt).

```bash
python3 -m pip install 'oracle-guardian-ai[fairness]'
```

To work with privacy estimation, install the `privacy` extra. Its additional dependencies are listed in [requirements-privacy.txt](https://github.com/oracle/guardian-ai/blob/main/guardian_ai/requirements-privacy.txt).

```bash
python3 -m pip install 'oracle-guardian-ai[privacy]'
```
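
After installing, a quick sanity check is to confirm the distribution is registered with pip and that the import name `guardian_ai` (used in the examples below) resolves:

```bash
python3 -m pip show oracle-guardian-ai
python3 -c "import guardian_ai; print('guardian_ai imports cleanly')"
```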

## Documentation
  - [Oracle Guardian AI Documentation](https://oracle-guardian-ai.readthedocs.io/en/latest/index.html)
  - [OCI Data Science and AI services Examples](https://github.com/oracle/oci-data-science-ai-samples)
  - [Oracle AI & Data Science Blog](https://blogs.oracle.com/ai-and-datascience/)

## Examples

### Measurement with a Fairness Metric

```python
from guardian_ai.fairness.metrics import ModelStatisticalParityScorer

# Create a scorer for the statistical parity metric, keyed on the protected
# attribute column(s) of your data.
fairness_score = ModelStatisticalParityScorer(protected_attributes='<protected_attribute_name>')
```
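
For intuition about what this scorer measures, the short sketch below computes a statistical parity disparity by hand: the gap in positive-prediction rates across groups of a protected attribute. The DataFrame, column names, and values are illustrative assumptions, and the snippet does not call the guardian_ai API.

```python
# Illustrative only: statistical parity disparity computed by hand.
import pandas as pd

predictions = pd.DataFrame(
    {
        "group": ["A", "A", "A", "B", "B", "B", "B"],   # protected attribute
        "predicted_label": [1, 1, 0, 1, 0, 0, 0],       # model predictions
    }
)

# Positive-prediction rate within each group.
rates = predictions.groupby("group")["predicted_label"].mean()

# Statistical parity disparity: gap between most- and least-favored groups.
print(rates.to_dict(), "disparity:", rates.max() - rates.min())
```

The scorer above wraps this kind of comparison behind a scikit-learn-style scoring interface; see the Guardian AI documentation linked above for the exact call signature and the other supported fairness metrics.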

### Bias Mitigation

```python
from guardian_ai.fairness.bias_mitigation import ModelBiasMitigator

# Wrap an already trained model; the mitigator adjusts its predictions to
# balance the chosen fairness and accuracy metrics.
bias_mitigated_model = ModelBiasMitigator(
    model,
    protected_attribute_names='<protected_attribute_name>',
    fairness_metric="statistical_parity",
    accuracy_metric="balanced_accuracy",
)

# Tune the mitigation on a held-out validation set, then predict as usual.
bias_mitigated_model.fit(X_val, y_val)
bias_mitigated_model.predict(X_test)
```


## Contributing

This project welcomes contributions from the community. Before submitting a pull request, please review our [contribution guide](https://github.com/oracle/guardian-ai/blob/main/CONTRIBUTING.md).

Find Getting Started instructions for developers in [README-development.md](https://github.com/oracle/guardian-ai/blob/main/README-development.md).

## Security

Consult the security guide [SECURITY.md](https://github.com/oracle/guardian-ai/blob/main/SECURITY.md) for our responsible security vulnerability disclosure process.

## License

Copyright (c) 2023 Oracle and/or its affiliates. Licensed under the [Universal Permissive License v1.0](https://oss.oracle.com/licenses/upl/).


            

Raw data

Project URLs:

- Documentation: https://oracle-guardian-ai.readthedocs.io/en/latest/index.html
- Repository: https://github.com/oracle/guardian-ai

Release files for version 1.1.0 (neither file is yanked):

| File | Package type | Python | Size | Uploaded (UTC) | SHA-256 |
| --- | --- | --- | --- | --- | --- |
| oracle_guardian_ai-1.1.0-py3-none-any.whl | bdist_wheel | py3 (requires >=3.8) | 67,813 bytes | 2024-04-22 22:02:54 | 0c9b6115bd0345f0953d6ba36edb556cd70091732be1ffa3340d18d76f974b79 |
| oracle_guardian_ai-1.1.0.tar.gz | sdist | source (requires >=3.8) | 52,827 bytes | 2024-04-22 22:02:56 | 4a1de2cea3073a4652e9ff4d3db6ad8940786cd173641a5ce14934a8fc6585a9 |

Download URLs:

- https://files.pythonhosted.org/packages/5f/95/22e94e1c2e37b154e46f20e8052b125d5902c3e32fc9d78756704ed25ad7/oracle_guardian_ai-1.1.0-py3-none-any.whl
- https://files.pythonhosted.org/packages/4e/7d/e2c7bcf2bc55be759435a43674ebec7a1edac3a56d364813059faf940573/oracle_guardian_ai-1.1.0.tar.gz