oracle-guardian-ai

Name: oracle-guardian-ai
Version: 1.2.0
Summary: Oracle Guardian AI Open Source Project
Author: Oracle Data Science
Requires Python: >=3.9
Keywords: oracle, cloud infrastructure, oci, fairness, bias, privacy, ai
Upload time: 2024-11-13 01:59:38
# Oracle Guardian AI Open Source Project

[![PyPI](https://img.shields.io/pypi/v/oracle-guardian-ai.svg?style=for-the-badge&logo=pypi&logoColor=white)](https://pypi.org/project/oracle-guardian-ai/) [![Python](https://img.shields.io/pypi/pyversions/oracle-guardian-ai.svg?style=for-the-badge&logo=pypi&logoColor=white)](https://pypi.org/project/oracle-guardian-ai/) [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=for-the-badge&logo=pypi&logoColor=white)](https://github.com/ambv/black)

Oracle Guardian AI Open Source Project is a library of tools for assessing the fairness/bias and privacy of machine learning models and datasets. The package contains two modules: `fairness` and `privacy_estimation`.

The `Fairness` module offers tools to help you diagnose and understand unintended bias in your dataset and model, so that you can take steps toward more inclusive and fair applications of machine learning.

The `Privacy Estimation` module helps estimate potential leakage of sensitive information from the training data through attacks on machine learning (ML) models. The main idea is to carry out membership inference attacks against a target model trained on a given sensitive dataset, and to measure their success rate in order to estimate the risk of leakage.
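As a pure-Python illustration of the idea (not the library's API), a minimal confidence-threshold membership inference attack guesses that records on which the model is highly confident were part of the training set. All numbers below are made up:

```python
# Toy sketch of a confidence-threshold membership inference attack.
# guardian_ai's own attacks are more sophisticated; this only shows
# why overconfident models leak membership information.
train_conf = [0.99, 0.97, 0.88, 0.95]  # model confidence on training records
other_conf = [0.91, 0.55, 0.70, 0.60]  # confidence on records never seen in training

threshold = 0.9  # guess "member" whenever confidence >= threshold
confidences = train_conf + other_conf
is_member = [True] * len(train_conf) + [False] * len(other_conf)

correct = sum((c >= threshold) == m for c, m in zip(confidences, is_member))
attack_accuracy = correct / len(confidences)
print(attack_accuracy)  # 0.75, well above the 0.5 of random guessing
```

The further the attack accuracy rises above 0.5, the more the model reveals about which records it was trained on.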

## Installation

You have various options when installing `oracle-guardian-ai`.

### Installing the oracle-guardian-ai base package

```bash
python3 -m pip install oracle-guardian-ai
```

### Installing extra libraries

The `all-optional` extra installs all optional dependencies. Note the single quotes around the package name and extra; they prevent your shell from interpreting the square brackets.

```bash
python3 -m pip install 'oracle-guardian-ai[all-optional]'
```

To work with fairness/bias, install the `fairness` extra. Its additional dependencies are listed in [requirements-fairness.txt](https://github.com/oracle/guardian-ai/blob/main/guardian_ai/requirements-fairness.txt).

```bash
python3 -m pip install 'oracle-guardian-ai[fairness]'
```

To work with privacy estimation, install the `privacy` extra. Its additional dependencies are listed in [requirements-privacy.txt](https://github.com/oracle/guardian-ai/blob/main/guardian_ai/requirements-privacy.txt).

```bash
python3 -m pip install 'oracle-guardian-ai[privacy]'
```

## Documentation
  - [Oracle Guardian AI Documentation](https://oracle-guardian-ai.readthedocs.io/en/latest/index.html)
  - [OCI Data Science and AI services Examples](https://github.com/oracle/oci-data-science-ai-samples)
  - [Oracle AI & Data Science Blog](https://blogs.oracle.com/ai-and-datascience/)

## Examples

### Measurement with a Fairness Metric

```python
from guardian_ai.fairness.metrics import ModelStatisticalParityScorer
fairness_score = ModelStatisticalParityScorer(protected_attributes='<target_attribute>')
```
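To make concrete what statistical parity measures, here is a hand-computed sketch on made-up predictions (pure Python, not the library's API): it is the gap in positive-prediction rates between groups of the protected attribute.

```python
# Made-up binary predictions and protected-attribute group labels.
preds  = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

def positive_rate(group):
    """Fraction of positive predictions within one group."""
    selected = [p for p, g in zip(preds, groups) if g == group]
    return sum(selected) / len(selected)

# Statistical parity difference: gap in positive-prediction rates.
disparity = abs(positive_rate("a") - positive_rate("b"))
print(disparity)  # 0.5: group "a" receives positive predictions far more often
```

A disparity of 0 means both groups receive positive predictions at the same rate; the larger the gap, the stronger the evidence of disparate treatment.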

### Bias Mitigation

```python
from guardian_ai.fairness.bias_mitigation import ModelBiasMitigator
bias_mitigated_model = ModelBiasMitigator(
    model,
    protected_attribute_names='<target_attribute>',
    fairness_metric="statistical_parity",
    accuracy_metric="balanced_accuracy",
)

bias_mitigated_model.fit(X_val, y_val)
bias_mitigated_model.predict(X_test)
```
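`ModelBiasMitigator` searches for per-group adjustments to the model's decision rule. The idea behind this kind of post-processing can be sketched in pure Python (illustrative only, with made-up scores; not the library's internals): choose per-group score thresholds so that positive-prediction rates are equal across groups.

```python
# Hypothetical model scores for two protected groups.
scores = {"a": [0.9, 0.8, 0.6, 0.4], "b": [0.7, 0.5, 0.3, 0.2]}

def positive_rate(group_scores, thr):
    """Fraction of scores at or above the decision threshold."""
    return sum(s >= thr for s in group_scores) / len(group_scores)

# A single global threshold of 0.5 treats the groups unequally:
# group "a" gets a positive rate of 0.75, group "b" only 0.50.
# Per-group thresholds can equalize the rates (statistical parity).
thresholds = {"a": 0.6, "b": 0.3}
rates = {g: positive_rate(s, thresholds[g]) for g, s in scores.items()}
print(rates)  # {'a': 0.75, 'b': 0.75}
```

The library's mitigator additionally trades this fairness gain off against an accuracy metric such as balanced accuracy, rather than equalizing rates unconditionally.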


## Contributing

This project welcomes contributions from the community. Before submitting a pull request, please review our [contribution guide](./CONTRIBUTING.md).

Find Getting Started instructions for developers in [README-development.md](https://github.com/oracle/guardian-ai/blob/main/README-development.md).

## Security

Consult the security guide [SECURITY.md](https://github.com/oracle/guardian-ai/blob/main/SECURITY.md) for our responsible security vulnerability disclosure process.

## License

Copyright (c) 2023 Oracle and/or its affiliates. Licensed under the [Universal Permissive License v1.0](https://oss.oracle.com/licenses/upl/).


            
