pfl

- Name: pfl
- Version: 0.2.0
- Summary: Simulation framework for Private Federated Learning
- Home page: https://github.com/apple/pfl-research
- Author: Apple
- Requires Python: >=3.10, <3.12
- Upload time: 2024-07-18 22:21:56
            # `pfl`: Python framework for Private Federated Learning simulations

[![GitHub License](https://img.shields.io/github/license/apple/pfl-research)](https://github.com/apple/pfl-research/blob/main/LICENSE)
[![CircleCI](https://dl.circleci.com/status-badge/img/gh/apple/pfl-research/tree/main.svg?style=shield)](https://dl.circleci.com/status-badge/redirect/gh/apple/pfl-research/tree/main)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/pfl)](https://github.com/apple/pfl-research/blob/main/pyproject.toml#L18)

**Documentation website:** https://apple.github.io/pfl-research

`pfl` is a Python framework developed at Apple to empower researchers to run efficient simulations with privacy-preserving federated learning (FL) and to disseminate the results of their FL research. We are a team combining engineering and research expertise, and we encourage researchers to publish papers built on this code with confidence.

The framework is **not** intended for third-party FL deployments, but the results of its simulations can be tremendously useful in actual FL deployments.
We hope that `pfl` will promote open research in FL and its effective dissemination.

`pfl` provides several useful features, including the following:

* Get started quickly trying out PFL for your use case with your existing model and data.
* Iterate quickly with fast simulations utilizing multiple levels of distributed training (multiple processes, GPUs and machines).
* Flexibility and expressiveness: when a researcher has a PFL idea to try, `pfl` has flexible APIs to express it.
* Scalable simulations for large experiments with state-of-the-art algorithms and models.
* Support both PyTorch and TensorFlow.
* Unified benchmarks for datasets that have been vetted for both PyTorch and TensorFlow.
* Support other models in addition to neural networks, e.g. GBDTs. Switching between types of models is seamless.
* Tight integration with privacy features, including common mechanisms for local and central differential privacy.
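
The central differential privacy mentioned above is commonly realized by clipping each client's model update and adding Gaussian noise to the aggregate. The following sketch is *not* `pfl`'s API; it illustrates the idea in plain NumPy, with all function names invented for this example.

```python
import numpy as np

def clip_update(update: np.ndarray, max_norm: float) -> np.ndarray:
    """Scale the update down so its L2 norm is at most max_norm."""
    norm = np.linalg.norm(update)
    if norm > max_norm:
        return update * (max_norm / norm)
    return update

def dp_federated_average(updates, max_norm=1.0, noise_stddev=0.0, seed=0):
    """Average clipped client updates, then add central Gaussian noise.

    noise_stddev is expressed relative to the clipping bound, as is
    common for the Gaussian mechanism.
    """
    rng = np.random.default_rng(seed)
    clipped = [clip_update(u, max_norm) for u in updates]
    mean = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_stddev * max_norm / len(updates),
                       size=mean.shape)
    return mean + noise

# Three simulated client updates; the first exceeds the clipping bound.
updates = [np.array([3.0, 4.0]),   # norm 5, clipped to norm 1
           np.array([0.6, 0.8]),   # norm 1, unchanged
           np.array([0.0, 0.0])]
print(dp_federated_average(updates, max_norm=1.0, noise_stddev=0.0))
```

With `noise_stddev=0.0` this reduces to federated averaging of clipped updates; raising it trades accuracy for a stronger privacy guarantee.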

Results from benchmarks are maintained in [this Weights & Biases report](https://api.wandb.ai/links/pfl/5scd5f66).

## Installation

Installation instructions can be found [here](http://apple.github.io/pfl-research/installation.html).
`pfl` is available on PyPI, and a full installation can be done with pip:

```shell
pip install 'pfl[tf,pytorch,trees]'
```
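
To confirm the installation succeeded, one way (assuming a standard Python environment) is to query the installed package metadata; the helper name below is invented for this example:

```python
from importlib import metadata
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the installed version of `package`, or None if absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

# Prints e.g. "0.2.0" after a successful install, or None otherwise.
print(installed_version("pfl"))
```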

## Getting started - tutorial notebooks

To try out `pfl` immediately without installation, we provide several Colab notebooks for hands-on learning of the different components of `pfl`.
`<TODO push notebooks to colab>`

Also available as Jupyter notebooks [here](https://github.com/apple/pfl-research/tree/develop/tutorials).

## Getting started - benchmarks

`pfl` aims to streamline the process of benchmarking hypotheses in the federated learning paradigm. The official benchmarks are available in the [benchmarks](./benchmarks) directory, covering a variety of realistic dataset-model combinations with and without differential privacy (yes, we do also have CIFAR10).

**Copying these examples is a great starting point for doing your own research.**
[See the quickstart](./benchmarks#quickstart) on how to start converging a model on the simplest benchmark (CIFAR10) in just a few minutes.

## Contributing

Researchers are invited to contribute to the framework. Please see [here](http://apple.github.io/pfl-research/support/contributing.html) for more details.

## Citing pfl-research

```bibtex
@software{pfl2024,
  author = {Filip Granqvist and Congzheng Song and Áine Cahill and Rogier van Dalen and Martin Pelikan and Yi Sheng Chan and Xiaojun Feng and Natarajan Krishnaswami and Mona Chitnis and Vojta Jina},
  title = {{pfl}: simulation framework for accelerating research in Private Federated Learning},
  url = {https://github.com/apple/pfl-research},
  version = {0.0},
  year = {2024},
}
```

            
