flwr

Name: flwr
Version: 1.12.0
Home page: https://flower.ai
Summary: Flower: A Friendly Federated Learning Framework
Upload time: 2024-10-14 07:20:25
Author: The Flower Authors
Requires Python: >=3.9, <4.0
License: Apache-2.0
Keywords: flower, fl, federated learning, federated analytics, federated evaluation, machine learning
Requirements: No requirements were recorded.
# Flower: A Friendly Federated Learning Framework

<p align="center">
  <a href="https://flower.ai/">
    <img src="https://flower.ai/_next/image/?url=%2F_next%2Fstatic%2Fmedia%2Fflower_white_border.c2012e70.png&w=640&q=75" width="140px" alt="Flower Website" />
  </a>
</p>
<p align="center">
    <a href="https://flower.ai/">Website</a> |
    <a href="https://flower.ai/blog">Blog</a> |
    <a href="https://flower.ai/docs/">Docs</a> |
    <a href="https://flower.ai/conf/flower-summit-2022">Conference</a> |
    <a href="https://flower.ai/join-slack">Slack</a>
    <br /><br />
</p>

[![GitHub license](https://img.shields.io/github/license/adap/flower)](https://github.com/adap/flower/blob/main/LICENSE)
[![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/adap/flower/blob/main/CONTRIBUTING.md)
![Build](https://github.com/adap/flower/actions/workflows/framework.yml/badge.svg)
[![Downloads](https://static.pepy.tech/badge/flwr)](https://pepy.tech/project/flwr)
[![Docker Hub](https://img.shields.io/badge/Docker%20Hub-flwr-blue)](https://hub.docker.com/u/flwr)
[![Slack](https://img.shields.io/badge/Chat-Slack-red)](https://flower.ai/join-slack)

Flower (`flwr`) is a framework for building federated learning systems. The
design of Flower is based on a few guiding principles:

- **Customizable**: Federated learning systems vary wildly from one use case to
  another. Flower allows for a wide range of different configurations depending
  on the needs of each individual use case.

- **Extendable**: Flower originated from a research project at the University of
  Oxford, so it was built with AI research in mind. Many components can be
  extended and overridden to build new state-of-the-art systems.

- **Framework-agnostic**: Different machine learning frameworks have different
  strengths. Flower can be used with any machine learning framework, for
  example, [PyTorch](https://pytorch.org), [TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), [MONAI](https://docs.monai.io/en/latest/index.html), [fastai](https://www.fast.ai/), [MLX](https://ml-explore.github.io/mlx/build/html/index.html), [XGBoost](https://xgboost.readthedocs.io/en/stable/), [Pandas](https://pandas.pydata.org/) for federated analytics, or even raw [NumPy](https://numpy.org/)
  for users who enjoy computing gradients by hand.

- **Understandable**: Flower is written with maintainability in mind. The
  community is encouraged to both read and contribute to the codebase.

Meet the Flower community on [flower.ai](https://flower.ai)!
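
To give a sense of the API, below is a minimal client sketch using the `NumPyClient` interface, where the "model" is a single NumPy weight vector updated with a toy least-squares step. It assumes Flower is installed (`pip install flwr`) and that a Flower server is reachable at `127.0.0.1:8080`; the data, learning rate, and update rule are placeholders, not a recommended setup.

```python
# client.py - a minimal sketch of a Flower NumPy client (toy model and data;
# swap in any ML framework for real training).
import numpy as np
import flwr as fl

# Toy local dataset: 32 samples, 10 features (stand-in for real client data).
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 10)), rng.normal(size=32)


class FlowerClient(fl.client.NumPyClient):
    def __init__(self):
        self.weights = np.zeros(10)  # the "model": one weight vector

    def get_parameters(self, config):
        # Return the current local parameters as a list of NumPy arrays.
        return [self.weights]

    def fit(self, parameters, config):
        # Receive global parameters, take one toy least-squares gradient step,
        # and return the updated parameters plus the local sample count.
        self.weights = parameters[0]
        grad = X.T @ (X @ self.weights - y) / len(y)
        self.weights = self.weights - 0.01 * grad
        return [self.weights], len(y), {}

    def evaluate(self, parameters, config):
        # Report local mean squared error for the received global parameters.
        loss = float(np.mean((X @ parameters[0] - y) ** 2))
        return loss, len(y), {}


if __name__ == "__main__":
    # Connect this client to a running Flower server.
    fl.client.start_client(
        server_address="127.0.0.1:8080",
        client=FlowerClient().to_client(),
    )
```

In a real project, `fit` and `evaluate` would train and test a PyTorch, TensorFlow, or other model on the client's local data; only the parameter arrays and a few metrics leave the device.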

## Federated Learning Tutorial

Flower's goal is to make federated learning accessible to everyone. This series of tutorials introduces the fundamentals of federated learning and how to implement them in Flower.

0. **What is Federated Learning?**

   [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-what-is-federated-learning.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-what-is-federated-learning.ipynb))

1. **An Introduction to Federated Learning**

   [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-get-started-with-flower-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-get-started-with-flower-pytorch.ipynb))

2. **Using Strategies in Federated Learning**

   [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-use-a-federated-learning-strategy-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-use-a-federated-learning-strategy-pytorch.ipynb))

3. **Building Strategies for Federated Learning**

   [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-build-a-strategy-from-scratch-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-build-a-strategy-from-scratch-pytorch.ipynb))

4. **Custom Clients for Federated Learning**

   [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/doc/source/tutorial-series-customize-the-client-pytorch.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/doc/source/tutorial-series-customize-the-client-pytorch.ipynb))

Stay tuned; more tutorials are coming soon. Upcoming topics include **Privacy and Security in Federated Learning** and **Scaling Federated Learning**.

## 30-Minute Federated Learning Tutorial

[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/adap/flower/blob/main/examples/flower-in-30-minutes/tutorial.ipynb) (or open the [Jupyter Notebook](https://github.com/adap/flower/blob/main/examples/flower-in-30-minutes/tutorial.ipynb))

## Documentation

[Flower Docs](https://flower.ai/docs):

- [Installation](https://flower.ai/docs/framework/how-to-install-flower.html)
- [Quickstart (TensorFlow)](https://flower.ai/docs/framework/tutorial-quickstart-tensorflow.html)
- [Quickstart (PyTorch)](https://flower.ai/docs/framework/tutorial-quickstart-pytorch.html)
- [Quickstart (Hugging Face)](https://flower.ai/docs/framework/tutorial-quickstart-huggingface.html)
- [Quickstart (PyTorch Lightning)](https://flower.ai/docs/framework/tutorial-quickstart-pytorch-lightning.html)
- [Quickstart (Pandas)](https://flower.ai/docs/framework/tutorial-quickstart-pandas.html)
- [Quickstart (fastai)](https://flower.ai/docs/framework/tutorial-quickstart-fastai.html)
- [Quickstart (JAX)](https://flower.ai/docs/framework/tutorial-quickstart-jax.html)
- [Quickstart (scikit-learn)](https://flower.ai/docs/framework/tutorial-quickstart-scikitlearn.html)
- [Quickstart (Android [TFLite])](https://flower.ai/docs/framework/tutorial-quickstart-android.html)
- [Quickstart (iOS [CoreML])](https://flower.ai/docs/framework/tutorial-quickstart-ios.html)
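
The quickstart tutorials above walk through complete, framework-specific setups. As a rough preview of the server side, the sketch below pairs the built-in `FedAvg` strategy with `start_server`; the address, round count, and client thresholds are illustrative placeholders, not recommendations.

```python
# server.py - a minimal sketch of a Flower server using the built-in
# FedAvg strategy; the values below are illustrative placeholders.
import flwr as fl

strategy = fl.server.strategy.FedAvg(
    fraction_fit=1.0,         # sample all connected clients each round
    min_fit_clients=2,        # require at least two clients per training round
    min_available_clients=2,  # wait until two clients are connected
)

if __name__ == "__main__":
    fl.server.start_server(
        server_address="0.0.0.0:8080",
        config=fl.server.ServerConfig(num_rounds=3),
        strategy=strategy,
    )
```

Start the server first, then launch two or more clients (for example, the `NumPyClient` sketch above) pointing at the same address; the quickstart pages linked above cover the equivalent setup for each framework.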

## Flower Baselines

Flower Baselines is a collection of community-contributed projects that reproduce the experiments performed in popular federated learning publications. Researchers can build on Flower Baselines to quickly evaluate new ideas. The Flower community loves contributions! Make your work more visible and enable others to build on it by contributing it as a baseline!

- [DASHA](https://github.com/adap/flower/tree/main/baselines/dasha)
- [DepthFL](https://github.com/adap/flower/tree/main/baselines/depthfl)
- [FedBN](https://github.com/adap/flower/tree/main/baselines/fedbn)
- [FedMeta](https://github.com/adap/flower/tree/main/baselines/fedmeta)
- [FedMLB](https://github.com/adap/flower/tree/main/baselines/fedmlb)
- [FedPer](https://github.com/adap/flower/tree/main/baselines/fedper)
- [FedProx](https://github.com/adap/flower/tree/main/baselines/fedprox)
- [FedNova](https://github.com/adap/flower/tree/main/baselines/fednova)
- [HeteroFL](https://github.com/adap/flower/tree/main/baselines/heterofl)
- [FedAvgM](https://github.com/adap/flower/tree/main/baselines/fedavgm)
- [FedRep](https://github.com/adap/flower/tree/main/baselines/fedrep)
- [FedStar](https://github.com/adap/flower/tree/main/baselines/fedstar)
- [FedWav2vec2](https://github.com/adap/flower/tree/main/baselines/fedwav2vec2)
- [FjORD](https://github.com/adap/flower/tree/main/baselines/fjord)
- [MOON](https://github.com/adap/flower/tree/main/baselines/moon)
- [niid-Bench](https://github.com/adap/flower/tree/main/baselines/niid_bench)
- [TAMUNA](https://github.com/adap/flower/tree/main/baselines/tamuna)
- [FedVSSL](https://github.com/adap/flower/tree/main/baselines/fedvssl)
- [FedXGBoost](https://github.com/adap/flower/tree/main/baselines/hfedxgboost)
- [FedPara](https://github.com/adap/flower/tree/main/baselines/fedpara)
- [FedAvg](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/fedavg_mnist)
- [FedOpt](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/flwr_baselines/publications/adaptive_federated_optimization)

Please refer to the [Flower Baselines Documentation](https://flower.ai/docs/baselines/) for a detailed categorization of baselines and for additional info including:
* [How to use Flower Baselines](https://flower.ai/docs/baselines/how-to-use-baselines.html)
* [How to contribute a new Flower Baseline](https://flower.ai/docs/baselines/how-to-contribute-baselines.html)

## Flower Usage Examples

Several code examples show different usage scenarios of Flower (in combination with popular machine learning frameworks such as PyTorch or TensorFlow).

Quickstart examples:

- [Quickstart (TensorFlow)](https://github.com/adap/flower/tree/main/examples/quickstart-tensorflow)
- [Quickstart (PyTorch)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch)
- [Quickstart (Hugging Face)](https://github.com/adap/flower/tree/main/examples/quickstart-huggingface)
- [Quickstart (PyTorch Lightning)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch-lightning)
- [Quickstart (fastai)](https://github.com/adap/flower/tree/main/examples/quickstart-fastai)
- [Quickstart (Pandas)](https://github.com/adap/flower/tree/main/examples/quickstart-pandas)
- [Quickstart (JAX)](https://github.com/adap/flower/tree/main/examples/quickstart-jax)
- [Quickstart (MONAI)](https://github.com/adap/flower/tree/main/examples/quickstart-monai)
- [Quickstart (scikit-learn)](https://github.com/adap/flower/tree/main/examples/sklearn-logreg-mnist)
- [Quickstart (Android [TFLite])](https://github.com/adap/flower/tree/main/examples/android)
- [Quickstart (iOS [CoreML])](https://github.com/adap/flower/tree/main/examples/ios)
- [Quickstart (MLX)](https://github.com/adap/flower/tree/main/examples/quickstart-mlx)
- [Quickstart (XGBoost)](https://github.com/adap/flower/tree/main/examples/xgboost-quickstart)

Other [examples](https://github.com/adap/flower/tree/main/examples):

- [Raspberry Pi & Nvidia Jetson Tutorial](https://github.com/adap/flower/tree/main/examples/embedded-devices)
- [PyTorch: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/pytorch-from-centralized-to-federated)
- [Vertical FL](https://github.com/adap/flower/tree/main/examples/vertical-fl)
- [Federated Finetuning of OpenAI's Whisper](https://github.com/adap/flower/tree/main/examples/whisper-federated-finetuning)
- [Federated Finetuning of Large Language Model](https://github.com/adap/flower/tree/main/examples/flowertune-llm)
- [Federated Finetuning of a Vision Transformer](https://github.com/adap/flower/tree/main/examples/flowertune-vit)
- [Advanced Flower with TensorFlow/Keras](https://github.com/adap/flower/tree/main/examples/advanced-tensorflow)
- [Advanced Flower with PyTorch](https://github.com/adap/flower/tree/main/examples/advanced-pytorch)
- [Comprehensive Flower+XGBoost](https://github.com/adap/flower/tree/main/examples/xgboost-comprehensive)
- [Flower through Docker Compose and with Grafana dashboard](https://github.com/adap/flower/tree/main/examples/flower-via-docker-compose)
- [Flower with KaplanMeierFitter from the lifelines library](https://github.com/adap/flower/tree/main/examples/federated-kaplan-meier-fitter)
- [Sample Level Privacy with Opacus](https://github.com/adap/flower/tree/main/examples/opacus)
- [Sample Level Privacy with TensorFlow-Privacy](https://github.com/adap/flower/tree/main/examples/tensorflow-privacy)
- [Flower with a Tabular Dataset](https://github.com/adap/flower/tree/main/examples/fl-tabular)

## Community

Flower is built by a wonderful community of researchers and engineers. [Join Slack](https://flower.ai/join-slack) to meet them; [contributions](#contributing-to-flower) are welcome.

<a href="https://github.com/adap/flower/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=adap/flower&columns=10" />
</a>

## Citation

If you publish work that uses Flower, please cite Flower as follows:

```bibtex
@article{beutel2020flower,
  title={Flower: A Friendly Federated Learning Research Framework},
  author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Fernandez-Marques, Javier and Gao, Yan and Sani, Lorenzo and Kwing, Hei Li and Parcollet, Titouan and Gusmão, Pedro PB de and Lane, Nicholas D},
  journal={arXiv preprint arXiv:2007.14390},
  year={2020}
}
```

Please also consider adding your publication to the list of Flower-based publications in the docs; just open a Pull Request.

## Contributing to Flower

We welcome contributions. Please see [CONTRIBUTING.md](CONTRIBUTING.md) to get started!

            
