flwr_attacks

Name: flwr_attacks
Version: 0.1.3
Home page: https://github.com/n45os/flwr_attacks
Summary: Flower Attacks: an extension of the Flower framework for Federated Learning. It provides an extension module that allows the user to perform various attacks on the federated learning process.
Author: Nassos Bountioukos Spinaris
Requires Python: >=3.8
Keywords: federated learning, flower, attacks, adversarial attacks, machine learning, privacy
Upload time: 2024-03-01 13:50:59
<p align="center">
    <img src="https://github.com/n45os/flwr_attacks/blob/main/misc/flwr-att-lg.png?raw=true" width="140px" alt="modified flwr logo" />
</p>

# Flower Attacks  


![GitHub tag (latest by date)](https://img.shields.io/github/v/tag/n45os/flwr_attacks?label=version)
[![GitHub license](https://img.shields.io/github/license/adap/flower)](https://github.com/adap/flower/blob/main/LICENSE)


This repository extends the Flower framework, making it possible to create, run, test, and simulate various adversarial threats within Flower. Its structure closely mirrors Flower's strategy module, so implementing an attack should feel familiar to anyone already working with Flower.

## Features
- **Federated Learning Attacks**: Implement and simulate various types of attacks on federated learning processes to assess their resilience and security.

- **Integration with Flower**: Seamlessly integrates with the Flower framework, allowing for easy experimentation and extension.

- **Extensible Design**: Designed to be easily extended with new types of attacks or modifications to existing ones (see the sketch below).
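
As a rough illustration of that extensibility, the snippet below sketches a custom attack as a subclass. The `Attack` base-class name, the `attack` hook, and the `adversary_clients` attribute are assumptions for illustration only; check the package source for the actual extension points.

```python
from flwr_attacks import Attack  # assumed base class; verify against the package


class SignFlipAttack(Attack):
    """Hypothetical attack that flips the sign of adversarial clients' updates."""

    def attack(self, server_round, results):
        # `results` is assumed to be a list of (cid, list_of_ndarrays) pairs.
        poisoned = []
        for cid, layers in results:
            if cid in self.adversary_clients:  # assumed attribute
                layers = [-layer for layer in layers]
            poisoned.append((cid, layers))
        return poisoned
```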

## Installation 
To install `flwr_attacks`, you can use pip: 

```bash 
pip install flwr_attacks
```
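
The package requires Python 3.8 or newer; the PyPI metadata records no pinned requirements, so you will presumably also need `flwr` itself installed in the same environment.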

## Usage

After installation, you can use `flwr_attacks` as part of your federated learning experiments. Here is a basic example of how to integrate it with your Flower-based federated learning setup:



First, split the client ids into adversarial and benign groups (assuming `cfg` is an existing configuration object, used later for the number of rounds):
```python
from flwr_attacks import MinMaxAttack, AttackServer, generate_cids

NUM_CLIENTS = 10  # total number of clients in the experiment

adversary_cids, benign_cids = generate_cids(NUM_CLIENTS, adversary_fraction=0.4)
all_cids = adversary_cids + benign_cids
```
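
With `NUM_CLIENTS = 10` and `adversary_fraction=0.4`, this should yield four adversarial and six benign client ids (the exact split and id format depend on `generate_cids`).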

Initialize the MinMax attack with your configuration:
```python
attack = MinMaxAttack(
    adversary_fraction=0.4,  # 40% of the clients are adversaries
    activation_round=5,  # activate the attack at round 5
    adversary_clients=adversary_cids,
    # By default the attack can access only the adversary clients. Use the
    # adversary_accessed_cids argument to grant access to specific extra clients.
)

strategy = ...  # any Flower strategy, e.g. fl.server.strategy.FedAvg()

# Create the AttackServer with the specified attack and strategy
attack_server = AttackServer(
    strategy=strategy,
    attack=attack,
)
```
Use the attack server exactly as you would a standard Flower server, either in a simulation or as a standalone server.
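
The simulation call below expects a `client_fn` factory, which the snippets above do not define. A minimal, self-contained sketch (the `DummyClient` and its fake parameters are placeholders, not part of `flwr_attacks`):

```python
import flwr as fl
import numpy as np


class DummyClient(fl.client.NumPyClient):
    """Placeholder client with one fake weight array; replace the fit and
    evaluate bodies with real local training and evaluation."""

    def get_parameters(self, config):
        return [np.zeros(10)]

    def fit(self, parameters, config):
        # Real local training would go here; we simply echo the parameters.
        return parameters, 1, {}

    def evaluate(self, parameters, config):
        # Return (loss, num_examples, metrics).
        return 0.0, 1, {}


def client_fn(cid: str):
    # One client instance per client id; adversarial behaviour is injected
    # server-side by the attack, so all clients can share this factory.
    return DummyClient()
```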


Run a simulation:
```python 
history = fl.simulation.start_simulation(
    client_fn=client_fn,
    clients_ids=all_cids,
    config=fl.server.ServerConfig(num_rounds=cfg.num_rounds),
    server=attack_server,
)
```
Or start a standalone server:
```python
fl.server.start_server(
    server=attack_server,
)
```
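
In the standalone case, each client process then connects to the running server separately. A minimal sketch, assuming the server listens on the default port (the address and the `DummyClient` from above are illustrative):

```python
import flwr as fl

# On each client machine (benign and adversarial alike; the attack logic
# runs on the server side), connect to the AttackServer.
fl.client.start_numpy_client(
    server_address="127.0.0.1:8080",
    client=DummyClient(),
)
```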

## Contributing

Contributions to `flwr_attacks` are welcome! If you have a new attack implementation, improvements, or bug fixes, open an issue or a pull request.

## License

`flwr_attacks` is released under the Apache-2.0 License, the same license used by Flower. See the LICENSE file for more details.

## Contact

For any questions or feedback, please contact Nassos Bountioukos Spinaris at nassosbountioukos@gmail.com.

## Acknowledgments

Special thanks to the Flower framework team for providing a solid foundation for federated learning experiments.

            
