flextrees

Name: flextrees
Version: 0.1.0
Upload time: 2024-03-14 07:15:58
Author: Alberto Argente-Garrido
Keywords: decision-trees, explainable, FL, federated-learning, flexible
# flex-trees

The flex-trees package provides a set of tools and utilities for working with Decision Tree (DT) models in Federated Learning (FL). It is designed to be used with the [FLEXible](https://github.com/FLEXible-FL/FLEXible/) framework, of which it is an extension.

flex-trees comes with some state-of-the-art decision tree models for federated learning. It also provides multiple tabular datasets to test the models.

The methods implemented in the repository are:
| `Model`            | `Description`      | `Citation`              |
| :----------------- | :------------------ | :------------------- |
| Federated ID3 | The ID3 model adapted to a federated learning scenario. | [A Hybrid Approach to Privacy-Preserving Federated Learning](https://arxiv.org/pdf/1812.03224.pdf) |
| Federated Random Forest | The Random Forest (RF) model adapted to a federated learning scenario. Each client builds an RF locally; then `N` trees are randomly sampled from each client to form a global RF composed of the trees retrieved from the clients (a minimal aggregation sketch follows the table). | [Federated Random Forests can improve local performance of predictive models for various healthcare applications](https://pubmed.ncbi.nlm.nih.gov/35139148/) |
| Federated Gradient Boosting Decision Trees | The Gradient Boosting Decision Trees model adapted to a federated learning scenario. A global hash table is first created to align the data between the clients without sharing it. After that, `N` trees (CART) are built by the clients. The ensemble is built iteratively: one client builds a tree, the tree is added to the ensemble, and the instance weights are updated so that the next client can build the following tree with the updated weights. | [Practical Federated Gradient Boosting Decision Trees](https://arxiv.org/abs/1911.04206) |
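
To make the Federated Random Forest aggregation above concrete, here is a minimal, self-contained sketch of the idea using plain scikit-learn. It is not the flextrees API: the clients, the local forests, the choice of `N`, and the majority vote are all simulated in-process for illustration only.

```
# Illustrative sketch (not the flextrees API): Federated Random Forest aggregation.
# Each simulated client fits a local forest; the "server" samples N trees from
# every client and predicts with the pooled trees by majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
client_rows = np.array_split(rng.permutation(len(X)), 3)  # 3 simulated clients

# Local training: each client builds a Random Forest on its own partition.
local_forests = [
    RandomForestClassifier(n_estimators=50, random_state=i).fit(X[idx], y[idx])
    for i, idx in enumerate(client_rows)
]

# Aggregation: the server randomly samples N trees from each client's forest.
N = 10
global_trees = []
for forest in local_forests:
    picked = rng.choice(len(forest.estimators_), size=N, replace=False)
    global_trees.extend(forest.estimators_[j] for j in picked)

# Global prediction: majority vote over the pooled trees.
votes = np.stack([tree.predict(X) for tree in global_trees])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("global forest accuracy:", (y_pred == y).mean())
```

In the package itself these steps run on top of the FLEXible framework's federated primitives rather than in a single process as above.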

The tabular datasets available in the repository are:
| `Dataset`            | `Description`      | `Citation`              |
| :----------------- | :------------------ | :------------------- |
| Adult | Census records with demographic attributes; the task is to predict whether a person's income exceeds 50K (this dataset is used in the partitioning sketch after the table). | [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/datasets/adult) |
| Breast Cancer | The Breast Cancer Wisconsin (Diagnostic) dataset, with features computed from images of breast masses; the task is to predict whether a tumour is benign or malignant. | [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/datasets/Breast+Cancer+Wisconsin+%28Diagnostic%29) |
| Credit Card | Credit card transaction records; the task is to predict whether a transaction is fraudulent. | [Kaggle](https://www.kaggle.com/mlg-ulb/creditcardfraud) |
| ILPD | Patient records from the Indian Liver Patient Dataset; the task is to predict whether a patient has liver disease. | [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/datasets/ILPD+%28Indian+Liver+Patient+Dataset%29) |
| Nursery | Nursery school application records; the task is to predict the acceptability of an application. | [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/datasets/nursery) |
| Bank Marketing | Data from a bank's direct marketing campaigns; the task is to predict whether a client will subscribe to a term deposit. | [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/datasets/bank+marketing) |
| Magic Gamma | Monte Carlo-simulated measurements from the MAGIC gamma telescope; the task is to classify events as gamma signal or hadronic background. | [UCI Machine Learning Repository](https://archive.ics.uci.edu/ml/datasets/magic+gamma+telescope) |
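
The loaders shipped with flex-trees handle the federated partitioning of these datasets for you. Purely as an illustration of the kind of split involved, the sketch below downloads scikit-learn's OpenML copy of Adult and spreads it over a handful of simulated clients; the client count and the IID split are arbitrary choices, not the package's defaults.

```
# Illustrative sketch only: split a public tabular dataset across simulated
# clients. Uses scikit-learn's OpenML copy of Adult, not the loaders bundled
# with flex-trees; the number of clients and the IID split are arbitrary.
import numpy as np
from sklearn.datasets import fetch_openml

adult = fetch_openml("adult", version=2, as_frame=True)
X, y = adult.data, (adult.target == ">50K").astype(int)

n_clients = 5
rng = np.random.default_rng(42)
client_rows = np.array_split(rng.permutation(len(X)), n_clients)

for client_id, idx in enumerate(client_rows):
    X_c, y_c = X.iloc[idx], y.iloc[idx]
    print(f"client {client_id}: {len(idx)} rows, positive rate {y_c.mean():.2f}")
```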

## Tutorials

To get started with flex-trees, you can check the [notebooks](https://github.com/FLEXible-FL/flex-trees/tree/main/notebooks) available in the repository. They cover the following topics:

- Federated ID3 with FLEXible.
- [Federated Random Forest with FLEXible](https://github.com/FLEXible-FL/flex-trees/blob/main/notebooks/Federated%20Random%20Forest%20with%20FLEX.ipynb).
- [Practical Federated Gradient Boosting Decision Trees with FLEXible](https://github.com/FLEXible-FL/flex-trees/blob/main/notebooks/Federated%20Gradient%20Boosting%20Decision%20Trees%20with%20FLEX.ipynb).

## Installation

We recommend Anaconda/Miniconda as the package manager. The following table shows the corresponding `flex` and `flex-trees` versions together with the supported Python versions.

| `flex`            | `flex-trees`      | Python              |
| :------------------: | :------------------: | :-------------------: |
| `main` / `nightly` | `main` / `nightly` | `>=3.8`, `<=3.11`   |
| `v0.6.0`           | `v0.1.0`           | `>=3.8`, `<=3.11`    |

To install the package, you can use the following commands:

Using pip:
```
pip install flextrees
```

Or clone the repository and install it locally:
```
git clone git@github.com:FLEXible-FL/flex-trees.git
cd flex-trees
pip install -e .
```


## Citation

If you use this package, please cite the following paper:

``` TODO: Add citation ```

            
