chefboost

Name: chefboost
Version: 0.0.18
Home page: https://github.com/serengil/chefboost
Summary: Lightweight Decision Tree Framework Supporting GBM, Random Forest and Adaboost
Author: Sefik Ilkin Serengil
Requires Python: >=3.6
Upload time: 2024-06-08 21:33:52
# 👨‍🍳 ChefBoost

<div align="center">

[![Downloads](https://pepy.tech/badge/chefboost)](https://pepy.tech/project/chefboost)
[![Stars](https://img.shields.io/github/stars/serengil/chefboost?color=yellow)](https://github.com/serengil/chefboost)
[![License](http://img.shields.io/:license-MIT-green.svg?style=flat)](https://github.com/serengil/chefboost/blob/master/LICENSE)
[![Tests](https://github.com/serengil/chefboost/actions/workflows/tests.yml/badge.svg)](https://github.com/serengil/chefboost/actions/workflows/tests.yml)
[![DOI](http://img.shields.io/:DOI-10.5281/zenodo.5576203-blue.svg?style=flat)](https://doi.org/10.5281/zenodo.5576203)

[![Blog](https://img.shields.io/:blog-sefiks.com-blue.svg?style=flat&logo=wordpress)](https://sefiks.com)
[![YouTube](https://img.shields.io/:youtube-@sefiks-red.svg?style=flat&logo=youtube)](https://www.youtube.com/@sefiks?sub_confirmation=1)
[![Twitter](https://img.shields.io/:follow-@serengil-blue.svg?style=flat&logo=twitter)](https://twitter.com/intent/user?screen_name=serengil)
[![Support me on Patreon](https://img.shields.io/endpoint.svg?url=https%3A%2F%2Fshieldsio-patreon.vercel.app%2Fapi%3Fusername%3Dserengil%26type%3Dpatrons&style=flat)](https://www.patreon.com/serengil?repo=chefboost)
[![GitHub Sponsors](https://img.shields.io/github/sponsors/serengil?logo=GitHub&color=lightgray)](https://github.com/sponsors/serengil)

</div>

**ChefBoost** is a lightweight decision tree framework for Python **with categorical feature support**. It covers regular decision tree algorithms: [ID3](https://sefiks.com/2017/11/20/a-step-by-step-id3-decision-tree-example/), [C4.5](https://sefiks.com/2018/05/13/a-step-by-step-c4-5-decision-tree-example/), [CART](https://sefiks.com/2018/08/27/a-step-by-step-cart-decision-tree-example/), [CHAID](https://sefiks.com/2020/03/18/a-step-by-step-chaid-decision-tree-example/) and [regression tree](https://sefiks.com/2018/08/28/a-step-by-step-regression-decision-tree-example/); also some advanced techniques: [gradient boosting](https://sefiks.com/2018/10/04/a-step-by-step-gradient-boosting-decision-tree-example/), [random forest](https://sefiks.com/2017/11/19/how-random-forests-can-keep-you-from-decision-tree/) and [adaboost](https://sefiks.com/2018/11/02/a-step-by-step-adaboost-example/). You just need to write **a few lines of code** to build decision trees with ChefBoost.

**Installation** - [`Demo`](https://youtu.be/YYF993HTHf8)

The easiest way to install the ChefBoost framework is from [PyPI](https://pypi.org/project/chefboost). This installs the library itself and its prerequisites as well.

```shell
pip install chefboost
```

Then, you will be able to import the library and use its functionalities.

```python
from chefboost import Chefboost as chef
```

**Usage** - [`Demo`](https://youtu.be/Z93qE5eb6eg)

Basically, you just need to pass the dataset as a pandas data frame, plus the optional tree configuration, as illustrated below.

```python
import pandas as pd

df = pd.read_csv("dataset/golf.txt")
config = {'algorithm': 'C4.5'}
model = chef.fit(df, config = config, target_label = 'Decision')
```

**Pre-processing**

ChefBoost handles both numeric and nominal features and target values, in contrast to its alternatives. So, you don't have to apply any pre-processing to build trees.
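
For instance, a frame mixing nominal and numeric columns can be passed as-is. A minimal sketch, assuming golf-style column names:

```python
import pandas as pd
from chefboost import Chefboost as chef

# nominal (Outlook, Wind) and numeric (Temperature, Humidity) columns side by side,
# with no encoding or scaling applied beforehand
df = pd.DataFrame({
    "Outlook": ["Sunny", "Rain", "Overcast", "Rain"],
    "Temperature": [85, 70, 83, 68],
    "Humidity": [85, 96, 78, 80],
    "Wind": ["Weak", "Strong", "Weak", "Weak"],
    "Decision": ["No", "Yes", "Yes", "Yes"],
})

model = chef.fit(df, config={"algorithm": "C4.5"}, target_label="Decision")
```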

**Outcomes**

Built decision trees are stored as Python if statements in the `tests/outputs/rules` directory. A sample of decision rules is shown below.

```python
def findDecision(Outlook, Temperature, Humidity, Wind):
   if Outlook == 'Rain':
      if Wind == 'Weak':
         return 'Yes'
      elif Wind == 'Strong':
         return 'No'
      else:
         return 'No'
   elif Outlook == 'Sunny':
      if Humidity == 'High':
         return 'No'
      elif Humidity == 'Normal':
         return 'Yes'
      else:
         return 'Yes'
   elif Outlook == 'Overcast':
      return 'Yes'
   else:
      return 'Yes'
```

**Testing for custom instances**

Decision rules will be stored in the `outputs/rules/` folder when you build decision trees. You can run the built decision tree on new instances as illustrated below.

```python
prediction = chef.predict(model, param = ['Sunny', 'Hot', 'High', 'Weak'])
```
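
If you need predictions for many instances, you can loop over the feature rows of a data frame; this is just a sketch, assuming the same column order used during training:

```python
# hypothetical batch prediction over a pandas frame holding the feature columns
features = df.drop(columns=["Decision"])
predictions = [chef.predict(model, param=row.tolist()) for _, row in features.iterrows()]
```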

You can also consume built decision trees directly. In this way, you can restore already built decision trees and skip the learning steps, or apply [transfer learning](https://youtu.be/9hX8ir7_ZtA). Loaded trees offer a `findDecision` method for testing new instances.

```python
module_name = "outputs/rules/rules" #this will load outputs/rules/rules.py
tree = chef.restoreTree(module_name)
prediction = tree.findDecision(['Sunny', 'Hot', 'High', 'Weak'])
```

`tests/global-unit-test.py` will guide you through building different decision trees and making predictions.

**Model save and restoration**

You can save your trained models. This makes your model ready for transfer learning.

```python
chef.save_model(model, "model.pkl")
```

In this way, you can use the same model later to just make predictions and skip the training steps. Restoration requires the .py and .pkl files to be stored under `outputs/rules`.

```python
model = chef.load_model("model.pkl")
prediction = chef.predict(model, ['Sunny',85,85,'Weak'])
```

### Sample configurations

ChefBoost supports several decision tree, bagging and boosting algorithms. You just need to pass the configuration to use different algorithms.

**Regular Decision Trees**

Regular decision tree algorithms find the best feature and the best split point that maximize the information gain. They build decision trees recursively over the child nodes.

```python
config = {'algorithm': 'C4.5'} #Set algorithm to ID3, C4.5, CART, CHAID or Regression
model = chef.fit(df, config)
```

The following regular decision tree algorithms are wrapped in the library.

| Algorithm  | Metric | Tutorial | Demo |
| ---        | --- | ---      | ---  |
| ID3        | Entropy, Information Gain |[`Tutorial`](https://sefiks.com/2017/11/20/a-step-by-step-id3-decision-tree-example/) | [`Demo`](https://youtu.be/Z93qE5eb6eg) |
| C4.5       | Entropy, Gain Ratio | [`Tutorial`](https://sefiks.com/2018/05/13/a-step-by-step-c4-5-decision-tree-example/) | [`Demo`](https://youtu.be/kjhQHmtDaAA) |
| CART       | GINI | [`Tutorial`](https://sefiks.com/2018/08/27/a-step-by-step-cart-decision-tree-example/) | [`Demo`](https://youtu.be/CSApBetgukM) |
| CHAID      | Chi Square | [`Tutorial`](https://sefiks.com/2020/03/18/a-step-by-step-chaid-decision-tree-example/) | [`Demo`](https://youtu.be/dcnFuS4QILg) |
| Regression | Standard Deviation | [`Tutorial`](https://sefiks.com/2018/08/28/a-step-by-step-regression-decision-tree-example/) | [`Demo`](https://youtu.be/pCQ2RCa20Bg) |
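
To make the metric column concrete, the sketch below computes entropy and the information gain of a nominal split by hand (ID3's criterion). It is illustrative only and independent of ChefBoost's internals.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())

def information_gain(labels, feature_values):
    """Entropy reduction when splitting labels by a nominal feature."""
    total = len(labels)
    groups = {}
    for value, label in zip(feature_values, labels):
        groups.setdefault(value, []).append(label)
    weighted = sum(len(g) / total * entropy(g) for g in groups.values())
    return entropy(labels) - weighted

# golf-style example: how much does splitting on Wind reduce entropy of Decision?
decision = ["No", "No", "Yes", "Yes", "Yes", "No"]
wind = ["Weak", "Strong", "Weak", "Weak", "Weak", "Strong"]
print(information_gain(decision, wind))  # ~0.459 bits
```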

**Gradient Boosting** [`Tutorial`](https://sefiks.com/2018/10/04/a-step-by-step-gradient-boosting-decision-tree-example/), [`Demo`](https://youtu.be/KFsnZKMKNAE)

Gradient boosting builds a tree, then builds another one on the previous tree's error. In this way, it boosts the results. Predictions are the sum of each tree's prediction.

```python
config = {'enableGBM': True, 'epochs': 7, 'learning_rate': 1, 'max_depth': 5}
```
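
The additive idea can be sketched without ChefBoost at all: each round fits a weak learner (a one-split regression stump here, standing in for a fitted tree) to the residuals of the running prediction, and adds it in, scaled by the learning rate.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([10.0, 12.0, 30.0, 35.0])

def fit_stump(x, residuals):
    """Pick the threshold on x that best splits residuals; return a predictor."""
    best = None
    for t in (x[:-1] + x[1:]) / 2:
        left, right = residuals[x <= t].mean(), residuals[x > t].mean()
        sse = ((residuals[x <= t] - left) ** 2).sum() + ((residuals[x > t] - right) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left, right)
    _, t, left, right = best
    return lambda q: np.where(q <= t, left, right)

prediction = np.zeros_like(y)
learning_rate = 1.0
for epoch in range(7):
    stump = fit_stump(x, y - prediction)    # fit to the current residuals
    prediction += learning_rate * stump(x)  # prediction is the sum over rounds
```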

**Random Forest** [`Tutorial`](https://sefiks.com/2017/11/19/how-random-forests-can-keep-you-from-decision-tree/), [`Demo`](https://youtu.be/J7hDtV261PQ)

Random forest splits the data set into several sub data sets and builds a different tree for each sub data set. Predictions are the average of each tree's prediction.

```python
config = {'enableRandomForest': True, 'num_of_trees': 5}
```
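
The split-and-average idea in miniature; in this sketch a plain mean predictor stands in for each fitted tree.

```python
import numpy as np

y = np.array([10.0, 12.0, 30.0, 35.0, 28.0, 33.0])
num_of_trees = 3

# split the data set into sub data sets and "train" one model per subset
subsets = np.array_split(y, num_of_trees)
tree_predictions = [subset.mean() for subset in subsets]  # stand-ins for fitted trees

prediction = np.mean(tree_predictions)  # the forest averages its trees' predictions
```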

**Adaboost** [`Tutorial`](https://sefiks.com/2018/11/02/a-step-by-step-adaboost-example/), [`Demo`](https://youtu.be/Obj208F6e7k)

Adaboost applies decision stumps instead of decision trees. A stump is a weak classifier that only needs to score a little better than 50%. Adaboost then increases the weights of misclassified instances and decreases the weights of correctly classified ones. In this way, it reaches a high overall score by combining weak classifiers.

```python
config = {'enableAdaboost': True, 'num_of_weak_classifier': 4}
```
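
The re-weighting step can be sketched in a few lines of numpy. Labels and stump outputs live in {-1, +1} here; that encoding is an assumption of this illustration, not ChefBoost's internal representation.

```python
import numpy as np

y = np.array([1, -1, 1, 1, -1])        # true labels
pred = np.array([1, -1, -1, 1, -1])    # one decision stump's output; one mistake

weights = np.full(len(y), 1 / len(y))  # start from uniform instance weights
err = weights[pred != y].sum()         # weighted error of the stump
alpha = 0.5 * np.log((1 - err) / err)  # the stump's vote strength

# misclassified instances get heavier, correct ones lighter, then renormalize
weights *= np.exp(-alpha * y * pred)
weights /= weights.sum()
```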

**Feature Importance** - [`Demo`](https://youtu.be/NFLQT6Ta4-k)

Decision trees are naturally interpretable and explainable algorithms: a decision made by a single tree is clear. Still, we need some extra layers to understand the built models, and random forest and GBM in particular are hard to explain. Herein, [feature importance](https://sefiks.com/2020/04/06/feature-importance-in-decision-trees/) is one of the most common ways to see the big picture and understand built models.

```python
df = chef.feature_importance("outputs/rules/rules.py")
```

| feature     | final_importance |
| ---         | ---              |
| Humidity    | 0.3688           |
| Wind        | 0.3688           |
| Outlook     | 0.2624           |
| Temperature | 0.0000           |
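
The returned object is an ordinary pandas data frame, so the usual pandas operations apply; for instance:

```python
# rank features from most to least important
df = chef.feature_importance("outputs/rules/rules.py")
print(df.sort_values(by="final_importance", ascending=False))
```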

### Parallelism

ChefBoost offers parallelism to speed up model building: branches of a decision tree are created in parallel. Parallelism is enabled by default; set the `enableParallelism` argument to False in the configuration if you don't want to use it. When parallelism is enabled, ChefBoost allocates half of the total number of cores in your environment.

```python
if __name__ == '__main__':
   config = {'algorithm': 'C4.5', 'enableParallelism': True, 'num_cores': 2}
   model = chef.fit(df, config)
```

Notice that you have to place the training step inside an `if __name__ == '__main__':` block when parallelism is enabled.

To disable parallelism, set the parameter to False.

```python
config = {'algorithm': 'C4.5', 'enableParallelism': False}
model = chef.fit(df, config)
```

## Contribution [![Tests](https://github.com/serengil/chefboost/actions/workflows/tests.yml/badge.svg)](https://github.com/serengil/chefboost/actions/workflows/tests.yml)

Pull requests are more than welcome! You should run the unit tests and linting locally with the `make test` and `make lint` commands before creating a PR. Once a PR is created, the GitHub test workflow runs automatically, and unit test results become available in [GitHub Actions](https://github.com/serengil/chefboost/actions) before approval.


### Support

There are many ways to support a project - starring⭐️ the GitHub repos is just one 🙏

You can also support this work on [Patreon](https://www.patreon.com/serengil?repo=chefboost)

<a href="https://www.patreon.com/serengil?repo=chefboost">
<img src="https://raw.githubusercontent.com/serengil/chefboost/master/icon/patreon.png" width="30%" height="30%">
</a>

### Citation

Please cite [ChefBoost](https://doi.org/10.5281/zenodo.5576203) in your publications if it helps your research. Here is an example BibTeX entry:

```BibTeX
@misc{serengil2021chefboost,
  author       = {Serengil, Sefik Ilkin},
  title        = {ChefBoost: A Lightweight Boosted Decision Tree Framework},
  month        = oct,
  year         = 2021,
  publisher    = {Zenodo},
  doi          = {10.5281/zenodo.5576203},
  howpublished = {https://doi.org/10.5281/zenodo.5576203}
}
```

Also, if you use chefboost in your GitHub projects, please add chefboost to your requirements.txt.

### License

ChefBoost is licensed under the MIT License - see [`LICENSE`](https://github.com/serengil/chefboost/blob/master/LICENSE) for more details.



            
