<p align="center">
<img width=50% src="https://raw.githubusercontent.com/martineastwood/featuristic/dev/docs/_static/logo.png" alt="Featuristic" />
</p>

<p align="center">
<i>"Because feature engineering should be a science, not an art."</i>
</p>

<div align="center">

  <a href="">[![Python Version](https://img.shields.io/pypi/pyversions/featuristic)](https://pypi.org/project/featuristic/)</a>
  <a href="">[![PyPI](https://img.shields.io/pypi/v/featuristic.svg)](https://pypi.org/project/featuristic/)</a>
  <a href="">[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)</a>
  <a href='https://coveralls.io/github/martineastwood/featuristic?branch=dev'><img src='https://coveralls.io/repos/github/martineastwood/featuristic/badge.svg?branch=dev' alt='Coverage Status' /></a>
  <a href="">[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)</a>
  <a href="">[![Code style: pre-commit](https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white)](https://github.com/pre-commit/pre-commit)</a>

</div>

[Featuristic](https://www.featuristic.co.uk/) uses Genetic Algorithms to automate **feature engineering** and **feature selection**, improving the predictive performance of machine learning models.

See the [documentation](https://www.featuristic.co.uk/) for more detailed information.

## Installation
Install with pip:

```bash
python3 -m pip install featuristic
```

## Understanding Genetic Feature Synthesis

Featuristic uses symbolic regression to intelligently derive interpretable mathematical formulas, which are then used to create new features from your dataset.

Initially, Featuristic creates a diverse population of formulas using fundamental mathematical operators such as `add`, `subtract`, `sin`, `tan`, `square`, `sqrt`, and more.

For instance, a formula generated by Featuristic might look like this: `(square(feature_1) - abs(feature_2)) * feature_3`.
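To make this concrete, here is a minimal sketch of how such a formula can be evaluated as plain numpy operations over dataframe columns (an illustration of the idea only; Featuristic's internal representation is not shown here):

```python
import numpy as np
import pandas as pd

# Toy data standing in for three features of a real dataset.
df = pd.DataFrame({
    "feature_1": [1.0, 2.0, 3.0],
    "feature_2": [-4.0, 5.0, -6.0],
    "feature_3": [7.0, 8.0, 9.0],
})

# Evaluate the example formula (square(feature_1) - abs(feature_2)) * feature_3
# column-wise to produce a candidate new feature.
new_feature = (np.square(df["feature_1"]) - np.abs(df["feature_2"])) * df["feature_3"]
print(new_feature)
```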

Next, Featuristic assesses the importance of these formulas by quantifying how well they correlate with the target variable. Those formulas yielding features with the strongest correlations are then selected and recombined using a genetic algorithm to produce offspring, as illustrated below.

![Symbolic Regression Example](https://raw.githubusercontent.com/martineastwood/featuristic/dev/docs/_static/symbolic_regression_example.png "Symbolic Regression Example")

These offspring may also undergo point mutations, which alter random operators within the formula. This process introduces slight variations to the formulas, enhancing the diversity of the population and potentially leading to the discovery of novel and more effective feature representations.

![Mutation Example](https://raw.githubusercontent.com/martineastwood/featuristic/dev/docs/_static/mutation_example.png "Mutation Example")
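As a rough illustration of the concept (Featuristic's actual program representation may differ), a point mutation can be pictured as swapping an operator node in an expression tree for another of the same arity:

```python
import random

# Toy expression tree for (square(feature_1) - abs(feature_2)) * feature_3,
# written as nested (operator, children...) tuples with column-name leaves.
# This sketches the idea of a point mutation, not Featuristic's internals.
UNARY_OPS = ["square", "sqrt", "abs", "sin", "tan"]

tree = ("multiply", ("subtract", ("square", "feature_1"), ("abs", "feature_2")), "feature_3")

def point_mutate(node, rate=0.5):
    """Return a copy of the tree with unary operators randomly swapped."""
    if isinstance(node, str):  # leaf: a column name
        return node
    op, *children = node
    if op in UNARY_OPS and random.random() < rate:
        op = random.choice(UNARY_OPS)  # swap for another unary operator
    return (op, *(point_mutate(c, rate) for c in children))

print(point_mutate(tree))  # output varies, e.g. 'square' may become 'sin'
```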

This iterative process continues across multiple generations, continually refining the population of formulas with the goal of generating features that exhibit strong correlations with the target variable.
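The selection pressure itself can be sketched as a fitness function that rewards correlation with the target and penalises formula length. This is a simplified illustration, not Featuristic's exact scoring; the `parsimony_coefficient` argument used in the example below is assumed to play this complexity-penalising role:

```python
import numpy as np
import pandas as pd

def fitness(candidate: pd.Series, target: pd.Series, program_length: int,
            parsimony_coefficient: float = 0.035) -> float:
    """Toy fitness: absolute Pearson correlation with the target,
    minus a parsimony penalty that discourages overly long formulas."""
    corr = abs(np.corrcoef(candidate, target)[0, 1])
    return corr - parsimony_coefficient * program_length
```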

## Example

Below is an example of using Featuristic's Genetic Feature Synthesis (GFS) to perform automated feature engineering. We'll start off by downloading the well-known `cars` dataset from the UCI Machine Learning Repository and splitting it into training and testing sets. The training set will be used to train our model, while the testing set will remain unseen during training and will serve as an independent dataset to evaluate the model's performance.

```python
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import mean_absolute_error
import featuristic as ft
import numpy as np

np.random.seed(8888)

X, y = ft.fetch_cars_dataset()

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)
```

Next, we'll initiate the Genetic Feature Synthesis process. We've configured the genetic algorithm to synthesize 5 new features for us, evolving a population of 200 individuals over up to 100 generations. To avoid wasting computation, the genetic algorithm halts early if it fails to improve upon the best feature identified within 25 generations. Additionally, setting `n_jobs` to -1 enables concurrent execution across all available CPUs on our computer.

```python
synth = ft.GeneticFeatureSynthesis(
    num_features=5,
    population_size=200,
    max_generations=100,
    early_termination_iters=25,
    parsimony_coefficient=0.035,
    n_jobs=-1,
)

synth.fit(X_train, y_train)
```

We can call the `transform` function to generate a dataframe containing our new features. By default, the `GeneticFeatureSynthesis` class returns both the original features and the newly synthesised features. However, we can return just the new features by setting the `return_all_features` argument to `False` when creating the class. We can also combine the `fit` and `transform` steps into one by calling `fit_transform` instead, as shown after the table below.

```python
generated_features = synth.transform(X_train)

print(generated_features.head())
```

|    |   displacement |   cylinders |   horsepower |   weight |   acceleration |   model_year |   origin |   feature_0 |   feature_4 |   feature_11 |   feature_1 |   feature_22 |
|---:|---------------:|------------:|-------------:|---------:|---------------:|-------------:|---------:|------------:|------------:|-------------:|------------:|-------------:|
|  0 |             89 |           4 |           62 |     2050 |           17.3 |           81 |        3 |    -8571.63 |   -0.312535 |     -96.7449 |   -105.823  |    -0.624987 |
|  1 |            318 |           8 |          150 |     4077 |           14   |           72 |        1 |    -2488.32 |   -0.786564 |     -75.1698 |    -34.56   |    -1.57302  |
|  2 |            383 |           8 |          170 |     3563 |           10   |           70 |        1 |    -2017.65 |   -0.727317 |     -71.8277 |    -28.8235 |    -1.45446  |
|  3 |            260 |           8 |          110 |     4060 |           19   |           77 |        1 |    -4150.3  |   -0.684937 |     -82.6269 |    -53.9    |    -1.36971  |
|  4 |            318 |           8 |          140 |     4080 |           13.7 |           78 |        1 |    -3389.66 |   -0.670713 |     -81.3604 |    -43.4571 |    -1.34132  |
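Continuing the example above, the `fit` and `transform` steps can be combined, and the original columns dropped, using the options described earlier:

```python
# One-step alternative: fit and transform together, keeping only the
# newly synthesised features via return_all_features=False.
synth = ft.GeneticFeatureSynthesis(
    num_features=5,
    population_size=200,
    max_generations=100,
    early_termination_iters=25,
    parsimony_coefficient=0.035,
    n_jobs=-1,
    return_all_features=False,
)
new_features_only = synth.fit_transform(X_train, y_train)
```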

Our newly engineered features currently have generic names. However, since Featuristic synthesizes these features by applying mathematical expressions to the data, we can look at the underlying formula responsible for each feature's creation.

```python
info = synth.get_feature_info()
print(info["formula"].iloc[0])
```

```
-(abs((cube(model_year) / horsepower)))
```
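As a quick sanity check, plugging the first row of the table above (`model_year` = 81, `horsepower` = 62) into this formula reproduces `feature_0`:

```python
# cube(81) / 62 = 531441 / 62, negated and absolute-valued per the formula
print(-(abs(81 ** 3 / 62)))  # -8571.629032258064, matching feature_0 above
```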

Following the synthesis of our new features, we can now use another genetic algorithm for [feature selection](https://en.wikipedia.org/wiki/Feature_selection). This process sifts through all our features to identify the subset that optimally contributes to predictive performance while minimizing redundancy.

To do this, we define a custom objective function that the Genetic Feature Selection algorithm will use to quantify how well each subset of features predicts the target. Note that the function should return a value to be minimized, so a smaller value is better. If you want to maximize a metric, multiply the output of your `objective_function` by -1, as shown in the example below.

```python
def objective_function(X, y):
    model = LinearRegression()
    scores = cross_val_score(model, X, y, cv=3, scoring="neg_mean_absolute_error")
    return scores.mean() * -1
```

Next, we set up the Genetic Feature Selector, configured to evolve a population of 200 individuals over up to 100 generations. As before, the genetic algorithm halts early if it fails to improve upon the best feature set identified within 25 generations, and setting `n_jobs` to -1 enables concurrent execution across all available CPUs.

```python
selector = ft.GeneticFeatureSelector(
    objective_function,
    population_size=200,
    max_generations=100,
    early_termination_iters=25,
    n_jobs=-1,
)

selector.fit(generated_features, y_train)

selected_features = selector.transform(generated_features)
```

Let's print out the selected features to see what the Genetic Feature Selection algorithm kept. You can see below that Featuristic has kept four of the original features ("weight", "acceleration", "model_year" and "origin") plus four of the features created via the Genetic Feature Synthesis.

```python
print(selected_features.head())
```

|    |   weight |   acceleration |   model_year |   origin |   feature_0 |   feature_4 |   feature_11 |   feature_1 |
|---:|---------:|---------------:|-------------:|---------:|------------:|------------:|-------------:|------------:|
|  0 |     2050 |           17.3 |           81 |        3 |    -8571.63 |   -0.312535 |     -96.7449 |   -105.823  |
|  1 |     4077 |           14   |           72 |        1 |    -2488.32 |   -0.786564 |     -75.1698 |    -34.56   |
|  2 |     3563 |           10   |           70 |        1 |    -2017.65 |   -0.727317 |     -71.8277 |    -28.8235 |
|  3 |     4060 |           19   |           77 |        1 |    -4150.3  |   -0.684937 |     -82.6269 |    -53.9    |
|  4 |     4080 |           13.7 |           78 |        1 |    -3389.66 |   -0.670713 |     -81.3604 |    -43.4571 |

Now that we've selected our features, let's see whether they actually improve our model's predictive performance on the test dataset. We'll start off with the original features as a baseline.

```python
model = LinearRegression()
model.fit(X_train, y_train)
preds = model.predict(X_test)
original_mae = mean_absolute_error(y_test, preds)
print(original_mae)
```

```
2.5888868138669303
```

And now, let's see how the model performs with our synthesised feature set.

```python
model = LinearRegression()
model.fit(selected_features, y_train)
test_features = selector.transform(synth.transform(X_test))
preds = model.predict(test_features)
featuristic_mae = mean_absolute_error(y_test, preds)
print(featuristic_mae)
```

```
1.9497667311649802
```

```python
print(f"Original MAE: {original_mae}")
print(f"Featuristic MAE: {featuristic_mae}")
print(f"Improvement: {round((1 - (featuristic_mae / original_mae))* 100, 1)}%")
```

```
Original MAE: 2.5888868138669303
Featuristic MAE: 1.9497667311649802
Improvement: 24.7%
```

The new features synthesised by the Genetic Feature Synthesis and refined by the Genetic Feature Selection have successfully reduced our mean absolute error 😀
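Finally, since both classes expose `fit` and `transform` methods, the synthesis, selection and modelling steps could in principle be chained in a scikit-learn `Pipeline`. This is a hedged sketch only: it assumes both transformers are fully compatible with scikit-learn's Pipeline API, which the steps above suggest but do not guarantee:

```python
from sklearn.pipeline import Pipeline

# Assumption: GeneticFeatureSynthesis and GeneticFeatureSelector follow the
# scikit-learn transformer contract closely enough to be used in a Pipeline.
pipeline = Pipeline([
    ("synthesis", ft.GeneticFeatureSynthesis(
        num_features=5, population_size=200, max_generations=100,
        early_termination_iters=25, parsimony_coefficient=0.035, n_jobs=-1)),
    ("selection", ft.GeneticFeatureSelector(
        objective_function, population_size=200, max_generations=100,
        early_termination_iters=25, n_jobs=-1)),
    ("model", LinearRegression()),
])

pipeline.fit(X_train, y_train)
print(mean_absolute_error(y_test, pipeline.predict(X_test)))
```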

            
