<p align="center">
<img width="800" src="https://github.com/conect2ai/Conect2Py-Package/assets/56210040/60055d32-77f0-4381-bfc1-c9300eb30920" />
</p>
# Conect2Ai - TAC Python package
Conect2Py-Package is the Conect2ai Python software package. It contains the implementation of TAC (Tiny Anomaly Compression), an algorithm for data compression based on the concept of data eccentricity. TAC does not require previously established mathematical models or any assumptions about the underlying data distribution. Additionally, it uses recursive equations, which enables efficient computation with low memory and processing requirements.
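For intuition only, the eccentricity idea can be sketched with recursive mean and variance updates in the style of TEDA. This is a simplified illustration of the concept, not the package's implementation; the threshold rule and the absence of a sliding window are simplifying assumptions:
```Python
# Illustrative sketch of eccentricity-based sample selection (not the package's code).
# Mean and variance are updated recursively, so no past samples need to be stored.
def eccentricity_sketch(samples, m=1.0):
    kept = []
    mean, var, k = 0.0, 0.0, 0
    for x in samples:
        k += 1
        if k == 1:
            mean, var = float(x), 0.0
            kept.append(x)                                    # always keep the first sample
            continue
        mean = (k - 1) / k * mean + x / k                     # recursive mean
        var = (k - 1) / k * var + (x - mean) ** 2 / (k - 1)   # recursive variance
        if var == 0.0:
            continue                                          # constant signal so far
        ecc = 1.0 / k + (mean - x) ** 2 / (k * var)           # TEDA eccentricity
        if ecc / 2.0 > (m ** 2 + 1.0) / (2.0 * k):            # m-sigma style outlier test
            kept.append(x)                                    # keep only "anomalous" samples
    return kept
```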
<p align="right">
<img alt="version" src="https://img.shields.io/badge/version-0.1.1-blue">
</p>
- [Installation](#installation)
- [Usage](#usage-example)
- [References](#literature-reference)
- [License](#license)
---
#### Dependencies
```bash
Pandas, Numpy, Matplotlib, Seaborn, Scikit-learn, Ipython
```
---
## Installation
#### You can install our package from the PyPI repository using the following command:
```bash
pip install conect2py
```
#### If you want to install it locally, download the wheel distribution from [Build Distribution](https://pypi.org/project/conect2py/0.1.1/#files).
*Then navigate to the folder where you downloaded the file and run the following command:*
```bash
pip install conect2py-0.1.1-py3-none-any.whl
```
---
## Usage Example
This usage example can be run in a Colab notebook here
[![Jupyter](https://img.shields.io/badge/-Notebook-191A1B?style=flat-square&logo=jupyter)](https://colab.research.google.com/drive/1tIjo1skrBGJhSckNrAQos3K7pX3pka-s?usp=sharing)
To begin, you can import `conect2py` using
```Python
# FULL PACKAGE
import conect2py
```
Or import each of the implemented functionalities individually
```Python
# MODEL FUNCTIONS
from conect2py.models import TAC
from conect2py.models import AutoTAC
# RUN FUNCTIONS
from conect2py.run.single import print_run_details, run_single_online, run_single_offline
from conect2py.run.multiple import run_multiple_instances, get_optimal_params, display_multirun_optimal_values, run_optimal_combination
# UTILS FUNCTIONS
from conect2py.utils.format_save import create_param_combinations, create_compressor_list, create_eval_df
from conect2py.utils.metrics import get_compression_report, print_compression_report, calc_statistics
from conect2py.utils.plots import plot_curve_comparison, plot_dist_comparison, plot_multirun_metric_results
```
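The examples below operate on a pandas DataFrame named `dataframe` with a numeric `sensor_data` column. A minimal, hypothetical setup (the CSV file name and column are placeholders for your own data):
```Python
import pandas as pd

# Placeholder: load your own time series into a DataFrame with a 'sensor_data' column
dataframe = pd.read_csv('sensor_readings.csv')
dataframe['sensor_data'] = pd.to_numeric(dataframe['sensor_data'], errors='coerce')
```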
### *Running Multiple tests with TAC*
- Setting up the initial variables
```Python
import numpy as np

model_name = 'TAC_Compression'

params = {
    'window_size': np.arange(2, 30, 1),
    'm': np.round(np.arange(0.1, 2.1, 0.1), 2),
}

param_combination = create_param_combinations(params)
compressor_list = create_compressor_list(param_combination)
```
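Since `window_size` takes 28 values and `m` takes 20, this grid should yield 28 × 20 = 560 parameter combinations. A quick sanity check, assuming `create_param_combinations` returns a flat list of `(window_size, m)` tuples:
```Python
print(len(param_combination))    # expected: 560 combinations
print(param_combination[:3])     # first few (window_size, m) pairs
```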
- Once you have created the list of compressors, you can run
```Python
result_df = run_multiple_instances(compressor_list=compressor_list,
param_list=param_combination,
series_to_compress=dataframe['sensor_data'].dropna(),
cf_score_beta=2
)
```
- This function returns a pandas DataFrame containing the results for every parameter combination. You can expect something like:
| | param | reduction_rate | reduction_factor | mse | rmse | nrmse | mae | psnr | ncc | cf_score |
| - | --------- | -------------- | ---------------- | ------- | ------ | ------- | ------- | ------- | ------ | --------- |
| 0 | (2, 0.1) | 0.4507 | 1.8204 | 0.0648 | 0.2545 | 0.0609 | 0.0127 | 39.9824 | 0.9982 | 0.8031 |
| 1 | (2, 0.2) | 0.4507 | 1.8204 | 0.0648 | 0.2545 | 0.0609 | 0.0127 | 39.9823 | 0.9982 | 0.8031 |
| 2 | (2, 0.3) | 0.4507 | 1.8204 | 0.0648 | 0.2545 | 0.0609 | 0.0127 | 39.9823 | 0.9982 | 0.8031 |
| 3 | (2, 0.4) | 0.4508 | 1.8209 | 0.0648 | 0.2545 | 0.0609 | 0.0127 | 39.9824 | 0.9982 | 0.8032 |
| 4 | (2, 0.5) | 0.4511 | 1.8217 | 0.0648 | 0.2545 | 0.0609 | 0.0128 | 39.9823 | 0.9982 | 0.8033 |
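Since `result_df` is a plain pandas DataFrame, you can also inspect it with standard pandas operations; for example, ranking the combinations by `cf_score`:
```Python
# Show the five best-scoring parameter combinations (standard pandas, not a package function)
print(result_df.sort_values('cf_score', ascending=False).head())
```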
- You can also check the optimal combination by running the following code:
```Python
display_multirun_optimal_values(result_df=result_df)
```
> Parameter combinations for MAX CF_SCORE

|     | param     | reduction_rate | reduction_factor | mse    | rmse   | nrmse  | mae    | psnr    | ncc    | cf_score |
| --- | --------- | -------------- | ---------------- | ------ | ------ | ------ | ------ | ------- | ------ | -------- |
| 440 | (24, 0.1) | 0.9224         | 12.8919          | 0.6085 | 0.7801 | 0.1867 | 0.1294 | 30.254  | 0.9825 | 0.9698   |

> Parameter combinations for NEAR MAX CF_SCORE

|     | param     | reduction_rate | reduction_factor | mse    | rmse   | nrmse  | mae    | psnr    | ncc    | cf_score |
| --- | --------- | -------------- | ---------------- | ------ | ------ | ------ | ------ | ------- | ------ | -------- |
| 521 | (28, 0.2) | 0.9336         | 15.0531          | 1.1504 | 1.0726 | 0.2567 | 0.1810 | 27.4883 | 0.9666 | 0.9598   |
| 364 | (20, 0.5) | 0.9118         | 11.3396          | 0.9458 | 0.9725 | 0.2328 | 0.1431 | 28.3388 | 0.9726 | 0.9598   |
| 262 | (15, 0.3) | 0.8810         | 8.4029           | 0.6337 | 0.7960 | 0.1905 | 0.0907 | 30.0780 | 0.9817 | 0.9598   |
| 363 | (20, 0.4) | 0.9102         | 11.1352          | 0.9084 | 0.9531 | 0.2281 | 0.1323 | 28.5140 | 0.9737 | 0.9603   |
| 543 | (29, 0.4) | 0.9372         | 15.9222          | 1.1474 | 1.0712 | 0.2564 | 0.1925 | 27.4996 | 0.9667 | 0.9607   |
---
### *Visualize multirun results with a plot*
- By default this plot returns a visualization for the metrics `reduction_rate`, `ncc` and `cf_score`.
```Python
plot_multirun_metric_results(result_df=result_df)
```
- The result should look like this:
![image](https://github.com/conect2ai/Conect2Py-Package/assets/56210040/143b1da9-3e45-4ebc-bcc0-2cafd44ec925)
---
### *Running a single compression with the optimal parameters found*
- You don't need to run the visualization or `display_multirun_optimal_values` to obtain the optimal compressor. Running the following code returns the best result:
```Python
optimal_param_list = get_optimal_params(result_df=result_df)
print("Best compressor param combination: ", optimal_param_list)
```
- With the list of optimal parameters (more than one compressor may be tied for best), run the function below to get the compression result.
```Python
points_to_keep, optimal_results_details = run_optimal_combination(optimal_list=optimal_param_list,
serie_to_compress=dataframe['sensor_data'].dropna(),
model='TAC'
)
```
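`points_to_keep` is used later as the `flag` argument of `create_eval_df`, so it presumably marks which samples were kept. Under that assumption, you can compute the sample reduction rate yourself:
```Python
# Assumes points_to_keep is a per-sample boolean/0-1 flag aligned with the series
kept = sum(bool(p) for p in points_to_keep)
total = len(points_to_keep)
print(f"kept {kept} of {total} samples "
      f"({100 * (1 - kept / total):.2f}% discarded)")
```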
- If you want to see the result details use:
```Python
print_run_details(optimal_results_details)
```
> POINTS:
> - total checked: 30889
> - total kept: 1199
> - percentage discarded: 96.12 %
>
> POINT EVALUATION TIMES (ms):
> - mean: 0.003636738161744472
> - std: 0.15511020000857362
> - median: 0.0
> - max: 13.513565063476562
> - min: 0.0
> - total: 112.335205078125
>
> RUN TIME (ms):
> - total: 124.2864
---
### *Evaluating the Results*
- Now, to finish the compression process, follow these steps:
**Step 1 - Create the evaluation DataFrame:**
```Python
evaluation_df = create_eval_df(original=dataframe['sensor_data'].dropna(), flag=points_to_keep)
evaluation_df.info()
```
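Given how the DataFrame is indexed in the next step, it should expose `original`, `compressed`, and `decompressed` columns; a quick check with standard pandas:
```Python
print(evaluation_df.columns.tolist())
print(evaluation_df[['original', 'decompressed']].describe())
```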
**Step 2 - Evaluate the performance:**
```Python
report = get_compression_report(
original=evaluation_df['original'],
compressed=evaluation_df['compressed'],
decompressed=evaluation_df['decompressed'],
cf_score_beta=2
)
print_compression_report(
report,
model_name=model_name,
cf_score_beta=2,
model_params=optimal_param_list
)
```
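If you want to sanity-check the reported metrics independently, a few of them can be recomputed directly from `evaluation_df` with numpy. These are the standard textbook formulas, not necessarily the package's exact implementation (the PSNR peak convention, in particular, is an assumption):
```Python
import numpy as np

orig = evaluation_df['original'].to_numpy(dtype=float)
dec = evaluation_df['decompressed'].to_numpy(dtype=float)

mse = np.mean((orig - dec) ** 2)
rmse = np.sqrt(mse)
mae = np.mean(np.abs(orig - dec))
# PSNR using the signal's peak-to-peak range as the peak value (one common convention)
psnr = 20 * np.log10((orig.max() - orig.min()) / rmse) if rmse > 0 else float('inf')

print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  MAE={mae:.4f}  PSNR={psnr:.4f} dB")
```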
After that, you should see something like the following information:
> RUN INFO
> - Model: TAC_Compression
> - Optimal Params: [(24, 0.1)]
> - CF-Score Beta: 2
>
> RESULTS
>
> SAMPLES NUMBER reduction
> - Original length: 30889 samples
> - Reduced length: 1199 samples
> - Samples reduced by a factor of 25.76 times
> - Sample reduction rate: 96.12%
>
> FILE SIZE compression
> - Original size: 385549 Bytes
> - Compressed size: 14974 Bytes
> - file compressed by a factor of 25.75 times
> - file compression rate: 96.12%
>
> METRICS
> - MSE: 0.622
> - RMSE: 0.7886
> - NRMSE: 0.1888
> - MAE: 0.1384
> - PSNR: 30.1591
> - NCC: 0.9821
> - CF-Score: 0.9778
**Step 3 - Create the model visualizations:**
```Python
# plot the curves comparison (original vs decompressed)
plot_curve_comparison(
evaluation_df.original,
evaluation_df.decompressed,
show=True
)
```
And finally, here is an example of the result:
![image](https://github.com/conect2ai/Conect2Py-Package/assets/56210040/978fdeaa-688c-4c8a-90c7-7726eab96302)
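The imports above also include `plot_dist_comparison`; assuming it follows the same calling convention as `plot_curve_comparison` (this signature is an assumption), you can compare the value distributions of the original and reconstructed series as well:
```Python
# Assumed to mirror plot_curve_comparison's signature (not verified)
plot_dist_comparison(
    evaluation_df.original,
    evaluation_df.decompressed,
    show=True
)
```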
## Other Models
Please check the [AutoCloud documentation](https://github.com/conect2ai/Conect2Py-Package/blob/main/Info/AutoCloud.md) for more information about the other models implemented in this package.
# Literature reference
1. Signoretti, G.; Silva, M.; Andrade, P.; Silva, I.; Sisinni, E.; Ferrari, P. "An Evolving TinyML Compression Algorithm for IoT Environments Based on Data Eccentricity". Sensors 2021, 21, 4153. https://doi.org/10.3390/s21124153
2. Medeiros, T.; Amaral, M.; Targino, M.; Silva, M.; Silva, I.; Sisinni, E.; Ferrari, P. "TinyML Custom AI Algorithms for Low-Power IoT Data Compression: A Bridge Monitoring Case Study". 2023 IEEE International Workshop on Metrology for Industry 4.0 & IoT (MetroInd4.0&IoT), 2023. [10.1109/MetroInd4.0IoT57462.2023.10180152](https://ieeexplore.ieee.org/document/10180152)
# License
This package is licensed under the [MIT License](https://github.com/conect2ai/Conect2Py-Package/blob/main/LICENSE) - © 2023 Conect2ai.