# PyXAI - Python eXplainable AI
- Documentation: [http://www.cril.univ-artois.fr/pyxai/](http://www.cril.univ-artois.fr/pyxai/)
- Git: [https://github.com/crillab/pyxai](https://github.com/crillab/pyxai)
- Installation: [http://www.cril.univ-artois.fr/pyxai/documentation/installation/](http://www.cril.univ-artois.fr/pyxai/documentation/installation/)
> <b> New features in version 1:</b>
> <ul>
> <li>Regression for Boosted Trees</li>
> <li>Support for Theories</li>
> <li>Easier importation of models</li>
> <li>PyXAI's Graphical User Interface (GUI)</li>
> <li>Support for multiple image formats for image datasets</li>
> <li>Support for data pre-processing (a tool for preparing and cleansing a dataset)</li>
> </ul>
<figure>
<img src="https://lh3.googleusercontent.com/drive-viewer/AITFw-xLC9-pvcsp0MGlTOBODqrs8aJogGpnAlAnVrh41EetySebz-VNzJW9PkHLmYUIBb_SaqlOpGBLsAm8IY5WIo73xNj0=s1600" alt="pyxai" />
<figcaption>PyXAI's main steps for producing explanations.</figcaption>
</figure>
<figure>
<img src="https://lh3.googleusercontent.com/drive-viewer/AITFw-yqn-ZOIW2u7a2XxVH9UNcr5SQQnxUH8b1wfLoReVa2f7zm68-S4GAbr7RWUYW1lKLJ957gLPaFn3077l4qZXUyv82T=s1600" alt="pyxai" />
<img src="https://lh3.googleusercontent.com/drive-viewer/AITFw-xDdbVt_DCAmsvJhRlMj3jxgADUVkFzHbnxmQnabdjfuPaylcyeHTyBgDZs4Xna_N_oT6pwxXBv_ls2nqRUwd8RiWgM=s1600" alt="pyxai" />
<figcaption>PyXAI's Graphical User Interface (GUI) for visualizing explanations.</figcaption>
</figure>
<h3>What is PyXAI?</h3>
<p align="justify">
<b>PyXAI (Python eXplainable AI)</b> is a <a href="https://www.python.org/">Python</a> library (requiring Python 3.6 or later) for computing explanations of various forms suited to <b>(regression or classification) tree-based ML models</b> (Decision Trees, Random Forests, Boosted Trees, ...). In contrast to many approaches to XAI (SHAP, Lime, ...), PyXAI algorithms are <b>model-specific</b>. Furthermore, PyXAI algorithms <b>guarantee certain properties</b> of the explanations they generate, which can be of several types:
</p>
<ul>
<li><b>Abductive explanations</b> for an instance $X$ are intended to explain why $X$ has been classified in the way it has been by the ML model (thus, addressing the “Why?” question). For regression tasks, abductive explanations for $X$ are intended to explain why the regression value on $X$ lies in a given interval.</li>
<li><b>Contrastive explanations</b> for $X$ are intended to explain why $X$ has not been classified by the ML model as the user expected (thus, addressing the “Why not?” question). A short sketch contrasting the two types of explanations follows this list.</li>
</ul>
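To make the distinction concrete, here is a minimal sketch that reuses only the calls appearing in the Quick Start example at the end of this page (and its `tests/iris.csv` dataset): a sufficient reason serves as an abductive explanation, while a contrastive reason addresses the “Why not?” question.

```python
from pyxai import Learning, Explainer

# Train a decision tree on the Iris dataset (same setup as the Quick Start example).
learner = Learning.Scikitlearn("tests/iris.csv", learner_type=Learning.CLASSIFICATION)
model = learner.evaluate(method=Learning.HOLD_OUT, output=Learning.DT)

# Abductive side ("Why?"): a sufficient reason for a correctly classified instance.
instance, prediction = learner.get_instances(model, n=1, correct=True)
explainer = Explainer.initialize(model, instance)
print("why:", explainer.to_features(explainer.sufficient_reason(n=1)))

# Contrastive side ("Why not?"): a contrastive reason for a misclassified instance.
instance, prediction = learner.get_instances(model, n=1, correct=False)
explainer.set_instance(instance)
print("why not:", explainer.to_features(explainer.contrastive_reason(), contrastive=True))
```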
<h3>What is the difference between PyXAI and other methods?</h3>
<p align="justify">
The most popular approaches to XAI (SHAP, Lime, ...) <b>are model-agnostic, but do not offer any guarantees</b> of rigor. A number of <a href="https://arxiv.org/pdf/2307.07514.pdf">works</a> have highlighted several misconceptions about informal approaches to XAI (see the <a href="{{ site.baseurl }}/papers/">related papers</a>). In contrast, <b>PyXAI algorithms rely on logic-based, model-precise</b> approaches for computing explanations. Although formal explainability has a number of drawbacks, particularly in terms of the computational complexity of the logical reasoning needed to derive explanations, <b>steady progress has been made since its inception</b>.
</p>
<h3>Which models can be explained with PyXAI?</h3>
<p align="justify">
Models are the objects resulting from an experimental ML protocol (for example, a training phase on a classifier) conducted with a chosen <b>cross-validation method</b>. Importantly, in PyXAI, there is a complete separation between the learning phase and the explaining phase: <b>you produce/load/save models, and you find explanations for some instances given such models</b>. Currently, with PyXAI, you can find explanations suited to different <b>ML models for classification or regression tasks</b> (a short sketch showing how the model type is selected follows this list):
</p>
<ul>
<li><a href="https://en.wikipedia.org/wiki/Decision_tree_learning">Decision Tree</a> (DT)</li>
<li><a href="https://en.wikipedia.org/wiki/Random_forest">Random Forest</a> (RF)</li>
<li><a href="https://en.wikipedia.org/wiki/Gradient_boosting">Boosted Tree (Gradient boosting)</a> (BT)</li>
</ul>
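As a minimal sketch, the target model type is selected through the `output` parameter of `learner.evaluate`. `Learning.DT` is the value used in the Quick Start example below; `Learning.RF` is assumed here by analogy with the naming used in the documentation (see <a href="{{ site.baseurl }}/documentation/learning/generating/">Generating Models</a> for the exact options).

```python
from pyxai import Learning

learner = Learning.Scikitlearn("tests/iris.csv", learner_type=Learning.CLASSIFICATION)

# The model family is chosen via the `output` argument of evaluate():
dt_model = learner.evaluate(method=Learning.HOLD_OUT, output=Learning.DT)  # Decision Tree
rf_model = learner.evaluate(method=Learning.HOLD_OUT, output=Learning.RF)  # Random Forest (constant assumed)
```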
<p align="justify">
In addition to finding explanations, PyXAI also provides methods that perform operations (production, saving, loading) on models and instances. Currently, these methods are available for three <b>ML libraries</b> (see the sketch after the list):
</p>
<ul>
<li><a href="https://scikit-learn.org/stable/">Scikit-learn</a>: a machine learning library for Python</li>
<li><a href="https://xgboost.readthedocs.io/en/stable/">XGBoost</a>: an optimized distributed gradient boosting library</li>
<li><a href="https://lightgbm.readthedocs.io/en/stable/">LightGBM</a>: a gradient boosting framework that uses tree-based learning algorithms</li>
</ul>
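For the boosting libraries, a Boosted Tree model can be produced in the same way; note that the learner class name used below (`Learning.Xgboost`) is assumed by analogy with `Learning.Scikitlearn`, so check the <a href="{{ site.baseurl }}/documentation/learning/generating/">Generating Models</a> page for the exact API.

```python
from pyxai import Learning

# Assumed learner class name, by analogy with Learning.Scikitlearn;
# see the Generating Models documentation for the exact spelling.
learner = Learning.Xgboost("tests/iris.csv", learner_type=Learning.CLASSIFICATION)
bt_model = learner.evaluate(method=Learning.HOLD_OUT, output=Learning.BT)  # Boosted Tree
```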
<p align="justify">
It is also possible to leverage PyXAI to find explanations suited to models learned using other libraries.
</p>
<h3>What does this website offer?</h3>
<p align="justify">
On this website, you can find everything you need to know about PyXAI, with more than 10 <a href="https://jupyter.org/">Jupyter</a> notebooks, including:
</p>
<ul>
<li>The <a href="{{ site.baseurl }}/documentation/installation/">installation guide</a> and the <a href="{{ site.baseurl }}/documentation/quickstart/">quick start</a></li>
<li>About obtaining models:</li>
<ul>
<li>How to <b>prepare and clean a dataset</b> using the PyXAI <a href="{{ site.baseurl }}/documentation/preprocessor/">preprocessor</a> object?</li>
<li>How to <b>import a model</b>, whatever its format? <a href="{{ site.baseurl }}/documentation/importing/"> Importing Models</a> </li>
<li>How to <b>generate a model using an ML cross-validation method</b>? <a href="{{ site.baseurl }}/documentation/learning/generating/">Generating Models</a></li>
<li>How to <b>build a model from trees directly built by the user</b>? <a href="{{ site.baseurl }}/documentation/learning/builder/">Building Models</a></li>
<li>How to <b>save and load models</b> with the PyXAI learning module? <a href="{{ site.baseurl }}/documentation/saving/">Saving/Loading Models</a></li>
</ul>
<li>About obtaining explanations:</li>
<ul>
<li>The <b>concepts of the PyXAI explainer module</b>: <a href="{{ site.baseurl }}/documentation/explainer/concepts/">Concepts</a> </li>
<li>How to use a <b>time limit</b>? <a href="{{ site.baseurl }}/documentation/explainer/time_limit/">Time Limit</a> </li>
<li>The PyXAI library offers the possibility of processing user preferences (<b>preferring some explanations to others or excluding some features</b>): <a href="{{ site.baseurl }}/documentation/explainer/preferences/">Preferences</a></li>
<li><b>Theories are knowledge about the dataset.</b> PyXAI offers the possibility of encoding a theory when computing explanations, in order to avoid producing impossible explanations: <a href="{{ site.baseurl }}/documentation/explainer/theories/">Theories</a></li>
<li>How to <b>compute explanations for classification tasks</b>? <a href="{{ site.baseurl }}/documentation/classification/">Explaining Classification</a> </li>
<li>How to <b>compute explanations for regression tasks</b>? <a href="{{ site.baseurl }}/documentation/regression/">Explaining Regression</a> </li>
</ul>
<li>How to use <b>PyXAI's Graphical User Interface (GUI)</b> for <a href="{{ site.baseurl }}/documentation/visualization/">visualizing explanations</a>?</li>
</ul>
<h3>How to use PyXAI?</h3>
<p align="justify">
Here is an example (it comes from the <a href="{{ site.baseurl }}/documentation/quickstart">Quick Start page</a>):
</p>
<h4 class="example">PyXAI in action</h4>
```python
from pyxai import Learning, Explainer

# Load the dataset and train a decision tree with a hold-out evaluation.
learner = Learning.Scikitlearn("tests/iris.csv", learner_type=Learning.CLASSIFICATION)
model = learner.evaluate(method=Learning.HOLD_OUT, output=Learning.DT)

# Pick one correctly classified instance predicted as class 0.
instance, prediction = learner.get_instances(model, n=1, correct=True, predictions=[0])

explainer = Explainer.initialize(model, instance)
print("instance:", instance)
print("binary representation:", explainer.binary_representation)

# Abductive explanation: why was this instance classified this way?
sufficient_reason = explainer.sufficient_reason(n=1)
print("sufficient_reason:", sufficient_reason)
print("to_features:", explainer.to_features(sufficient_reason))

# Contrastive explanation: pick a misclassified instance and ask why it was
# not classified as expected.
instance, prediction = learner.get_instances(model, n=1, correct=False)
explainer.set_instance(instance)
contrastive_reason = explainer.contrastive_reason()
print("contrastive reason", contrastive_reason)
print("to_features:", explainer.to_features(contrastive_reason, contrastive=True))

# Open PyXAI's graphical user interface to visualize the explanations.
explainer.show()
```
<img src="https://lh3.googleusercontent.com/drive-viewer/AITFw-xbHs56zfQ_EHQ0-XqdHxy7mdL3fBxFRVnfW6pPCCCpSg89GStqQCBD5ElFLn3NaZmB-2mwY9hdu5TH0gPajOI2xwSCJQ=s1600" alt="pyxai" />
<p>As illustrated by this example, with a few lines of code, PyXAI allows you to train a model, extract instances, and get explanations about the classifications made.</p>
<br /><br />
<p align="center">
<a href="http://www.cril.univ-artois.fr"><img width="120px" src="https://lh3.googleusercontent.com/drive-viewer/AITFw-wsxZnVjsY1ypy7nGs2m__Iz5pDphw1wbc3a78HHVVqBhAFOx35hcvCGFaTfgDFlqGB_ChMWBfC-tlXUfX0twpqAnNfVg=s2560" /></a>
<a href="https://www.irt-systemx.fr/"><img width="120px" style="width: 80px;" src="https://lh3.googleusercontent.com/drive-viewer/AITFw-xuRWtP8WNuRXXaff32Tzd7OT4guc8vNEeXurAKIQiaeuIdeEYXo9hiA1HeGCgUY7I7NeT70U5yQt5BbwK6H4lv5jabQA=s2560" /></a>
<a href="https://www.cnrs.fr/"><img width="80px" style="width: 80px;" src="https://lh3.googleusercontent.com/drive-viewer/AITFw-xBV_ILK1g_mKMJ0Hk0wJtFmdKLnAT68QA7fMa5i663Tbla_Q2RjALnH6cER8BGAPThh_ZaOKpcO9ggkI1DAmU4zaEG=s1600" /></a>
<a href="https://www.confiance.ai/"><img width="120px" style="width: 80px;" src="https://lh3.googleusercontent.com/drive-viewer/AITFw-wiEyfiP29DKvwP5webvNRDXwXsS1PxQnTIZEdMpQ9xV9JN23-86HOqzNEBi9F4Ng8h-Kd8W5NKaWqefnGhhhQmxneu=s1600" /></a>
<a href="http://univ-artois.fr"><img width="120px" src="https://lh3.googleusercontent.com/drive-viewer/AITFw-wA-x2qgHNNrxLEaI33jDH64TM7sudMsTt781ICTAvzBsPaEtL2Ky_1Ba-QWm6YyqCmTuFGpylJ2sSXRgjzu7BM7iC8Xg=s2560" /></a>
</p>