deepod


Name: deepod
Version: 0.4.1
Home page: https://github.com/xuhongzuo/DeepOD
Author: Hongzuo Xu
License: MIT License
Keywords: outlier detection, anomaly detection, deep anomaly detection, deep learning, data mining
Upload time: 2023-09-06 09:19:13
Requirements: No requirements were recorded.
Python Deep Outlier/Anomaly Detection (DeepOD)
==================================================

.. image:: https://github.com/xuhongzuo/DeepOD/actions/workflows/testing.yml/badge.svg
   :target: https://github.com/xuhongzuo/DeepOD/actions/workflows/testing.yml
   :alt: testing2

.. image:: https://coveralls.io/repos/github/xuhongzuo/DeepOD/badge.svg?branch=main
    :target: https://coveralls.io/github/xuhongzuo/DeepOD?branch=main
    :alt: coveralls

.. image:: https://static.pepy.tech/personalized-badge/deepod?period=total&units=international_system&left_color=black&right_color=orange&left_text=Downloads
   :target: https://pepy.tech/project/deepod
   :alt: downloads
   

``DeepOD`` is an open-source Python library for deep learning-based `Outlier Detection <https://en.wikipedia.org/wiki/Anomaly_detection>`_
and `Anomaly Detection <https://en.wikipedia.org/wiki/Anomaly_detection>`_. ``DeepOD`` supports tabular anomaly detection and time-series anomaly detection.


DeepOD includes **25** deep outlier detection / anomaly detection algorithms (under unsupervised and weakly-supervised paradigms). More baseline algorithms will be included later.



**DeepOD is featured for**:

* **Unified APIs** across various algorithms.
* **SOTA models**, including the latest reconstruction-, representation-learning-, and self-supervised-based deep learning methods.
* **Comprehensive Testbed** that can be used to directly test different models on benchmark datasets (highly recommended for academic research).
* **Versatile** across data types, currently tabular and time-series data (DeepOD will support other data types such as images, graphs, logs, and traces in the future; PRs are welcome :telescope:).
* **Diverse Network Structures** that can be plugged into detection models; we currently support LSTM, GRU, TCN, Conv, and Transformer for time-series data (PRs are welcome as well :sparkles:).


If you are interested in our project, we would be pleased to have your stars and forks :thumbsup: :beers:.


Installation
~~~~~~~~~~~~~~
The DeepOD framework can be installed via:


.. code-block:: bash


    pip install deepod


To install the development version (strongly recommended):


.. code-block:: bash


    git clone https://github.com/xuhongzuo/DeepOD.git
    cd DeepOD
    pip install .
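
If you plan to modify the source, an editable install (a standard pip option, not specific to DeepOD) may be more convenient:

.. code-block:: bash

    pip install -e .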


Usages
~~~~~~~~~~~~~~~~~


Directly use detection models in DeepOD:
::::::::::::::::::::::::::::::::::::::::::

DeepOD can be used in a few lines of code. This API style is consistent with `scikit-learn <https://github.com/scikit-learn/scikit-learn>`_ and `PyOD <https://github.com/yzhao062/pyod>`_.


**for tabular anomaly detection:**

.. code-block:: python


    # unsupervised methods
    from deepod.models.tabular import DeepSVDD
    clf = DeepSVDD()
    clf.fit(X_train, y=None)
    scores = clf.decision_function(X_test)

    # weakly-supervised methods
    from deepod.models.tabular import DevNet
    clf = DevNet()
    clf.fit(X_train, y=semi_y) # semi_y uses 1 for known anomalies, and 0 for unlabeled data
    scores = clf.decision_function(X_test)

    # evaluation of tabular anomaly detection
    from deepod.metrics import tabular_metrics
    auc, ap, f1 = tabular_metrics(y_test, scores)
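
The snippets above assume ``X_train``, ``X_test``, ``y_test``, and ``semi_y`` already exist. A minimal sketch of how such arrays could be prepared on toy data (the synthetic dataset, split, and labeling below are illustrative and not part of DeepOD):

.. code-block:: python

    import numpy as np
    from sklearn.model_selection import train_test_split

    # illustrative toy data: 1000 normal samples plus 20 injected anomalies
    rng = np.random.default_rng(42)
    X_normal = rng.normal(0.0, 1.0, size=(1000, 16))
    X_anomaly = rng.normal(6.0, 1.0, size=(20, 16))   # far from the normal cluster
    X = np.vstack([X_normal, X_anomaly]).astype(np.float32)
    y = np.hstack([np.zeros(1000), np.ones(20)])      # 1 marks an anomaly

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42, stratify=y)

    # semi_y for weakly-supervised models: keep only a handful of labeled anomalies,
    # everything else is treated as unlabeled (0)
    semi_y = np.zeros_like(y_train)
    semi_y[np.where(y_train == 1)[0][:5]] = 1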


**for time series anomaly detection:**


.. code-block:: python


    # time series anomaly detection methods
    from deepod.models.time_series import TimesNet
    clf = TimesNet()
    clf.fit(X_train)
    scores = clf.decision_function(X_test)

    # evaluation of time series anomaly detection
    from deepod.metrics import ts_metrics
    from deepod.metrics import point_adjustment # point adjustment for time-series anomaly detection
    eval_metrics = ts_metrics(labels, scores)
    adj_eval_metrics = ts_metrics(labels, point_adjustment(labels, scores))
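
In the time-series example, ``X_train``/``X_test`` are multivariate series and ``labels`` holds per-timestamp anomaly labels for the test series. A minimal sketch with synthetic data, assuming the common one-row-per-timestamp layout (check the dataset documentation of your DeepOD version for the exact expected shape):

.. code-block:: python

    import numpy as np

    rng = np.random.default_rng(0)
    n_train, n_test, n_features = 2000, 500, 5

    # synthetic multivariate series: one row per timestamp
    X_train = rng.normal(0.0, 1.0, size=(n_train, n_features)).astype(np.float32)
    X_test = rng.normal(0.0, 1.0, size=(n_test, n_features)).astype(np.float32)

    # inject a short anomalous segment into the test series
    X_test[200:220] += 5.0
    labels = np.zeros(n_test, dtype=int)
    labels[200:220] = 1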
    




Testbed usage:
::::::::::::::::::::::::::::::::::::::::::


The testbed covers the whole process of testing an anomaly detection model, including data loading, preprocessing, anomaly detection, and evaluation.

Please refer to the ``testbed/`` directory:

* ``testbed/testbed_unsupervised_ad.py`` is for testing unsupervised tabular anomaly detection models.
 
* ``testbed/testbed_unsupervised_tsad.py`` is for testing unsupervised time-series anomaly detection models.


Key arguments:

* ``--input_dir``: name of the folder that contains the datasets (.csv, .npy)

* ``--dataset``: "FULL" tests all files within the folder; alternatively, give a comma-separated list of dataset names (e.g., "10_cover*,20_letter*")

* ``--model``: anomaly detection model name

* ``--runs``: number of times to run the detection model; the average performance with standard deviation is reported (see the example command after this list)
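
For instance, a hedged invocation that tests two specific dataset files with the RCA model (the dataset names reuse the pattern above and are illustrative):

.. code-block:: bash

    python testbed_unsupervised_ad.py --model RCA --runs 3 \
        --input_dir Classical --dataset "10_cover*,20_letter*"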


Example:

1. Download the `ADBench <https://github.com/Minqi824/ADBench/tree/main/adbench/datasets/>`_ datasets.
2. Modify the ``dataset_root`` variable to point to the dataset directory.
3. Set ``input_dir`` to a sub-folder name of ``dataset_root``, e.g., ``Classical`` or ``NLP_by_BERT``.
4. Run the following commands in a shell:


.. code-block:: bash

    
    cd DeepOD
    pip install .
    cd testbed
    python testbed_unsupervised_ad.py --model DIF --runs 5 --input_dir ADBench
   



Implemented Models
~~~~~~~~~~~~~~~~~~~

**Tabular Anomaly Detection models:**

.. csv-table:: 
 :header: "Model", "Venue", "Year", "Type", "Title"
 :widths: 4, 4, 4, 8, 20 

 Deep SVDD, ICML, 2018, unsupervised, Deep One-Class Classification  [#Ruff2018Deep]_
 REPEN, KDD, 2018, unsupervised, Learning Representations of Ultrahigh-dimensional Data for Random Distance-based Outlier Detection [#Pang2019Repen]_
 RDP, IJCAI, 2020, unsupervised, Unsupervised Representation Learning by Predicting Random Distances [#Wang2020RDP]_
 RCA, IJCAI, 2021, unsupervised, RCA: A Deep Collaborative Autoencoder Approach for Anomaly Detection [#Liu2021RCA]_
 GOAD, ICLR, 2020, unsupervised, Classification-Based Anomaly Detection for General Data [#Bergman2020GOAD]_
 NeuTraL, ICML, 2021, unsupervised, Neural Transformation Learning for Deep Anomaly Detection Beyond Images [#Qiu2021Neutral]_
 ICL, ICLR, 2022, unsupervised, Anomaly Detection for Tabular Data with Internal Contrastive Learning
 DIF, TKDE, 2023, unsupervised, Deep Isolation Forest for Anomaly Detection
 SLAD, ICML, 2023, unsupervised, Fascinating Supervisory Signals and Where to Find Them: Deep Anomaly Detection with Scale Learning
 DevNet, KDD, 2019, weakly-supervised, Deep Anomaly Detection with Deviation Networks
 PReNet, KDD, 2023, weakly-supervised, Deep Weakly-supervised Anomaly Detection
 Deep SAD, ICLR, 2020, weakly-supervised, Deep Semi-Supervised Anomaly Detection
 FeaWAD, TNNLS, 2021, weakly-supervised, Feature Encoding with AutoEncoders for Weakly-supervised Anomaly Detection
 RoSAS, IP&M, 2023, weakly-supervised, RoSAS: Deep semi-supervised anomaly detection with contamination-resilient continuous supervision

**Time-series Anomaly Detection models:**

.. csv-table:: 
 :header: "Model", "Venue", "Year", "Type", "Title"
 :widths: 4, 4, 4, 8, 20 

 TimesNet, ICLR, 2023, unsupervised, TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis
 AnomalyTransformer, ICLR, 2022, unsupervised, Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy
 TranAD, VLDB, 2022, unsupervised, TranAD: Deep Transformer Networks for Anomaly Detection in Multivariate Time Series Data  
 COUTA, arXiv, 2022, unsupervised, Calibrated One-class Classification for Unsupervised Time Series Anomaly Detection
 USAD, KDD, 2020, unsupervised, USAD: UnSupervised Anomaly Detection on Multivariate Time Series  
 DIF, TKDE, 2023, unsupervised, Deep Isolation Forest for Anomaly Detection
 TcnED, TNNLS, 2021, unsupervised, An Evaluation of Anomaly Detection and Diagnosis in Multivariate Time Series
 Deep SVDD (TS), ICML, 2018, unsupervised, Deep One-Class Classification  
 DevNet (TS), KDD, 2019, weakly-supervised, Deep Anomaly Detection with Deviation Networks
 PReNet (TS), KDD, 2023, weakly-supervised, Deep Weakly-supervised Anomaly Detection
 Deep SAD (TS), ICLR, 2020, weakly-supervised, Deep Semi-Supervised Anomaly Detection

NOTE:

- For Deep SVDD, DevNet, PReNet, and Deep SAD, we employ network structures that can handle time-series data. These model classes have a parameter named ``network``; by changing it, you can use different network structures, as shown in the sketch below.

- We currently support 'TCN', 'GRU', 'LSTM', 'Transformer', 'ConvSeq', and 'DilatedConv'.
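
For instance, a minimal sketch of switching the backbone of a time-series detector via the ``network`` parameter noted above; the class name ``DeepSVDDTS`` is an assumption here and may differ across DeepOD versions:

.. code-block:: python

    # hypothetical import: the time-series Deep SVDD variant is assumed to be
    # exposed as DeepSVDDTS; check deepod.models.time_series for the exact name
    from deepod.models.time_series import DeepSVDDTS

    clf = DeepSVDDTS(network='Transformer')  # swap the default backbone for a Transformer
    clf.fit(X_train)
    scores = clf.decision_function(X_test)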


Citation
~~~~~~~~~~~~~~~~~
If you use this library in your work, please cite this paper:

Hongzuo Xu, Guansong Pang, Yijie Wang and Yongjun Wang, "Deep Isolation Forest for Anomaly Detection," in IEEE Transactions on Knowledge and Data Engineering, doi: 10.1109/TKDE.2023.3270293.


You can also use the BibTeX entry below for citation.

.. code-block:: bibtex

   @ARTICLE{xu2023deep,
      author={Xu, Hongzuo and Pang, Guansong and Wang, Yijie and Wang, Yongjun},
      journal={IEEE Transactions on Knowledge and Data Engineering}, 
      title={Deep Isolation Forest for Anomaly Detection}, 
      year={2023},
      volume={},
      number={},
      pages={1-14},
      doi={10.1109/TKDE.2023.3270293}
   }



Reference
~~~~~~~~~~~~~~~~~

.. [#Ruff2018Deep] Ruff, Lukas, et al. "Deep one-class classification." ICML. 2018.

.. [#Pang2019Repen] Pang, Guansong, et al. "Learning representations of ultrahigh-dimensional data for random distance-based outlier detection". KDD (pp. 2041-2050). 2018.

.. [#Wang2020RDP] Wang, Hu, et al. "Unsupervised Representation Learning by Predicting Random Distances". IJCAI (pp. 2950-2956). 2020.

.. [#Liu2021RCA] Liu, Boyang, et al. "RCA: A Deep Collaborative Autoencoder Approach for Anomaly Detection". IJCAI (pp. 1505-1511). 2021.

.. [#Bergman2020GOAD] Bergman, Liron, and Yedid Hoshen. "Classification-Based Anomaly Detection for General Data". ICLR. 2020.

.. [#Qiu2021Neutral] Qiu, Chen, et al. "Neural Transformation Learning for Deep Anomaly Detection Beyond Images". ICML. 2021.



            
