dit
===

:Name: dit
:Version: 1.5
:Summary: Python package for information theory.
:Home page: http://dit.io
:Author: Humans
:License: BSD
:Requires Python: !=3.0.*, !=3.1.*, !=3.2.*, <4
:Uploaded: 2022-03-20 20:34:17

``dit`` is a Python package for information theory.

|build| |codecov| |codacy| |deps|

|docs| |slack| |saythanks| |conda|

|joss| |zenodo|

Try ``dit`` live: |binder|

Introduction
------------

Information theory is a powerful extension to probability and statistics, quantifying dependencies
among arbitrary random variables in a way that is consistent and comparable across systems and
scales. Information theory was originally developed to quantify how quickly and reliably information
could be transmitted across an arbitrary channel. Modern, data-driven science has been co-opting
and extending these quantities and methods into multivariate settings where the interpretations and
best practices are not yet established. For example, there are at least four reasonable
multivariate generalizations of the mutual information, none of which inherit all the
interpretations of the standard bivariate case. Which is best to use is context-dependent. ``dit``
implements a vast range of multivariate information measures in an effort to allow information
practitioners to study how these various measures behave and interact in a variety of contexts. We
hope that having all these measures and techniques implemented in one place will allow the
development of robust techniques for the automated quantification of dependencies within a system
and concrete interpretation of what those dependencies mean.
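
As a taste of this multiplicity, here are four generalizations of the mutual information evaluated
on the three-variable ``xor`` distribution (a quick sketch using the ``dit.multivariate`` module;
the printed values are the analytic results in bits):

.. code:: python

   >>> import dit
   >>> import dit.example_dists
   >>> d = dit.example_dists.Xor()  # Z = xor(X, Y), all four outcomes equally likely
   >>> dit.multivariate.coinformation(d)           # co-information
   -1.0
   >>> dit.multivariate.total_correlation(d)       # total correlation / multi-information
   1.0
   >>> dit.multivariate.dual_total_correlation(d)  # dual total correlation / binding information
   2.0
   >>> dit.multivariate.caekl_mutual_information(d)
   0.5

All four reduce to the ordinary mutual information for two variables, yet they disagree
substantially on the same three-variable distribution.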

Citing
------

If you use ``dit`` in your research, please cite it as::

   @article{dit,
     Author = {James, R. G. and Ellison, C. J. and Crutchfield, J. P.},
     Title = {{dit}: a {P}ython package for discrete information theory},
     Journal = {The Journal of Open Source Software},
     Volume = {3},
     Number = {25},
     Pages = {738},
     Year = {2018},
     Doi = {https://doi.org/10.21105/joss.00738}
   }

Basic Information
-----------------

Documentation
*************

http://docs.dit.io

Downloads
*********

https://pypi.org/project/dit/

https://anaconda.org/conda-forge/dit

+-------------------------------------------------------------------+
| Dependencies                                                      |
+===================================================================+
| * Python 3.3+                                                     |
| * `boltons <https://boltons.readthedocs.io>`_                     |
| * `debtcollector <https://docs.openstack.org/debtcollector/>`_    |
| * `lattices <https://github.com/dit/lattices>`_                   |
| * `networkx <https://networkx.github.io/>`_                       |
| * `numpy <http://www.numpy.org/>`_                                |
| * `PLTable <https://github.com/platomav/PLTable>`_                |
| * `scipy <https://www.scipy.org/>`_                               |
+-------------------------------------------------------------------+

Optional Dependencies
~~~~~~~~~~~~~~~~~~~~~
* colorama: colored column headers in partial information decomposition (PID) tables, indicating failure modes
* cython: faster sampling from distributions
* hypothesis: random sampling of distributions
* matplotlib, python-ternary: plotting of various information-theoretic expansions
* numdifftools: numerical evaluation of gradients and Hessians during optimization
* pint: adds units to informational values
* scikit-learn: faster nearest-neighbor lookups during entropy/mutual information estimation from samples

Install
*******

The easiest way to install is:

.. code-block:: bash

  pip install dit

If you want to install ``dit`` within a conda environment, you can simply do:

.. code-block:: bash

  conda install -c conda-forge dit

Alternatively, you can clone this repository, move into the newly created
``dit`` directory, and then install the package:

.. code-block:: bash

  git clone https://github.com/dit/dit.git
  cd dit
  pip install .

.. note::

  The Cython extensions are currently not supported on Windows. Please install
  using the ``--nocython`` option.


Testing
*******
.. code-block:: shell

  $ git clone https://github.com/dit/dit.git
  $ cd dit
  $ pip install -r requirements_testing.txt
  $ py.test

Code and bug tracker
********************

https://github.com/dit/dit

License
*******

BSD 3-Clause, see LICENSE.txt for details.

Implemented Measures
--------------------

``dit`` implements the following information measures. Most are implemented in multivariate and
conditional generality, where such generalizations either exist in the literature or are relatively
obvious; for example, the multivariate conditional exact common information, though absent from the
literature, is implemented here. A short usage sketch follows the table.

+------------------------------------------+-----------------------------------------+-----------------------------------+
| Entropies                                | Mutual Informations                     | Divergences                       |
|                                          |                                         |                                   |
| * Shannon Entropy                        | * Co-Information                        | * Variational Distance            |
| * Renyi Entropy                          | * Interaction Information               | * Kullback-Leibler Divergence /   |
| * Tsallis Entropy                        | * Total Correlation /                   |   Relative Entropy                |
| * Necessary Conditional Entropy          |   Multi-Information                     | * Cross Entropy                   |
| * Residual Entropy /                     | * Dual Total Correlation /              | * Jensen-Shannon Divergence       |
|   Independent Information /              |   Binding Information                   | * Earth Mover's Distance          |
|   Variation of Information               | * CAEKL Multivariate Mutual Information +-----------------------------------+
+------------------------------------------+-----------------------------------------+ Other Measures                    |
| Common Informations                      | Partial Information Decomposition       |                                   |
|                                          |                                         | * Channel Capacity                |
| * Gacs-Korner Common Information         | * I_{min}                               | * Complexity Profile              |
| * Wyner Common Information               | * I_{\wedge}                            | * Connected Informations          |
| * Exact Common Information               | * I_{RR}                                | * Copy Mutual Information         |
| * Functional Common Information          | * I_{\downarrow}                        | * Cumulative Residual Entropy     |
| * MSS Common Information                 | * I_{proj}                              | * Extropy                         |
+------------------------------------------+ * I_{BROJA}                             | * Hypercontractivity Coefficient  |
| Secret Key Agreement Bounds              | * I_{ccs}                               | * Information Bottleneck          |
|                                          | * I_{\pm}                               | * Information Diagrams            |
| * Secrecy Capacity                       | * I_{dep}                               | * Information Trimming            |
| * Intrinsic Mutual Information           | * I_{RAV}                               | * Lautum Information              |
| * Reduced Intrinsic Mutual Information   | * I_{mmi}                               | * LMPR Complexity                 |
| * Minimal Intrinsic Mutual Information   | * I_{\prec}                             | * Marginal Utility of Information |
| * Necessary Intrinsic Mutual Information | * I_{RA}                                | * Maximum Correlation             |
| * Two-Part Intrinsic Mutual Information  | * I_{SKAR}                              | * Maximum Entropy Distributions   |
|                                          |                                         | * Perplexity                      |
|                                          |                                         | * Rate-Distortion Theory          |
|                                          |                                         | * TSE Complexity                  |
+------------------------------------------+-----------------------------------------+-----------------------------------+
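
Most measures are a single function call away. As an illustrative sketch (the printed values are
the analytic results in bits), here are a few of the measures above applied to the "giant bit"
distribution, two perfectly correlated bits:

.. code:: python

   >>> import dit
   >>> d = dit.Distribution(['00', '11'], [1/2, 1/2])  # the "giant bit"
   >>> dit.multivariate.gk_common_information(d)  # Gacs-Korner common information
   1.0
   >>> dit.multivariate.total_correlation(d)
   1.0
   >>> dit.multivariate.residual_entropy(d)       # no information is local to either bit
   0.0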

Quickstart
----------

Basic usage of ``dit`` consists of creating distributions, modifying them if
need be, and then computing properties of those distributions. First, we
import:

.. code:: python

   >>> import dit

Suppose we have a really thick coin, one so thick that there is a reasonable
chance of it landing on its edge. Here is how we might represent the coin in
``dit``.

.. code:: python

   >>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
   >>> print(d)
   Class:          Distribution
   Alphabet:       ('E', 'H', 'T') for all rvs
   Base:           linear
   Outcome Class:  str
   Outcome Length: 1
   RV Names:       None

   x   p(x)
   E   0.2
   H   0.4
   T   0.4

Calculate the probability of ``H`` and also of the combination ``H or T``.

.. code:: python

   >>> d['H']
   0.4
   >>> d.event_probability(['H','T'])
   0.8

Calculate the Shannon entropy and extropy of the joint distribution.

.. code:: python

   >>> dit.shannon.entropy(d)
   1.5219280948873621
   >>> dit.other.extropy(d)
   1.1419011889093373
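
These values follow directly from the defining formulas, ``H[X] = -Σ p(x) log2 p(x)`` for the
entropy and ``J[X] = -Σ (1 - p(x)) log2 (1 - p(x))`` for the extropy; a quick check with numpy
(agreeing up to floating-point rounding):

.. code:: python

   >>> import numpy as np
   >>> p = np.array([0.2, 0.4, 0.4])
   >>> -np.sum(p * np.log2(p))            # Shannon entropy, in bits
   1.5219280948873621
   >>> -np.sum((1 - p) * np.log2(1 - p))  # extropy, in bits
   1.1419011889093373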

Create a distribution where ``Z = xor(X, Y)``.

.. code:: python

   >>> import dit.example_dists
   >>> d = dit.example_dists.Xor()
   >>> d.set_rv_names(['X', 'Y', 'Z'])
   >>> print(d)
   Class:          Distribution
   Alphabet:       ('0', '1') for all rvs
   Base:           linear
   Outcome Class:  str
   Outcome Length: 3
   RV Names:       ('X', 'Y', 'Z')

   x     p(x)
   000   0.25
   011   0.25
   101   0.25
   110   0.25

Calculate the Shannon mutual informations ``I[X:Z]``, ``I[Y:Z]``, and
``I[X,Y:Z]``.

.. code:: python

   >>> dit.shannon.mutual_information(d, ['X'], ['Z'])
   0.0
   >>> dit.shannon.mutual_information(d, ['Y'], ['Z'])
   0.0
   >>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
   1.0
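
Neither input alone carries any information about the output, yet together they determine it
completely: the dependency is purely synergistic. Equivalently, conditioning on one input makes the
other fully informative, which we can check with the conditional mutual information ``I[X:Y|Z]``
(here computed via ``coinformation`` with a conditioning set):

.. code:: python

   >>> dit.multivariate.coinformation(d, [['X'], ['Y']], ['Z'])
   1.0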

Calculate the marginal distribution ``P(X,Z)``.
Then print its probabilities as fractions, showing the mask.

.. code:: python

   >>> d2 = d.marginal(['X', 'Z'])
   >>> print(d2.to_string(show_mask=True, exact=True))
   Class:          Distribution
   Alphabet:       ('0', '1') for all rvs
   Base:           linear
   Outcome Class:  str
   Outcome Length: 2 (mask: 3)
   RV Names:       ('X', 'Z')

   x     p(x)
   0*0   1/4
   0*1   1/4
   1*0   1/4
   1*1   1/4

Convert the distribution probabilities to log (base 3.5) probabilities, and
access its probability mass function.

.. code:: python

   >>> d2.set_base(3.5)
   >>> d2.pmf
   array([-1.10658951, -1.10658951, -1.10658951, -1.10658951])
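
The repeated value is simply the logarithm, base 3.5, of each outcome's probability of 1/4:

.. code:: python

   >>> import math
   >>> round(math.log(0.25, 3.5), 8)
   -1.10658951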

Draw 5 random samples from this distribution.

.. code:: python

   >>> dit.math.prng.seed(1)
   >>> d2.rand(5)
   ['01', '10', '00', '01', '00']

Contributions & Help
--------------------

If you'd like a feature added to ``dit``, please file an issue. Or, better yet, open a pull request. Ideally, all code should be tested and documented, but please don't let this be a barrier to contributing. We'll work with you to ensure that all pull requests are in a mergeable state.

If you'd like to get in contact about anything, you can reach us through our `Slack channel <https://dit-python.slack.com/>`_.


.. badges:

.. |build| image:: https://github.com/dit/dit/actions/workflows/build.yml/badge.svg
   :target: https://github.com/dit/dit/actions/workflows/build.yml
   :alt: Continuous Integration Status

.. |codecov| image:: https://codecov.io/gh/dit/dit/branch/master/graph/badge.svg
   :target: https://codecov.io/gh/dit/dit
   :alt: Test Coverage Status

.. |docs| image:: https://readthedocs.org/projects/dit/badge/?version=latest
   :target: http://dit.readthedocs.org/en/latest/?badge=latest
   :alt: Documentation Status

.. |codacy| image:: https://api.codacy.com/project/badge/Grade/b1beeea8ada647d49f97648216fd9687
   :target: https://www.codacy.com/app/Autoplectic/dit?utm_source=github.com&amp;utm_medium=referral&amp;utm_content=dit/dit&amp;utm_campaign=Badge_Grade
   :alt: Code Quality

.. |deps| image:: https://requires.io/github/dit/dit/requirements.svg?branch=master
   :target: https://requires.io/github/dit/dit/requirements/?branch=master
   :alt: Requirements Status

.. |conda| image:: https://anaconda.org/conda-forge/dit/badges/installer/conda.svg
   :target: https://anaconda.org/conda-forge/dit
   :alt: Conda installation

.. |zenodo| image:: https://zenodo.org/badge/13201610.svg
   :target: https://zenodo.org/badge/latestdoi/13201610
   :alt: DOI

.. |saythanks| image:: https://img.shields.io/badge/SayThanks.io-%E2%98%BC-1EAEDB.svg
   :target: https://saythanks.io/to/Autoplectic
   :alt: Say Thanks!

.. |slack| image:: https://img.shields.io/badge/Slack-dit--python-lightgrey.svg
   :target: https://dit-python.slack.com/
   :alt: dit chat

.. |joss| image:: http://joss.theoj.org/papers/10.21105/joss.00738/status.svg
   :target: https://doi.org/10.21105/joss.00738
   :alt: JOSS Status

.. |binder| image:: https://mybinder.org/badge.svg
   :target: https://mybinder.org/v2/gh/dit/dit/master?filepath=examples
   :alt: Run `dit` live!

