Dictances
=========================================================================================
|pip| |downloads|
Distances and divergences between discrete distributions described as dictionaries, implemented in Python.
These are meant as fast solutions for computing distances and divergences between discrete distributions,
especially when the two distributions contain a significant number of events with zero probability
that are not listed in the dictionaries.
How do I install this package?
----------------------------------------------
As usual, just install it using pip:

.. code:: shell

    pip install dictances

Available metrics
-----------------------------------------------
A number of distances and divergences are available; a brief sketch of their shared call signature follows the table:
.. role:: python(code)
   :language: python

+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| Distances | Methods |
+================================================================================================================+=================================================+
| `Bhattacharyya distance <https://en.wikipedia.org/wiki/Bhattacharyya_distance>`__ | :python:`bhattacharyya` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Bhattacharyya coefficient <https://en.wikipedia.org/wiki/Bhattacharyya_distance#Bhattacharyya_coefficient>`__ | :python:`bhattacharyya_coefficient` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Canberra distance <https://en.wikipedia.org/wiki/Canberra_distance>`__ | :python:`canberra` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Chebyshev distance <https://en.wikipedia.org/wiki/Chebyshev_distance>`__ | :python:`chebyshev` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Chi Square distance <https://en.wikipedia.org/wiki/Chi-squared_test>`__ | :python:`chi_square` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Cosine Distance <https://en.wikipedia.org/wiki/Cosine_similarity>`__ | :python:`cosine` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Euclidean distance <https://en.wikipedia.org/wiki/Euclidean_distance>`__ | :python:`euclidean` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Hamming distance <https://en.wikipedia.org/wiki/Hamming_distance>`__ | :python:`hamming` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Jensen-Shannon divergence <https://en.wikipedia.org/wiki/Jensen%E2%80%93Shannon_divergence>`__ | :python:`jensen_shannon` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Kullback-Leibler divergence <https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence>`__ | :python:`kullback_leibler` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Mean absolute error <https://en.wikipedia.org/wiki/Mean_absolute_error>`__ | :python:`mae` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Taxicab geometry <https://en.wikipedia.org/wiki/Taxicab_geometry>`__ | :python:`manhattan, cityblock, total_variation` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Minkowski distance <https://en.wikipedia.org/wiki/Minkowski_distance>`__ | :python:`minkowsky` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Mean squared error <https://en.wikipedia.org/wiki/Mean_squared_error>`__ | :python:`mse` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Pearson's distance <https://en.wikipedia.org/wiki/Pearson_correlation_coefficient#Pearson's_distance>`__ | :python:`pearson` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
| `Squared deviations from the mean <https://en.wikipedia.org/wiki/Squared_deviations_from_the_mean>`__ | :python:`squared_variation` |
+----------------------------------------------------------------------------------------------------------------+-------------------------------------------------+
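All of the methods above are expected to share the same basic call signature: they take the two dictionaries as positional arguments and return a float. A minimal sketch under that assumption (the dictionaries `p` and `q` below are made up for illustration):

.. code:: python

    from dictances import euclidean, canberra, chebyshev

    # Two sparse vectors described as dictionaries; keys missing from one
    # of the two dictionaries are assumed to count as zero.
    p = {"a": 1.0, "b": 2.0}
    q = {"b": 1.0, "c": 4.0}

    # Each of the listed metrics takes two dictionaries and returns a float.
    print(euclidean(p, q))
    print(canberra(p, q))
    print(chebyshev(p, q))
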
Usage example with points
--------------------------------------
Suppose you have a point described by `my_first_dictionary` and another one described by `my_second_dictionary`:
.. code:: python

    from dictances import cosine

    my_first_dictionary = {
        "a": 56,
        "b": 34,
        "c": 89
    }

    my_second_dictionary = {
        "a": 21,
        "d": 51,
        "e": 74
    }

    cosine(my_first_dictionary, my_second_dictionary)
    #>>> 0.8847005261889619

Usage example with distributions
-----------------------------------------
Suppose you have a first distribution described by `a` and a second one described by `b`:
.. code:: python

    from dictances import bhattacharyya, bhattacharyya_coefficient

    a = {
        "event_1": 0.4,
        "event_2": 0.1,
        "event_3": 0.2,
        "event_4": 0.3,
    }
    b = {
        "event_1": 0.1,
        "event_2": 0.2,
        "event_5": 0.2,
        "event_9": 0.5,
    }

    bhattacharyya_coefficient(a, b)
    #>>> 0.3414213562373095
    bhattacharyya(a, b)
    #>>> 1.07463791569453

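As mentioned in the introduction, events that appear in only one of the two dictionaries are assumed to count as zero probability in the other. A small sketch illustrating this with the taxicab distance (`manhattan`, also exposed as `total_variation`); the expected value in the comment is hand-computed under that assumption, not library output:

.. code:: python

    from dictances import manhattan

    a = {"event_1": 0.4, "event_2": 0.1, "event_3": 0.2, "event_4": 0.3}
    b = {"event_1": 0.1, "event_2": 0.2, "event_5": 0.2, "event_9": 0.5}

    # "event_3", "event_4", "event_5" and "event_9" appear in only one of the
    # two dictionaries, so their probability in the other is taken as zero.
    manhattan(a, b)
    # Hand-computed sum of |p - q| over the union of events:
    # 0.3 + 0.1 + 0.2 + 0.3 + 0.2 + 0.5 = 1.6
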
Handling nested dictionaries
------------------------------------------
If you need to compute the distance between two nested dictionaries, you can use `deflate_dict <https://github.com/LucaCappelletti94/deflate_dict>`_ as follows:
.. code:: python

    from dictances import cosine
    from deflate_dict import deflate

    my_first_dictionary = {
        "a": 8,
        "b": {
            "c": 3,
            "d": 6
        }
    }

    my_second_dictionary = {
        "b": {
            "c": 8,
            "d": 1
        },
        "y": 3,
    }

    cosine(deflate(my_first_dictionary), deflate(my_second_dictionary))

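For reference, `deflate` turns each nested dictionary into a flat one before the distance is computed; a minimal sketch of the idea, assuming nested keys are joined with a separator (see the `deflate_dict` documentation for the exact key format):

.. code:: python

    from deflate_dict import deflate

    nested = {
        "a": 8,
        "b": {
            "c": 3,
            "d": 6
        }
    }

    # Each leaf value gets its own flat key; the result is assumed to look
    # roughly like {"a": 8, "b_c": 3, "b_d": 6}, with the exact separator
    # depending on the deflate_dict version.
    print(deflate(nested))
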
.. |pip| image:: https://badge.fury.io/py/dictances.svg
   :target: https://badge.fury.io/py/dictances
   :alt: Pypi project

.. |downloads| image:: https://pepy.tech/badge/dictances
   :target: https://pepy.tech/badge/dictances
   :alt: Pypi total project downloads