# Logic Tensor Networks (LTN)
Logic Tensor Network (LTN) is a neurosymbolic framework that supports querying, learning and reasoning with both rich data and rich abstract knowledge about the world.
LTN uses a differentiable first-order logic language, called Real Logic, to incorporate data and logic. The figure below describes features of Real Logic.

LTN converts Real Logic formulas (e.g. `∀x(cat(x) → ∃y(partOf(x,y)∧tail(y)))`) into [TensorFlow](https://www.tensorflow.org/) computational graphs.
Such formulas can express complex queries about the data, prior knowledge to satisfy during learning, or statements to prove. The next two figures describe how Real Logic sentences map to computational graphs (inputs are on the left, outputs are on the right).


With this language, one can represent and effectively compute many of the most important tasks of deep learning, such as classification, regression, clustering, and link prediction.
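To give a flavor of this conversion, below is a minimal sketch of how the example formula above might be written with LTN's Python API. The toy data, the network used for `cat` and `tail`, the similarity-based grounding of `partOf`, and the fuzzy-operator choices are all illustrative assumptions; the tutorials linked below cover the actual options in detail.

```python
import numpy as np
import tensorflow as tf
import ltn

# Unary predicates are grounded here by a small net with outputs in [0, 1].
class MLP(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense1 = tf.keras.layers.Dense(16, activation="elu")
        self.dense2 = tf.keras.layers.Dense(1, activation="sigmoid")

    def call(self, inputs):
        return tf.squeeze(self.dense2(self.dense1(inputs)), axis=-1)

# Variables range over batches of toy feature vectors.
x = ltn.Variable("x", np.random.normal(size=(8, 4)).astype(np.float32))
y = ltn.Variable("y", np.random.normal(size=(8, 4)).astype(np.float32))

cat = ltn.Predicate(MLP())
tail = ltn.Predicate(MLP())
# Illustrative similarity-based grounding for the binary predicate partOf.
partOf = ltn.Predicate.Lambda(
    lambda args: tf.exp(-tf.norm(args[0] - args[1], axis=1)))

And = ltn.Wrapper_Connective(ltn.fuzzy_ops.And_Prod())
Implies = ltn.Wrapper_Connective(ltn.fuzzy_ops.Implies_Reichenbach())
Forall = ltn.Wrapper_Quantifier(ltn.fuzzy_ops.Aggreg_pMeanError(p=2), semantics="forall")
Exists = ltn.Wrapper_Quantifier(ltn.fuzzy_ops.Aggreg_pMean(p=6), semantics="exists")

# The formula compiles to a TensorFlow graph whose output, phi.tensor,
# is a differentiable truth degree in [0, 1].
phi = Forall(x, Implies(cat(x), Exists(y, And(partOf([x, y]), tail(y)))))
print(phi.tensor)
```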
The ["Getting Started"](#getting-started) section of the README links to tutorials and examples of LTN code.

[[Paper]](https://www.sciencedirect.com/science/article/abs/pii/S0004370221002009) -- [[Preprint]](https://arxiv.org/pdf/2012.13635.pdf)

Cite as:
```
@article{badreddine2022logic,
  title = {Logic Tensor Networks},
  journal = {Artificial Intelligence},
  volume = {303},
  pages = {103649},
  year = {2022},
  issn = {0004-3702},
  doi = {10.1016/j.artint.2021.103649},
  author = {Samy Badreddine and Artur {d'Avila Garcez} and Luciano Serafini and Michael Spranger},
  keywords = {Neurosymbolic AI, Deep learning and reasoning, Many-valued logics}
}
```
## Installation
For the latest release version, install via pip. This installs the package with its core dependencies:
```
pip install ltn
```
If you need the dependencies used in the examples, run:
```
pip install "ltn[examples]"
```
For the latest development version, clone the GitHub repository and install it locally (with or without the `[examples]` extra):
```
pip install -e <local project path>
```
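As a quick sanity check that the install works (this snippet is not from the package docs; it just imports the package and grounds a constant):

```python
import tensorflow as tf
import ltn

print(tf.__version__)                             # LTN builds on TensorFlow
print(ltn.Constant([2.1, 3.0], trainable=False))  # a grounding of an individual
```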
## Repository structure
- `ltn/core.py` -- core system for defining constants, variables, predicates, functions and formulas,
- `ltn/fuzzy_ops.py` -- a collection of fuzzy logic operators defined using TensorFlow primitives (see the sketch after this list),
- `ltn/utils.py` -- a collection of useful functions,
- `tutorials/` -- tutorials to start with LTN,
- `examples/` -- various problems approached using LTN,
- `tests/` -- tests.
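To give a flavor of `ltn/fuzzy_ops.py`, the operators can also be applied directly to tensors of truth degrees. A minimal sketch (the operator classes named here are the ones the tutorials use; the input values are arbitrary):

```python
import tensorflow as tf
import ltn

# Product t-norm as fuzzy conjunction: And_Prod(a, b) = a * b.
and_prod = ltn.fuzzy_ops.And_Prod()
print(and_prod(tf.constant(0.9), tf.constant(0.7)))  # -> 0.63

# Reichenbach fuzzy implication: I(a, b) = 1 - a + a * b.
implies = ltn.fuzzy_ops.Implies_Reichenbach()
print(implies(tf.constant(0.9), tf.constant(0.7)))   # -> 0.73
```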
## Getting Started
### Tutorials
`tutorials/` contains a walk-through of LTN. In order, the tutorials cover the following topics:
1. [Grounding in LTN part 1](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/tutorials/1-grounding_non_logical_symbols.ipynb): Real Logic, constants, predicates, functions, variables,
2. [Grounding in LTN part 2](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/tutorials/2-grounding_connectives.ipynb): connectives and quantifiers (+ [complement](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/tutorials/2b-operators_and_gradients.ipynb): choosing appropriate operators for learning),
3. [Learning in LTN](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/tutorials/3-knowledgebase_and_learning.ipynb): using satisfiability of LTN formulas as a training objective (a condensed sketch follows this list),
4. [Reasoning in LTN](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/tutorials/4-reasoning.ipynb): measuring whether a formula is a logical consequence of a knowledge base.

The tutorials are implemented as Jupyter notebooks.
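As a taste of tutorial 3, here is a condensed, self-contained sketch (the toy two-cluster data, architecture, and operator choices are assumptions, not the tutorial's exact code): the satisfaction of a small knowledge base is maximized by minimizing `1 - sat` with ordinary TensorFlow training.

```python
import numpy as np
import tensorflow as tf
import ltn

# Toy data (assumption): positives near (1, 1), negatives near (-1, -1).
pos = np.random.normal(loc=1.0, scale=0.3, size=(50, 2)).astype(np.float32)
neg = np.random.normal(loc=-1.0, scale=0.3, size=(50, 2)).astype(np.float32)

class MLP(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.hidden = tf.keras.layers.Dense(16, activation="elu")
        self.out = tf.keras.layers.Dense(1, activation="sigmoid")

    def call(self, inputs):
        return tf.squeeze(self.out(self.hidden(inputs)), axis=-1)

A = ltn.Predicate(MLP())  # the binary classifier, grounded as a predicate
Not = ltn.Wrapper_Connective(ltn.fuzzy_ops.Not_Std())
And = ltn.Wrapper_Connective(ltn.fuzzy_ops.And_Prod())
Forall = ltn.Wrapper_Quantifier(ltn.fuzzy_ops.Aggreg_pMeanError(p=2), semantics="forall")

def sat():
    """Satisfaction of the knowledge base: forall x_pos A(x_pos), forall x_neg not A(x_neg)."""
    x_pos = ltn.Variable("x_pos", pos)
    x_neg = ltn.Variable("x_neg", neg)
    return And(Forall(x_pos, A(x_pos)), Forall(x_neg, Not(A(x_neg)))).tensor

optimizer = tf.keras.optimizers.Adam(0.01)
for epoch in range(200):
    with tf.GradientTape() as tape:
        loss = 1.0 - sat()  # high satisfiability -> low loss
    grads = tape.gradient(loss, A.trainable_variables)
    optimizer.apply_gradients(zip(grads, A.trainable_variables))
print("final sat:", sat().numpy())
```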
### Examples
`examples/` contains a series of experiments. Their objective is to show how the language of Real Logic can be used to specify a number of tasks that involve learning from data and reasoning about logical knowledge, such as classification, regression, clustering, and link prediction.
- The [binary classification](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/examples/binary_classification/binary_classification.ipynb) example illustrates in the simplest setting how to ground a binary classifier as a predicate in LTN, and how to feed batches of data during training,
- The multiclass classification examples ([single-label](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/examples/multiclass_classification/multiclass-singlelabel.ipynb), [multi-label](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/examples/multiclass_classification/multiclass-multilabel.ipynb)) illustrate how to ground predicates that classify samples into several classes,
- The [MNIST digit addition](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/examples/mnist/single_digits_addition.ipynb) example showcases the power of a neurosymbolic approach in a classification task where ground truth is available only for the final labels (the result of the addition); LTN provides prior knowledge about the intermediate labels (the possible digits used in the addition),
- The [regression](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/examples/regression/regression.ipynb) example illustrates how to ground a regressor as a function symbol in LTN,
- The [clustering](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/examples/clustering/clustering.ipynb) example illustrates how LTN can solve a task using first-order constraints only, without any label being given through supervision,
- The [Smokes Friends Cancer](https://nbviewer.jupyter.org/github/logictensornetworks/logictensornetworks/blob/master/examples/smokes_friends_cancer/smokes_friends_cancer.ipynb) example is a classical link prediction problem of Statistical Relational Learning where LTN learns embeddings for individuals based on fuzzy groundtruths and first-order constraints.

The examples are presented as both Jupyter notebooks and Python scripts.
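A recurring pattern in these examples is `ltn.diag`, which quantifies over aligned pairs of variables (e.g. a sample and its target) instead of all their combinations. A minimal sketch, with an illustrative similarity predicate and toy tensors (both are assumptions):

```python
import numpy as np
import tensorflow as tf
import ltn

x = ltn.Variable("x", np.random.normal(size=(5, 2)).astype(np.float32))
t = ltn.Variable("t", np.random.normal(size=(5, 2)).astype(np.float32))

# Illustrative predicate: similarity between a sample and its target.
Sim = ltn.Predicate.Lambda(
    lambda args: tf.exp(-tf.norm(args[0] - args[1], axis=1)))
Forall = ltn.Wrapper_Quantifier(ltn.fuzzy_ops.Aggreg_pMeanError(p=2), semantics="forall")

print(Forall([x, t], Sim([x, t])).tensor)          # all 5 x 5 combinations
print(Forall(ltn.diag(x, t), Sim([x, t])).tensor)  # only the 5 aligned pairs
```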


## License
This project is licensed under the MIT License - see the [LICENSE](https://github.com/logictensornetworks/logictensornetworks/blob/master/LICENSE) file for details.
## Acknowledgements
LTN has been developed thanks to active contributions and discussions with the following people (in alphabetical order):
- Alessandro Daniele (FBK)
- Artur d’Avila Garcez (City)
- Benedikt Wagner (City)
- Emile van Krieken (VU Amsterdam)
- Francesco Giannini (UniSiena)
- Giuseppe Marra (UniSiena)
- Ivan Donadello (FBK)
- Lucas Bechberger (UniOsnabruck)
- Luciano Serafini (FBK)
- Marco Gori (UniSiena)
- Michael Spranger (Sony AI)
- Michelangelo Diligenti (UniSiena)
- Samy Badreddine (Sony AI)