================================================
ML2: Machine Learning for Mathematics and Logics
================================================
.. image:: https://img.shields.io/pypi/pyversions/ml2
   :target: https://www.python.org
.. image:: https://img.shields.io/pypi/v/ml2
   :target: https://pypi.org/project/ml2/
.. image:: https://img.shields.io/github/license/reactive-systems/ml2
   :target: https://github.com/reactive-systems/ml2/blob/main/LICENSE
ML2 is an open-source Python library for machine learning research on mathematical and logical reasoning problems. It includes (re-)implementations of the research papers `Teaching Temporal Logics to Neural Networks <https://iclr.cc/virtual/2021/poster/3332>`_ and `Neural Circuit Synthesis from Specification Patterns <https://proceedings.neurips.cc/paper/2021/file/8230bea7d54bcdf99cdfe85cb07313d5-Paper.pdf>`_. So far, ML2 focuses on propositional logic, linear-time temporal logic (LTL), and Transformer architectures. ML2 is actively developed at the `CISPA Helmholtz Center for Information Security <https://cispa.de/en>`_.
Requirements
------------
- `Docker <https://www.docker.com>`_
- `Python 3.8 <https://www.python.org/dev/peps/pep-0569/>`_ or later
Note on Docker: For data generation, evaluation, and benchmarking ML2 uses a variety of research tools (e.g. SAT solvers, model checkers, and synthesis tools). For ease of use, each tool is encapsulated in a Docker container that is automatically pulled and started when the tool is needed. Thus, ML2 requires Docker to be installed and running.
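Since the tool containers are pulled and started on demand, a quick pre-flight check that the Docker daemon is reachable can save a confusing failure later. The following sketch is a hypothetical helper, not part of ML2:

.. code:: python

   import shutil
   import subprocess

   # Hypothetical pre-flight check (not part of ML2): verify that the Docker
   # CLI is installed and the daemon is reachable before running data
   # generation, evaluation, or benchmarking.
   def docker_ready() -> bool:
       if shutil.which("docker") is None:
           return False  # Docker CLI not installed
       try:
           # `docker info` exits non-zero when the daemon is not running.
           result = subprocess.run(["docker", "info"], capture_output=True)
           return result.returncode == 0
       except OSError:
           return False

   print("docker:", "ready" if docker_ready() else "unavailable")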
Installation
------------
**Before installing ML2, please note the Docker requirement.**
From PyPI
~~~~~~~~~
Install ML2 from PyPI with ``pip install ml2``.
From Source
~~~~~~~~~~~
To install ML2 from source, clone the git repo and install with pip as follows:
.. code:: shell

   git clone https://github.com/reactive-systems/ml2.git && \
   cd ml2 && \
   pip install .
For development, install with pip in editable mode and include the development dependencies:

.. code:: shell

   pip install -e .[dev]
Neural Circuit Synthesis (`presented at NeurIPS 21 <https://proceedings.neurips.cc/paper/2021/file/8230bea7d54bcdf99cdfe85cb07313d5-Paper.pdf>`_)
--------------------------------------------------------------------------------------------------------------------------------------------------------
In this project, hierarchical Transformers were trained to synthesize hardware circuits directly from high-level specifications in a temporal logic. The lack of sufficient training data was addressed by generating large amounts of additional training data, i.e., pairs of specifications and circuits implementing them, obtained by mining common specification patterns from the annual synthesis competition `SYNTCOMP <https://syntcomp.org>`_.
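As a rough illustration of the pattern-based idea (this is not the paper's actual pipeline, and the pattern strings below are hypothetical examples), mined assumption and guarantee patterns can be recombined into many new specifications:

.. code:: python

   import itertools

   # Hypothetical mined patterns in an LTL-like syntax (illustrative only).
   assumption_patterns = ["G (! (r0 & r1))", "G F r0"]
   guarantee_patterns = ["G (r0 -> F g0)", "G (r1 -> F g1)", "G (! (g0 & g1))"]

   def combine(assumptions, guarantees, num_guarantees=2):
       """Pair every assumption with each combination of guarantees to form
       new candidate synthesis specifications."""
       for gs in itertools.combinations(guarantees, num_guarantees):
           for a in assumptions:
               yield {"assumptions": [a], "guarantees": list(gs)}

   specs = list(combine(assumption_patterns, guarantee_patterns))
   print(len(specs))  # 3 guarantee pairs x 2 assumptions = 6 specifications

Even a small pattern library yields a combinatorially large pool of specifications; the actual method additionally synthesizes a circuit for each specification to obtain labeled training pairs.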
Training
~~~~~~~~
To train a hierarchical Transformer with default parameters:
.. code:: shell

   python -m ml2.experiment.run configs/ltl-syn/neurips21/ht.json
Evaluation
~~~~~~~~~~
To evaluate a hierarchical Transformer trained on the circuit synthesis task:
.. code:: shell

   python -m ml2.experiment.run configs/ltl-syn/neurips21/ht-sem-eval.json
Datasets and Data Generation
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To generate a dataset of specifications and AIGER circuits:
.. code:: shell

   python -m ml2.ltl.ltl_syn.ltl_syn_data_gen_patterns --name dataset
How to Cite
~~~~~~~~~~~
.. code:: tex

   @inproceedings{neural_circuit_synthesis,
       author    = {Frederik Schmitt and Christopher Hahn and Markus N. Rabe and Bernd Finkbeiner},
       title     = {Neural Circuit Synthesis from Specification Patterns},
       booktitle = {Advances in Neural Information Processing Systems 34 Pre-proceedings},
       year      = {2021},
       url       = {https://proceedings.neurips.cc/paper/2021/hash/8230bea7d54bcdf99cdfe85cb07313d5-Abstract.html}
   }
Teaching Temporal Logics to Neural Networks (`presented at ICLR 21 <https://iclr.cc/virtual/2021/poster/3332>`_)
-------------------------------------------------------------------------------------------------------------------
In this project, Transformers were trained on the problem of finding a satisfying trace to a linear-time temporal logic (LTL) formula. While the training data was generated with classical solvers, providing only one of a possibly infinite number of solutions, the Transformers successfully generalized: while often deviating from the solutions found by the classical solver, they still predicted a correct solution to most formulas. Generalization was also demonstrated on larger formulas and formulas on which the classical solver timed out.
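Because a predicted trace may deviate from the solver's output while still being correct, predictions are judged semantically rather than by string match. The following minimal, self-contained sketch (not ML2's actual API) evaluates a small LTL fragment over an ultimately periodic "lasso" trace, a finite prefix followed by an infinitely repeated loop:

.. code:: python

   def state(prefix, loop, i):
       """Set of atomic propositions holding at position i of prefix + loop^w."""
       return prefix[i] if i < len(prefix) else loop[(i - len(prefix)) % len(loop)]

   def holds(f, prefix, loop, i=0):
       """Evaluate formula f (nested tuples) at position i of a lasso trace.
       F/G scan up to one full loop iteration past the prefix, which is exact
       for propositional operands; nested temporal operators would need a
       fixpoint computation in a full checker."""
       op = f[0]
       if op == "ap":
           return f[1] in state(prefix, loop, i)
       if op == "!":
           return not holds(f[1], prefix, loop, i)
       if op == "&":
           return holds(f[1], prefix, loop, i) and holds(f[2], prefix, loop, i)
       if op == "X":
           return holds(f[1], prefix, loop, i + 1)
       horizon = max(i, len(prefix)) + len(loop)
       if op == "F":
           return any(holds(f[1], prefix, loop, j) for j in range(i, horizon))
       if op == "G":
           return all(holds(f[1], prefix, loop, j) for j in range(i, horizon))
       raise ValueError(f"unknown operator {op}")

   # A trace predicted for the formula (F b) & (G a): "a" holds from the
   # start, "b" holds from the loop onwards.
   prefix, loop = [{"a"}], [{"a", "b"}]
   formula = ("&", ("F", ("ap", "b")), ("G", ("ap", "a")))
   print(holds(formula, prefix, loop))  # True

Any trace that passes such a semantic check counts as correct, even if it differs from the one the classical solver produced.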
Training
~~~~~~~~
To train a Transformer with default parameters on the trace generation problem:
.. code:: shell

   python -m ml2.experiment.run configs/ltl-strace/iclr21/t.json
For the propositional satisfiability experiment:
.. code:: shell

   python -m ml2.experiment.run configs/prop-sat/iclr21/t.json
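For intuition on the propositional task (illustrative only, not ML2 code): the model predicts a satisfying assignment, and any prediction can be verified cheaply against the CNF formula. A brute-force enumerator stands in below for the classical solvers that label the training data:

.. code:: python

   from itertools import product

   def check(cnf, assignment):
       """True iff assignment satisfies the CNF. Clauses are lists of int
       literals: positive = variable true, negative = false, 1-indexed."""
       return all(
           any((lit > 0) == assignment[abs(lit)] for lit in clause)
           for clause in cnf
       )

   def brute_force_sat(cnf, num_vars):
       """Enumerate all assignments; return the first satisfying one or None."""
       for bits in product([False, True], repeat=num_vars):
           assignment = {v + 1: bits[v] for v in range(num_vars)}
           if check(cnf, assignment):
               return assignment
       return None

   # (x1 | x2) & (!x1 | x3) & (!x2 | !x3)
   cnf = [[1, 2], [-1, 3], [-2, -3]]
   model = brute_force_sat(cnf, 3)
   print(model, check(cnf, model))

The cheap ``check`` step is what makes learned predictions easy to evaluate: correctness of an assignment is verified in linear time regardless of how it was produced.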
Evaluation
~~~~~~~~~~
To evaluate a Transformer trained on the trace generation problem:
.. code:: shell

   python -m ml2.experiment.run configs/ltl-strace/iclr21/t-sem-eval.json
How to Cite
~~~~~~~~~~~
.. code:: tex

   @inproceedings{teaching_temporal_logics,
       title     = {Teaching Temporal Logics to Neural Networks},
       author    = {Christopher Hahn and Frederik Schmitt and Jens U. Kreber and Markus N. Rabe and Bernd Finkbeiner},
       booktitle = {International Conference on Learning Representations},
       year      = {2021},
       url       = {https://openreview.net/forum?id=dOcQK-f4byz}
   }