Name | SparkleAI JSON |
Version |
0.9.1.2
JSON |
| download |
home_page | https://github.com/thijssnelleman/Sparkle |
Summary | Sparkle is a Programming by Optimisation (PbO)-based problem-solving platform designed to enable the widespread and effective use of PbO techniques for improving the state-of-the-art in solving a broad range of prominent AI problems, including SAT and AI Planning. |
upload_time | 2024-12-12 07:26:09 |
maintainer | None |
docs_url | None |
author | Thijs Snelleman |
requires_python | None |
license | None |
keywords |
ai
sat
planning
|
VCS |
|
bugtrack_url |
|
requirements |
No requirements were recorded.
|
Travis-CI |
No Travis.
|
coveralls test coverage |
No coveralls.
|
# _Sparkle_
[![Tests](https://ada-research.github.io/Sparkle/_static/junit/junit-badge.svg)](https://ada-research.github.io/Sparkle/_static/junit/index.html)
![tests status](https://github.com/ada-research/sparkle/actions/workflows/unittest.yml/badge.svg?event=push)
[![Coverage Status](https://ada-research.github.io/Sparkle/_static/coverage/coverage-badge.svg)](https://ada-research.github.io/Sparkle/_static/coverage/index.html)
![linter](https://github.com/ada-research/sparkle/actions/workflows/linter.yml/badge.svg?event=push)
![docs](https://github.com/ada-research/sparkle/actions/workflows/documentation.yml/badge.svg?event=push)
> A Programming by Optimisation (PbO)-based problem-solving platform designed to enable the widespread and effective use of PbO techniques for improving the state-of-the-art in solving a broad range of prominent AI problems, including SAT and AI Planning.
Specifically, Sparkle facilitates the use of:
* Automated algorithm configuration
* Automated algorithm selection
Furthermore, Sparkle handles various tasks for the user such as:
* Algorithm meta information collection and statistics calculation
* Instance/Data Set management and feature extraction
* Compute cluster job submission and monitoring
* Log file collection
## Installation
The quickest and most complete way to install Sparkle is with Conda (for Conda installation instructions, see [here](https://docs.conda.io/en/latest/miniconda.html)).
Simply download the `environment.yml` file from [GitHub](https://github.com/ADA-research/Sparkle/blob/main/environment.yml) with wget:
```bash
wget https://raw.githubusercontent.com/ADA-research/Sparkle/main/environment.yml
```
and run:
```bash
conda env create -f environment.yml
```
Creating the environment may take up to five minutes, depending on your internet connection.
Once the environment has been created, it can be activated with:
```bash
conda activate sparkle
```
```{note}
The creation of the Conda environment also takes care of the installation of the Sparkle package itself.
```
```{note}
You will need to reactivate the environment every time you open a new terminal, before using Sparkle.
```
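If you want to check that the environment was created before activating it, you can list your Conda environments; `sparkle` should appear in the output:
```bash
# List all Conda environments; "sparkle" should be among them
conda env list
```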
Sparkle can also be installed as a standalone package using Pip. We recommend creating a new virtual environment first (for example, with [venv](https://docs.python.org/3/library/venv.html)) to ensure no clashes between dependencies occur.
```bash
pip install SparkleAI
```
Note that a direct installation through Pip does not handle certain dependencies of the Sparkle CLI, such as the required libraries for compiling [RunSolver](https://www.cril.univ-artois.fr/~roussel/runsolver/).
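If you installed via Pip, a quick way to confirm the package is available is to query its metadata:
```bash
# Show version, install location and dependencies of the installed package
pip show SparkleAI
```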
### Install dependencies
Aside from several package dependencies, the Sparkle package / CLI relies on a few user-supplied executables:
- A `LaTeX` compiler ([pdflatex](https://gist.github.com/rain1024/98dd5e2c6c8c28f9ea9d)) for report generation
- `Java`, tested with version 1.8.0_402, to use SMAC2
- `R`, tested with version 4.3.1, to use IRACE
Other dependencies are handled by the Conda environment, but if that is not an option for you, please ensure you have the following (a quick availability check is sketched after this list):
- [libnuma](https://anaconda.org/esrf-bcu/libnuma) and [numactl](https://anaconda.org/brown-data-science/numactl) for compiling [Runsolver](http://www.cril.univ-artois.fr/~roussel/runsolver/), which Sparkle uses to measure solver metadata. This is restricted to Linux-based systems.
- [Swig](https://anaconda.org/conda-forge/swig/) 4.0.2 for [SMAC3](https://github.com/automl/SMAC3), which is in turn used by [AutoFolio](https://github.com/automl/AutoFolio).
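As a quick sanity check, you can verify that the user-supplied executables listed above are available on your `PATH`. This is only a sketch that prints version information, assuming the standard command names (`pdflatex`, `java`, `R`, `swig`):
```bash
# Each command should print version information;
# a "command not found" error means the corresponding dependency is missing.
pdflatex --version | head -n 1
java -version        # Java prints its version to stderr
R --version | head -n 1
swig -version
```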
For detailed installation instructions see the documentation: https://ada-research.github.io/Sparkle/
### Developer installation
The file `dev-env.yml` is used for the developer installation of the Sparkle package and contains several extra packages for testing; a minimal example of creating this environment is shown below.
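A minimal sketch of creating and activating the developer environment, assuming `dev-env.yml` sits in the repository root next to `environment.yml`:
```bash
# Create the developer environment (named sparkle-dev) and activate it
conda env create -f dev-env.yml
conda activate sparkle-dev
```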
The two environments can be created in parallel, since one is named `sparkle` and the other `sparkle-dev`. If you want to update an environment, it is better to do a clean installation by removing and recreating it. For example:
```bash
conda deactivate
conda env remove -n sparkle
conda env create -f environment.yml
conda activate sparkle
```
This should be fast, as both `conda` and `pip` use a local cache for packages.
#### Examples
See the `Examples` directory for examples of how to use `Sparkle` (one way to obtain it is sketched below). All Sparkle CLI commands need to be executed from the root of an initialised Sparkle directory.
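If you installed Sparkle through Conda or Pip, one straightforward way to obtain the `Examples` directory is to clone the repository:
```bash
# Fetch the repository to get the Examples directory and bundled resources
git clone https://github.com/ADA-research/Sparkle.git
cd Sparkle/Examples
```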
#### Documentation
The documentation can be read at https://ada-research.github.io/Sparkle/.
A `PDF` is also available in the [repository](https://raw.githubusercontent.com/ADA-research/Sparkle/main/Documentation/sparkle-userguide.pdf).
#### Licensing
Sparkle is distributed under the MIT licence.
##### Component licences
Sparkle is distributed with a number of external components, solvers, and instance sets. Descriptions and licensing information for each of these are included in the `sparkle/Components` and `Examples/Resources/` directories.
The SATzilla 2012 feature extractor is taken from `http://www.cs.ubc.ca/labs/beta/Projects/SATzilla/` with some modifications. The main modification disables the call to the SAT instance preprocessor SatELite. The modified extractor is located in `Examples/Resources/Extractors/SAT-features-competition2012_revised_without_SatELite_sparkle/`.
### Citation
If you use Sparkle for one of your papers and want to cite it, please cite our [paper](https://doi.org/10.1109/TEVC.2022.3215013) describing Sparkle:
K. van der Blom, H. H. Hoos, C. Luo and J. G. Rook, **Sparkle: Toward Accessible Meta-Algorithmics for Improving the State of the Art in Solving Challenging Problems**, in _IEEE Transactions on Evolutionary Computation_, vol. 26, no. 6, pp. 1351-1364, Dec. 2022, doi: 10.1109/TEVC.2022.3215013.
```bibtex
@article{BloEtAl22,
title={Sparkle: Toward Accessible Meta-Algorithmics for Improving the State of the Art in Solving Challenging Problems},
author={van der Blom, Koen and Hoos, Holger H. and Luo, Chuan and Rook, Jeroen G.},
journal={IEEE Transactions on Evolutionary Computation},
year={2022},
volume={26},
number={6},
pages={1351--1364},
doi={10.1109/TEVC.2022.3215013}
}
```
### Maintainers
Thijs Snelleman,
Jeroen Rook,
Holger H. Hoos
### Contributors
Chuan Luo,
Richard Middelkoop,
Jérémie Gobeil,
Sam Vermeulen,
Marcel Baumann,
Jakob Bossek,
Tarek Junied,
Yingliu Lu,
Malte Schwerin,
Aaron Berger,
Marie Anastacio,
Koen van der Blom,
Noah Peil,
Brian Schiller
### Contact
sparkle@aim.rwth-aachen.de
### Sponsors
The development of Sparkle is partially sponsored by the [Alexander von Humboldt foundation](https://www.humboldt-foundation.de/en/).
Raw data
```json
{
"_id": null,
"home_page": "https://github.com/thijssnelleman/Sparkle",
"name": "SparkleAI",
"maintainer": null,
"docs_url": null,
"requires_python": null,
"maintainer_email": null,
"keywords": "ai sat planning",
"author": "Thijs Snelleman",
"author_email": "fkt_sparkle@aim.rwth-aachen.de",
"download_url": "https://files.pythonhosted.org/packages/90/8c/8ffa2526a3495346bdccb1e2730b4af71686e88ca7352f2552e05f654ac3/sparkleai-0.9.1.2.tar.gz",
"platform": null,
"description": "# _Sparkle_\n\n[![Tests](https://ada-research.github.io/Sparkle/_static/junit/junit-badge.svg)](https://ada-research.github.io/Sparkle/_static/junit/index.html)\n![tests status](https://github.com/ada-research/sparkle/actions/workflows/unittest.yml/badge.svg?event=push)\n[![Coverage Status](https://ada-research.github.io/Sparkle/_static/coverage/coverage-badge.svg)](https://ada-research.github.io/Sparkle/_static/coverage/index.html)\n![linter](https://github.com/ada-research/sparkle/actions/workflows/linter.yml/badge.svg?event=push)\n![docs](https://github.com/ada-research/sparkle/actions/workflows/documentation.yml/badge.svg?event=push)\n\n> A Programming by Optimisation (PbO)-based problem-solving platform designed to enable the widespread and effective use of PbO techniques for improving the state-of-the-art in solving a broad range of prominent AI problems, including SAT and AI Planning.\n\nSpecifically, Sparkle facilitates the use of:\n\n * Automated algorithm configuration\n * Automated algorithm selection\n\nFurthermore, Sparkle handles various tasks for the user such as:\n\n * Algorithm meta information collection and statistics calculation\n * Instance/Data Set management and feature extraction\n * Compute cluster job submission and monitoring\n * Log file collection\n\n## Installation\n\nThe quick and full installation of Sparkle can be done using Conda (For Conda installation see [here]( https://docs.conda.io/en/latest/miniconda.html)). \n\nSimply download the `environment.yml` file from the [Github](https://github.com/ADA-research/Sparkle/blob/main/environment.yml) with wget:\n\n```bash\nwget https://raw.githubusercontent.com/ADA-research/Sparkle/main/environment.yml\n```\n\nand run:\n\n```bash\nconda env create -f environment.yml\n```\n\nThe installation of the environment may take up to five minutes depending on your internet connection.\nOnce the environment has been created it can be activated by:\n\n```\nconda activate sparkle\n```\n\n```{note}\nThe creation of the Conda environment also takes care of the installation of the Sparkle package itself. \n```\n\n```{note}\nYou will need to reactivate the environment every time you start the terminal, before using Sparkle.\n```\n\nSparkle can also be installed as a standalone package using Pip. We recommend creating a new virtual environment (For example, [venv](https://docs.python.org/3/library/venv.html)) before to ensure no clashes between dependencies occur. \n\n```bash\npip install SparkleAI\n```\n\nNote that a direct installation through Pip does not handle certain dependencies of the Sparkle CLI, such as the required libraries for compiling [RunSolver](https://www.cril.univ-artois.fr/~roussel/runsolver/).\n\n### Install dependencies\nAsside from several package dependencies, Sparkle's package / CLI relies on a few user supplied executables:\n- `LaTex` compiler ([pdflatex](https://gist.github.com/rain1024/98dd5e2c6c8c28f9ea9d)) for report generation\n- `Java`, tested with version 1.8.0_402, in order to use SMAC2\n- `R`, tested with version 4.3.1 in order to use IRACE\n\nOther dependencies are handled by the Conda environment, but if that is not an option for you please ensure you have the following:\n\n- [libnuma](https://anaconda.org/esrf-bcu/libnuma) and [numactl](https://anaconda.org/brown-data-science/numactl) for [Runsolver](http://www.cril.univ-artois.fr/~roussel/runsolver/) compilation which sparkle uses to measure solvers meta data. 
This is restricted to Linux based systems.\n- [Swig](https://anaconda.org/conda-forge/swig/) 4.0.2 for [SMAC3](https://github.com/automl/SMAC3), which is in turn used by [AutoFolio](https://github.com/automl/AutoFolio).\n\nFor detailed installation instructions see the documentation: https://ada-research.github.io/Sparkle/\n\n### Developer installation\n\nThe file `dev-env.yml` is used for developer mode of the Sparkle package and contains several extra packages for testing.\n\nThe two environments can be created in parallel since one is named `sparkle` and the other `sparkle-dev`. If you want to update an environment it is better to do a clean installation by removing and recreating it. For example:\n\n```\nconda deactivate\nconda env remove -n sparkle\nconda env create -f environment.yml\nconda activate sparkle\n```\n\nThis should be fast as both `conda` and `pip` use local cache for the packages.\n\n#### Examples\n\nSee the `Examples` directory for some examples on how to use `Sparkle`. All Sparkle CLI commands need to be executed from the root of the initialised Sparkle directory.\n\n#### Documentation\n\nThe documentation can be read at https://ada-research.github.io/Sparkle/. \n\nA `PDF` is also available in the [repository](https://raw.githubusercontent.com/ADA-research/Sparkle/main/Documentation/sparkle-userguide.pdf).\n\n#### Licensing\n\nSparkle is distributed under the MIT licence\n\n##### Component licences \n\nSparkle is distributed with a number of external components, solvers, and instance sets. Descriptions and licensing information for each these are included in the `sparkle/Components` and `Examples/Resources/` directories.\n\nThe SATzilla 2012 feature extractor is used from `http://www.cs.ubc.ca/labs/beta/Projects/SATzilla/` with some modifications. The main modification of this component is to disable calling the SAT instance preprocessor called SatELite. It is located in: `Examples/Resources/Extractors/SAT-features-competition2012_revised_without_SatELite_sparkle/`\n\n### Citation\n\nIf you use Sparkle for one of your papers and want to cite it, please cite our [paper](https://doi.org/10.1109/TEVC.2022.3215013) describing Sparkle:\nK. van der Blom, H. H. Hoos, C. Luo and J. G. Rook, **Sparkle: Toward Accessible Meta-Algorithmics for Improving the State of the Art in Solving Challenging Problems**, in _IEEE Transactions on Evolutionary Computation_, vol. 26, no. 6, pp. 1351-1364, Dec. 2022, doi: 10.1109/TEVC.2022.3215013.\n```\n@article{BloEtAl22,\n title={Sparkle: Toward Accessible Meta-Algorithmics for Improving the State of the Art in Solving Challenging Problems}, \n author={van der Blom, Koen and Hoos, Holger H. and Luo, Chuan and Rook, Jeroen G.},\n journal={IEEE Transactions on Evolutionary Computation}, \n year={2022},\n volume={26},\n number={6},\n pages={1351--1364},\n doi={10.1109/TEVC.2022.3215013}\n}\n```\n\n### Maintainers\nThijs Snelleman,\nJeroen Rook,\nHolger H. Hoos,\n\n### Contributors\nChuan Luo,\nRichard Middelkoop,\nJ\u00e9r\u00e9mie Gobeil,\nSam Vermeulen,\nMarcel Baumann,\nJakob Bossek,\nTarek Junied,\nYingliu Lu,\nMalte Schwerin,\nAaron Berger,\nMarie Anastacio,\nAaron Berger\nKoen van der Blom,\nNoah Peil,\nBrian Schiller\n\n### Contact\nsparkle@aim.rwth-aachen.de\n\n\n### Sponsors\n\nThe development of Sparkle is partially sponsored by the [Alexander von Humboldt foundation](https://www.humboldt-foundation.de/en/).\n",
"bugtrack_url": null,
"license": null,
"summary": "Sparkle is a Programming by Optimisation (PbO)-based problem-solving platform designed to enable the widespread and effective use of PbO techniques for improving the state-of-the-art in solving a broad range of prominent AI problems, including SAT and AI Planning.",
"version": "0.9.1.2",
"project_urls": {
"Homepage": "https://github.com/thijssnelleman/Sparkle"
},
"split_keywords": [
"ai",
"sat",
"planning"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "908c8ffa2526a3495346bdccb1e2730b4af71686e88ca7352f2552e05f654ac3",
"md5": "e6e1c11cb92fa571bcbff6fdb6ec236c",
"sha256": "10b639eba8ad9c0d6738a2cd24c9b2c1865c31ce1bfbfda57ecdc2058cc28a02"
},
"downloads": -1,
"filename": "sparkleai-0.9.1.2.tar.gz",
"has_sig": false,
"md5_digest": "e6e1c11cb92fa571bcbff6fdb6ec236c",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 30036189,
"upload_time": "2024-12-12T07:26:09",
"upload_time_iso_8601": "2024-12-12T07:26:09.574848Z",
"url": "https://files.pythonhosted.org/packages/90/8c/8ffa2526a3495346bdccb1e2730b4af71686e88ca7352f2552e05f654ac3/sparkleai-0.9.1.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-12-12 07:26:09",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "thijssnelleman",
"github_project": "Sparkle",
"github_not_found": true,
"lcname": "sparkleai"
}
```