| Field | Value |
| --- | --- |
| Name | csp-llm |
| Version | 0.1.5 |
| Home page | None |
| Summary | Python app using llm via MCP for modeling and solving a csp problem in pycsp3 |
| Upload time | 2025-07-15 09:46:57 |
| Maintainer | None |
| Docs URL | None |
| Author | None |
| Requires Python | >=3.10 |
| License | MIT |
| Keywords | pycsp3, csp, ai, llm, openai, anthropic |
| Requirements | No requirements were recorded. |
## 💡 About csp-llm
csp-llm is a Python package that runs as an AI agent to automate the generation and execution of [PyCSP3](https://github.com/xcsp3team/pycsp3/) code for constraint problems.

It provides an interactive, customizable web user interface in which the user enters or imports the description of a constraint problem in natural language. The description is sent to a pre-configured LLM, which generates the code and displays it to the user. The user can either ask for the code to be executed directly or modify the generated code before requesting its execution.

The application can connect to a wide range of LLM providers (models deployed within CRIL, Anthropic models, OpenAI models, Google models, etc.).

Once installed, the application comes with a few example constraint problems that the user can experiment with.

**NB:** The generated code may contain errors, in which case the user can correct them via the interface.
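To give an idea of what the assistant produces, here is a small, hypothetical PyCSP3 model of the kind the application might generate (a sketch for illustration only, not actual application output):

```python
# Hypothetical example of the kind of PyCSP3 code the application might generate
# (not actual csp-llm output). Three tasks must get distinct time slots in
# {0, 1, 2}, and task 0 must be scheduled before task 2.
from pycsp3 import *

x = VarArray(size=3, dom=range(3))

satisfy(
    AllDifferent(x),   # all tasks take different slots
    x[0] < x[2]        # task 0 runs before task 2
)

if solve() is SAT:
    print(values(x))   # e.g. [0, 2, 1]
```

Executing such a model requires a Java runtime, which PyCSP3 uses to compile and solve the instance (hence the Java requirement below).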
## 🛑 Requirements
- Runs on Linux and macOS (tested with bash on Linux and zsh on macOS).
- Access to an LLM platform. LLM models from CRIL are proposed by default; those with a LAN account can use their API key. For further information, please contact Alain Kemgue (kemgue@cril.fr).
- Python 3 installed (version **3.10** or higher).
- Java installed to run PyCSP3 (Java 8 or higher). A quick way to check both prerequisites is shown below.
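A minimal sketch of how to check the Python and Java prerequisites from a terminal (the exact output format may vary on your system):

```bash
# Check that Python 3.10+ and Java 8+ are available
python3 --version    # expect: Python 3.10 or higher
java -version        # expect: version 8 or higher (e.g. openjdk 21.x)
```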
## 📦 Installation
We recommend installing the application in a Python virtual environment.
### Virtual environment installation
```bash
python3 -m venv venv
source venv/bin/activate
```
### Installing the csp-llm package from PyPI
```bash
pip install csp-llm
```
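Optionally, you can check that the package and its launch script were installed correctly (a sanity check, not a required step):

```bash
pip show csp-llm         # prints the installed version and metadata
launch-csp-llm --help    # confirms the launch script is on your PATH
```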
### Launch the application
```bash
(venv) ordi@alain% launch-csp-llm
🚀 Launching the application...
💡 Application dependencies
missing ScriptRunContext! This warning can be ignored when running in bare mode.
✅ anthropic 0.55.0
✅ openai 1.92.2
✅ streamlit 1.46.1
✅ streamlit_ace 0.1.1
✅ dotenv
✅ pycsp3 2.5.1
✅ Java 21.0.7 detected (>= 8)
🌐 Application available at: http://localhost:8501
💡 Press Ctrl+C to stop
--------------------------------------------------
You can now view your Streamlit app in your browser.
URL: http://localhost:8501
**************************************************
```
The application is then available at http://localhost:8501
You can change port and host by passing parameters to the launch script.
```bash
(venv) ordi@alain% launch-csp-llm --help
usage: launch-csp-llm [-h] [--host HOST] [--port PORT] [-ev]
Launch the application
options:
  -h, --help   show this help message and exit
  --host HOST
  --port PORT
  -ev
```
Example of launching on port 3000 with host 0.0.0.0 (which makes the application accessible on the whole network):
```bash
(venv) ordi@alain% launch-csp-llm --port 3000 --host 0.0.0.0
🚀 Launching the application...
💡 Application dependencies
✅ anthropic 0.55.0
✅ openai 1.92.2
✅ streamlit 1.46.1
✅ streamlit_ace 0.1.1
✅ dotenv
✅ pycsp3 2.5.1
✅ Java 21.0.7 detected (>= 8)
🌐 Application available at: http://0.0.0.0:3000
💡 Press Ctrl+C to stop
--------------------------------------------------
You can now view your Streamlit app in your browser.
URL: http://0.0.0.0:3000
**************************************************
```
## Raw data
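The block below is the package's raw metadata. Assuming it mirrors what the PyPI JSON API returns for this project, a similar payload can be fetched directly:

```bash
# Fetch and pretty-print the package metadata from the PyPI JSON API
curl -s https://pypi.org/pypi/csp-llm/json | python3 -m json.tool
```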
{
"_id": null,
"home_page": null,
"name": "csp-llm",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": "Alain Kemgue <kemgue@cril.fr>",
"keywords": "pycsp3, csp, ai, llm, openai, anthropic",
"author": null,
"author_email": "Alain Kemgue <kemgue@cril.fr>",
"download_url": null,
"platform": null,
"description": "## \ud83d\udca1 About csp-llm\n\ncsp-llm is a python package running as an AI agent to enable the automatic generation and execution of [PyCSP3](https://github.com/xcsp3team/pycsp3/) code for a constraint problem.\n\nIt provides an interactive and customizable web user interface, which allows the user to enter or import the description of a constraint problem in natural language. The problem description is then sent to a pre-configured LLM model, which will generate and display the code to the user. The user can either request that the code be executed directly, or make modifications to the generated code before requesting its execution.\n\n\nThe application incorporates modern technologies to run any type of LLM model (LLM models deployed within CRIL, Anthropic models, OpenAI models, Google models, etc.).\n\nOnce installed, the application offers a few examples of constraint problems with which the user can have fun testing.\n\n\n**NB: *It may happen that the generated code contains errors, in which case the user is given the opportunity to correct them via the interface.\n\n\n## \ud83d\uded1 Requirements\n\n- Run on Linux and Mac platforms (tested on bash linux and zsh mac).\n- Have access to an LLM platform. LLM templates from CRIL are proposed by default. Those with a LAN account can use their API key. For further information, please contact Alain Kemgue( kemgue@cril.fr )\n- Have installed a version of python3 (**3.10** or higher)\n- Have installed a version of java to run pycsp3 (java 8 or higher)\n\n## \ud83d\udce6 Installation\n\nWe recommend installing the application in a python virtual environment.\n\n### Virtual environment installation\n\n```bash\npython3 -m venv venv\nsource venv/bin/activate\n\n```\n\n### Installing the csp-llm package on PyPi\n\n```bash\npip install csp-llm\n\n```\n\n### Launch the application\n\n```bash\n(venv) ordi@alain% launch-csp-llm \n\ud83d\ude80 Launching the application...\n\ud83d\udca1 Application dependencies\nmissing ScriptRunContext! 
This warning can be ignored when running in bare mode.\n\u2705 anthropic 0.55.0\n\u2705 openai 1.92.2\n\u2705 streamlit 1.46.1\n\u2705 streamlit_ace 0.1.1\n\u2705 dotenv\n\u2705 pycsp3 2.5.1\n\u2705 Java 21.0.7 detected (>= 8)\n\ud83c\udf10 Application available at: http://localhost:8501\n\ud83d\udca1 Press Ctrl+C to stop\n--------------------------------------------------\n\n You can now view your Streamlit app in your browser.\n\n URL: http://localhost:8501\n\n**************************************************\n```\n\nThe application is then available at http://localhost:8501\n\nYou can change port and host by passing parameters to the launch script.\n\n```bash\n\n(venv) ordi@alain% launch-csp-llm --help \nusage: launch-csp-llm [-h] [--host HOST] [--port PORT] [-ev]\n\nLaunch the application\n\noptions:\n -h, --help show this help message and exit\n --host HOST\n --port PORT\n -ev\n```\n\nExample of launch on port 3000 and host 0.0.0.0( makes the application accessible on the entire network )\n\n```bash\n(venv) ordi@alain% launch-csp-llm --port 3000 --host 0.0.0.0\n\ud83d\ude80 Launching the application...\n\ud83d\udca1 Application dependencies\n\u2705 anthropic 0.55.0\n\u2705 openai 1.92.2\n\u2705 streamlit 1.46.1\n\u2705 streamlit_ace 0.1.1\n\u2705 dotenv\n\u2705 pycsp3 2.5.1\n\u2705 Java 21.0.7 detected (>= 8)\n\ud83c\udf10 Application available at: http://0.0.0.0:3000\n\ud83d\udca1 Press Ctrl+C to stop\n--------------------------------------------------\n\n You can now view your Streamlit app in your browser.\n\n URL: http://0.0.0.0:3000\n\n**************************************************\n\n```\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "Python app using llm via MCP for modeling and solving a csp problem in pycsp3",
"version": "0.1.5",
"project_urls": {
"Bug Tracker": "https://github.com/kemgue/csp_llm/issues",
"Homepage": "https://github.com/kemgue/csp_llm",
"Repository": "https://github.com/kemgue/csp_llm"
},
"split_keywords": [
"pycsp3",
" csp",
" ai",
" llm",
" openai",
" anthropic"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "dc6e0e0eb624f897c951c6b0b859366f9ef999e4d00b3605db5f3a2c40d6a6d9",
"md5": "f1895add118afec90981d6b64391d98a",
"sha256": "be9bade5223c4a7dc014b892621fbc21dfd508441261b6c58e5665e1482a147b"
},
"downloads": -1,
"filename": "csp_llm-0.1.5-py3-none-any.whl",
"has_sig": false,
"md5_digest": "f1895add118afec90981d6b64391d98a",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10",
"size": 17063,
"upload_time": "2025-07-15T09:46:57",
"upload_time_iso_8601": "2025-07-15T09:46:57.137228Z",
"url": "https://files.pythonhosted.org/packages/dc/6e/0e0eb624f897c951c6b0b859366f9ef999e4d00b3605db5f3a2c40d6a6d9/csp_llm-0.1.5-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-15 09:46:57",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "kemgue",
"github_project": "csp_llm",
"github_not_found": true,
"lcname": "csp-llm"
}