sigmund

- Name: sigmund
- Version: 1.4.1
- Summary: AI-based chatbot that provides sensible answers based on documentation
- Author: Sebastiaan Mathôt <s.mathot@cogsci.nl>
- Requires Python: >=3.8
- Keywords: ai, chatbot, llm
- Upload time: 2025-08-30 11:52:48
- Documentation: https://sigmundai.eu
- Source: https://github.com/open-cogsci/sigmund-ai
# Sigmund AI

Copyright 2023-2025 Sebastiaan Mathôt

![](artwork/sigmund-avatar-small.png)

Sigmund is a Python library and web app for an LLM-based chatbot.

Features:

- __Privacy__: all messages and uploaded attachments are encrypted so that no one can listen in on your conversations
- __Expert knowledge__: access to documentation 
- __Continuous conversation__: conversations are summarized when they become too long to fit into the prompt
- __Tool use__ (in research-assistant mode):
    - __Code execution__: ability to execute Python and R code
    - __Google Scholar search__: ability to search for articles on Google Scholar
    - __Image generation__: ability to generate images
- __Integrations__
    - __Python__: [connect to JupyterLab, Notebook, Spyder or Rapunzel](https://github.com/open-cogsci/jupyter-extension-sigmund)
    - __OpenSesame__: [directly integrate with OpenSesame](https://osdoc.cogsci.nl/4.0/manual/sigmund/)
    - __Sigmund Analyst__: [directly integrate with Sigmund Analyst for data analysis](https://github.com/open-cogsci/sigmund-analyst)
    
Sigmund is not a large language model itself. Rather, it uses third-party models. Currently, models from [OpenAI](https://openai.com), [Anthropic](https://www.anthropic.com/), and [Mistral](https://mistral.ai/) are supported. API keys from these respective providers are required.


[output2.webm](https://github.com/user-attachments/assets/905233c3-5980-45f5-b8fb-dc769b4c3526)


## Configuration

See `sigmund/config.py` for configuration instructions.


## Dependencies

For Python dependencies, see `pyproject.toml`. In addition to these, `pandoc` is required for reading attachments, and a local `redis` server needs to be running to persist data between sessions.
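As a sketch, on a Debian-based system the non-Python dependencies could be installed and checked like this (package names may differ on other distributions):

```
sudo apt install pandoc redis-server   # document converter + key-value store
pandoc --version                       # verify pandoc is on the PATH
redis-cli ping                         # a running redis server replies with PONG
```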


## Running (development)

Download the source code and copy `.env.example` to `.env`. Edit this file to specify at least the API keys and, depending on the functionality that you want to activate, possibly other variables as well. The only variable that is strictly required is the OpenAI API key, because OpenAI is used to create text embeddings, even when a different model is used for the conversation.
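For illustration only, a minimal `.env` could look like the fragment below. The variable names here are placeholders, not the actual names; `.env.example` and `sigmund/config.py` are the authoritative reference:

```
# Hypothetical variable names; check .env.example for the real ones
openai_api_key=sk-...      # strictly required (text embeddings)
anthropic_api_key=...      # optional, for Anthropic models
mistral_api_key=...        # optional, for Mistral models
```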

Next, install the dependencies, build the documentation index, and launch the app!

```
pip install .               # install dependencies
python index_library.py     # build library (documentation) index
python app.py               # start the app
```

Then access the app (by default) at:

```
http://127.0.0.1:5000/
```


## Running (production)

In production, the server is generally not started by calling the app directly. There are many ways to run a Flask app in production. One approach is to start the app with gunicorn and put an nginx web server in front of it as a reverse proxy that reroutes requests to the app. When taking this route, make sure to configure nginx with a large `client_max_body_size` (to allow attachment uploads) and to disable `proxy_cache` and `proxy_buffering` (so that status messages can be streamed while Sigmund is answering).
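As an illustrative sketch only (the module path, port, worker count, and server name are assumptions, not taken from this project), the gunicorn invocation and the nginx directives mentioned above might look like:

```
# start the Flask app with gunicorn (app:app is an assumed module:callable)
gunicorn -w 4 -b 127.0.0.1:5000 app:app
```

```
server {
    listen 80;
    server_name sigmund.example.org;

    # allow large attachment uploads
    client_max_body_size 100M;

    location / {
        proxy_pass http://127.0.0.1:5000;
        # stream status messages while Sigmund is answering
        proxy_cache off;
        proxy_buffering off;
    }
}
```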


## License

Sigmund is distributed under the terms of the GNU General Public License v3. The full license should be included in the file `COPYING`, or can be obtained from:

- <http://www.gnu.org/licenses/gpl.txt>


            
