sigmund

Name: sigmund
Version: 0.19.0
Summary: AI-based chatbot that provides sensible answers based on documentation
Author: Sebastiaan Mathôt <s.mathot@cogsci.nl>
Requires Python: >=3.8
Keywords: ai, chatbot, llm
Upload time: 2024-05-14 12:06:12
Documentation: https://sigmundai.eu
Source: https://github.com/smathot/sigmundai
# Sigmund AI

Copyright 2023-2024 Sebastiaan Mathôt

A Python library and web app for an LLM-based chatbot with many capabilities.

Features:

- __Privacy__: all messages and uploaded attachments are encrypted so that no one can listen in on your conversations
- __Knowledge__: access to documentation
- __Continuous conversation__: conversations are summarized when they become too long to fit into the prompt
- __Tool use__:
    - __Code execution__: ability to execute Python and R code
    - __Google Scholar search__: ability to search for articles on Google Scholar
    - __Attachments__: ability to read attachments
    - __Download__: ability to download pages and files as attachments
    
Sigmund is not a large language model itself. Rather, it uses third-party models. Currently, models from [OpenAI](https://openai.com), [Anthropic](https://www.anthropic.com/), and [Mistral](https://mistral.ai/) are supported. API keys from these respective providers are required.

By default, Sigmund is configured to act as an assistant for OpenSesame, a program for creating psychology and cognitive-neuroscience experiments. However, the software can easily be reconfigured for other purposes.


## What can Sigmund do? And how does Sigmund work?

For a description of how Sigmund works, see <https://sigmundai.eu/about>; that page describes the default configuration.


## Configuration

See `sigmund/config.py` for configuration instructions.
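
To get a quick overview of the available settings, you can list the names defined in `sigmund/config.py` from the shell (this assumes the package is installed in the current environment; the exact setting names depend on your version):

```
# Print the module-level names (settings) defined in sigmund/config.py
python -c "import sigmund.config as cfg; print([n for n in dir(cfg) if not n.startswith('_')])"
```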


## Dependencies

For Python dependencies, see `pyproject.toml`. In addition, `pandoc` is required to read attachments, and a local `redis` server needs to be running to persist data between sessions.
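
For example, on a Debian/Ubuntu system these could be installed and started as follows (package names and service management differ on other platforms):

```
# Install pandoc and the redis server
sudo apt install pandoc redis-server
# Make sure redis is running
sudo systemctl start redis-server
redis-cli ping    # should reply with PONG
```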


## Running (development)

Download the source code, and execute the following from the source folder:

```
# Specify API keys for the model providers. Even when using Anthropic (Claude)
# or Mistral, an OpenAI key is required when document search is enabled.
export OPENAI_API_KEY='your key here'
export ANTHROPIC_API_KEY='your key here'
export MISTRAL_API_KEY='your key here'
pip install .               # install dependencies
python index_library.py     # build library (documentation) index
python app.py               # start the app
```

Next, access the app (by default) through:

```
https://127.0.0.1:5000/
```
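
To quickly check from the command line that the app is up (the `-k` flag skips certificate verification, which is useful with a self-signed certificate; drop it and use `http://` if the app serves plain HTTP):

```
# Should print 200 when the app is reachable
curl -k -s -o /dev/null -w '%{http_code}\n' https://127.0.0.1:5000/
```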


## Running (production)

In production, the server is generally not started by running the app directly. There are many ways to run a Flask app in production. One option is to start the app with gunicorn and put an nginx web server in front of it as a reverse proxy that forwards requests to the app. When taking this route, make sure to configure nginx with a large `client_max_body_size` (to allow attachment uploads) and to disable `proxy_cache` and `proxy_buffering` (so that status messages can be streamed while Sigmund is answering). A rough sketch is given below.
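
For example (the WSGI entry point `app:app`, the port, and the upload size limit below are placeholders; adjust them to your own setup):

```
# Start the app with gunicorn, listening on a local port
gunicorn --bind 127.0.0.1:8000 app:app
```

A matching nginx `location` block with the settings mentioned above could look roughly like this:

```
location / {
    proxy_pass http://127.0.0.1:8000;
    client_max_body_size 100M;   # allow large attachment uploads
    proxy_cache off;             # do not cache responses
    proxy_buffering off;         # stream status messages as they arrive
}
```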


## License

Sigmund is distributed under the terms of the GNU General Public License 3. The full license should be included in the file `COPYING`, or can be obtained from:

- <http://www.gnu.org/licenses/gpl.txt>

            
