sopel-ai
========

|Field|Value|
|-----|-----|
|Name|sopel-ai|
|Version|1.2.1|
|Summary|Sopel AI - an LLM enhanced chat bot plug-in|
|Upload time|2024-05-05 04:08:19|
|Requires Python|>=3.9|
|License|BSD-3-Clause|
|Keywords|ai, bot, irc, llm, plugin, sopel|
% sopel_ai(1) Version 1.2.1 chatbot plugin

Name
====

**Sopel AI** - AI query/response plugin


Synopsis
========
Enable Sopel to respond to queries using a Perplexity AI back-end, featuring the
ability to plug in different LLMs on a per-user basis.

```
pip install -U sopel_ai

sopel configure
sopel
```

From a channel where Sopel AI is present enter a query:

`.q Summarize the plot of The Martian by Andy Weir.`

**This plugin requires an API key issued by the service provider.**


Installation
============
```zsh
pip install -U sopel_ai
```

The installation assumes that the Sopel chatbot is already installed in the
target system and in the same environment as the `pip` installation.

Confirm the installed package and version:

```zsh
echo "import sopel_ai ; print(sopel_ai.__VERSION__)" | python
```


Commands
========
Listed in order of frequency of use:

|Command|Arguments|Effect|
|-------|---------|------|
|`.q`|Some question|The model produces a response|
|`.qpm`|Some question|Same as `.q` but in private message|
|`.models`|n/a|Lists all models that Sopel AI supports|
|`.mymodel`|number (optional)|Show or set the model used for the current nick|
|`.req`|n/a|Return the GitHub URL for Sopel AI feature requests|
|`.bug`|n/a|Same as `.req`|
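
For orientation, the following is a minimal sketch of how a command like `.q`
is typically registered in a Sopel plugin.  It is not the actual `sopel_ai`
source; the `answer_query` helper is hypothetical and stands in for the call
into the LLM back-end.

```python
# Minimal sketch of a Sopel command handler using the standard
# sopel.plugin decorators; answer_query is a hypothetical stand-in for
# the LLM back-end call, not part of the real sopel_ai API.
from sopel import plugin


def answer_query(prompt: str) -> str:
    """Placeholder for the call into the LLM back-end (assumption)."""
    return 'LLM response for: ' + prompt


@plugin.command('q')
@plugin.example('.q Summarize the plot of The Martian by Andy Weir.')
def query(bot, trigger):
    """Send the text after .q to the model and reply in-channel."""
    prompt = trigger.group(2)        # everything after the command word
    if not prompt:
        bot.reply('Please ask a question.')
        return
    bot.reply(answer_query(prompt))
```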

Other commands available if the standard Sopel infobot plugins are enabled:

|Command|Arguments|Effect|
|-------|---------|------|
|`.search`|Some question|Search using Bing or DuckDuckGo|
|`.dict`|Word|Get a dictionary definition if one is available|
|`.tr`|Word or phrase|Translate to English|
|`.w`|Word or topic|Search Wikipedia for articles|


Usage
=====
The most common usage is to enter a query in-channel or in a private message,
and wait for the bot to respond.

`.q Quote the Three Laws of Robotics as a list and without details.`

Users may check the current model used for producing their responses by issuing:

`.mymodel`

The bot produces a numbered list of supported models in response to:

`.models`

Users are welcome to change the default model to one of those listed by issuing
the `.mymodel` command followed by the item number for the desired model from
the list:

`.mymodel 1`
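
The per-nick preference means the plugin persists each user's model choice.  A
minimal sketch of how that can be done with Sopel's built-in nick-value
database follows; the key name `sopel_ai_model` and the default value are
assumptions for illustration, not the plugin's actual storage schema.

```python
# Minimal sketch, assuming Sopel's nick-value database API; the key name
# 'sopel_ai_model' and the default model string are illustrative only.
DEFAULT_MODEL = 'default-model'  # placeholder, not a real model name


def preferred_model(bot, nick):
    """Return the model stored for this nick, or the default."""
    return bot.db.get_nick_value(nick, 'sopel_ai_model') or DEFAULT_MODEL


def set_preferred_model(bot, nick, model):
    """Persist the nick's model choice so later .q queries use it."""
    bot.db.set_nick_value(nick, 'sopel_ai_model', model)
```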

Users may request private instead of in-channel responses:

`.qpm Quote the Three Laws of Robotics and give me examples.`

Responses to `.q` queries are expected to be short and are truncated at 480
characters.  They are intended to appear in-channel and to be as brief as
possible.

Responses to a `.qpm` query are expected to be long and detailed, with a 16 KB
length limit.  They may span multiple messages (due to IRCv3 line-length
limits), and `sopel_ai` delivers them to the user in a private message,
regardless of whether the query was issued from a channel or a direct message.
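
To illustrate those limits, here is a hedged sketch of how a long `.qpm`-style
answer can be capped at 16 KB and delivered as a series of private messages.
The chunk size simply reuses the 480-character figure quoted above and is not
taken from the plugin's source.

```python
# Sketch only: the limits mirror the figures quoted above; real IRC line
# limits also depend on prefix and target length, which this ignores.
MAX_CHUNK = 480          # in-channel .q responses are truncated to this
MAX_PRIVATE = 16 * 1024  # .qpm responses are capped at roughly 16 KB


def send_private_reply(bot, nick, text):
    """Deliver a long answer as multiple private messages to nick."""
    text = text[:MAX_PRIVATE]
    for start in range(0, len(text), MAX_CHUNK):
        bot.say(text[start:start + MAX_CHUNK], nick)
```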

Users can query the bot plugin and AI provider versions using:

`.mver`


AI providers
============
The current version uses the Perplexity AI models and API.  Future versions may
support other providers.
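
The plugin goes through the PerplexiPy wrapper (see the links at the end of
this document); underneath, Perplexity exposes an OpenAI-compatible chat
completions endpoint.  The sketch below calls that documented REST endpoint
directly, for orientation only; the model name is an assumption and this is
not the plugin's own code path.

```python
# Illustrative sketch of a direct call to Perplexity's OpenAI-compatible
# chat completions endpoint; the real plugin uses PerplexiPy instead.
import requests

PPLX_API_URL = 'https://api.perplexity.ai/chat/completions'


def ask_perplexity(api_key: str, model: str, question: str) -> str:
    """Send a single-turn question and return the answer text."""
    response = requests.post(
        PPLX_API_URL,
        headers={'Authorization': f'Bearer {api_key}'},
        json={
            'model': model,
            'messages': [{'role': 'user', 'content': question}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()['choices'][0]['message']['content']
```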


API Key
=======
All AI service providers require an API key for access.  The API key is
configured via:

`sopel configure`

Or edit this section in the Sopel configuration file:

```ini
[sopel_ai]
.
.
llm_key = pplx-3a45enteryourkeyhere
```
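
For completeness, this is roughly how a Sopel plugin reads such a section
through the configuration API.  Only the `llm_key` attribute mirrors the
example above; the class name and the `setup` hook are conventional Sopel
boilerplate rather than `sopel_ai`'s actual code.

```python
# Sketch of reading the [sopel_ai] section with Sopel's config API; only
# the llm_key attribute matches the example above, the rest is assumed.
from sopel.config.types import StaticSection, ValidatedAttribute


class SopelAISection(StaticSection):
    llm_key = ValidatedAttribute('llm_key')   # the provider API key


def setup(bot):
    """Standard Sopel plugin hook: register the config section."""
    bot.config.define_section('sopel_ai', SopelAISection)
    # After this, the key is available as bot.config.sopel_ai.llm_key
```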


Docker
======
Sopel AI is dockerized and available from Docker Hub as pr3d4t0r/sopel_ai.  The
version tag is the same as the latest version number for Sopel AI.

The examples in this section assume execution from the local file system.  Adapt
as needed to run in a Kubernetes cluster or with another deployment method.


### First time

The Sopel + AI configuration file must be created:

```bash
docker run -ti -v ${HOME}/sopel_ai_data:/home/sopel_ai \
    pr3d4t0r/sopel_ai:latest \
    sopel configure
```

The API key and other relevant configuration data must be provided at this time.
`$HOME/sopel_ai_data` is volume-mapped to the container's `/home/sopel_ai/.sopel`
directory.  Ensure that your host user has write permissions on the shared volume.

The `pr3d4t0r/sopel_ai:latest` image is used if no version is specified.  The
image update policy is left to the sysops and is not automatic.

Once `$HOME/sopel_ai_data` exists it's possible to copy the contents of a
different `~/.sopel` directory to it and use it as the configuration and Sopel
AI database store.


### Starting Sopel AI

A Docker Compose file, <a href='./dockerized/docker-compose.yaml' target='_blank'>docker-compose.yaml</a>,
is provided as an example of how to start the service.  With this Docker Compose
file in the current directory, start the service with:

```bash
docker-compose up [-d] sopel_ai
```

The `-d` parameter daemonizes the service.  Without it, the service will start
and display its output in the current console.


License
=======
The **Sopel AI** Sopel plugin, package, documentation, and examples are licensed
under the BSD-3-Clause open source license at https://github.com/pr3d4t0r/sopel_ai/blob/master/LICENSE.txt.


See also
========
- Sopel AI API documentation at https://pr3d4t0r.github.io/sopel_ai
- PerplexiPy high level API interface to Perplexity AI https://pypi.org/project/perplexipy
- Sopel commands:  https://sopel.chat/usage/commands/
- Sopel bot home page:  https://sopel.chat/


Bugs
====
Feature requests and bug reports:

https://github.com/pr3d4t0r/sopel_ai/issues


            
