openai-kira

Name: openai-kira
Version: 0.3.6
Home page: https://github.com/sudoskys/openai-kira
Summary: A chat client
Upload time: 2023-01-15 17:02:48
Maintainer: sudoskys
Author: sudoskys
Requires Python: >=3.8,<4.0
License: LGPL-2.1-or-later
# openai-kira

An OpenAI GPT-3 chatbot framework package: a quick way to build a ChatGPT-like integration before the official API is public (it will switch to ChatGPT once that API is released), packaged as a toy dependency. Both Redis and a file-based database are offered as storage backends.

## Use

`pip install -U openai-kira`

**init**

```python
import openai_kira

# Storage, API key and proxy settings
openai_kira.setting.redisSetting = openai_kira.setting.RedisConfig()
openai_kira.setting.dbFile = "openai_msg.db"
openai_kira.setting.openaiApiKey = ["key", "key2"]
openai_kira.setting.proxyUrl = None  # e.g. "127.0.0.1"
# Plugin settings
openai_kira.setting.webServerUrlFilter = False
openai_kira.setting.webServerStopSentence = ["广告", "营销号"]  # drop web results containing these stop words ("ads", "marketing accounts")
```

## Examples

See `./test` for more examples.

```python
import asyncio

import openai_kira
from openai_kira import Chat

print(openai_kira.RedisConfig())
openai_kira.setting.openaiApiKey = ["key"]

receiver = Chat.Chatbot(
    conversation_id=10086,       # conversation / memory-pool identifier
    call_func=None,              # optional hook, e.g. Api_keys.pop_api_key
    start_sequ="Ai:",            # sequence that opens the bot's turn
    restart_sequ="Human:",       # sequence that opens the user's turn
)


async def main():
    response = await receiver.get_chat_response(model="text-davinci-003",
                                                prompt="你好",  # "Hello"
                                                max_tokens=500,
                                                role="你扮演...",  # role prompt, "You play ..."
                                                web_enhance_server={"time": ""}  # enable the `time` plugin
                                                )
    print(response)


asyncio.run(main())
```
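
The `call_func` slot above hints at a key-rotation hook (the commented-out `Api_keys.pop_api_key`). Its exact signature is defined by the library, so the sketch below is only an assumption: a plain helper that rotates the keys configured in `openai_kira.setting.openaiApiKey` and returns the next one to use.

```python
import openai_kira

# Hypothetical rotation helper; check Api_keys.pop_api_key in the library
# source for the real contract expected by call_func.
def pop_api_key():
    keys = openai_kira.setting.openaiApiKey
    key = keys.pop(0)
    keys.append(key)  # round-robin: move the used key to the back
    return key
```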

```python
import asyncio
import openai_kira

print(openai_kira.RedisConfig())
openai_kira.setting.openaiApiKey = ["key"]
print(openai_kira.setting.openaiApiKey)


async def main():
    try:
        response = await openai_kira.Completion().create(model="text-davinci-003",
                                                         prompt="Say this is a test",
                                                         temperature=0,
                                                         max_tokens=20)
        # TEST
        print(response)
        print(type(response))
    except Exception as e:
        print(e)
        if "Incorrect API key provided" in str(e):  # `in` needs a string, not the exception object
            print("OK")
        else:
            print("NO")


asyncio.run(main())
```

## Plugin

**Table**

| plugin    | description         | value/server                                                   | triggered by                                                                              |
|-----------|---------------------|----------------------------------------------------------------|-------------------------------------------------------------------------------------------|
| `time`    | current time        | `""`, not required                                             | `明昨今天` (today/yesterday/tomorrow) and similar                                            |
| `week`    | day of the week     | `""`, not required                                             | `周几` (which weekday) and similar                                                          |
| `search`  | web search          | `["some.com?searchword={}"]`, required                         | `查询` / `你知道` (search / do you know) with prompt length < 80, or ending in `?` with length < 15 |
| `duckgo`  | web search          | `""`, not required, but needs `pip install duckduckgo_search`  | `查询` / `你知道` (search / do you know) with prompt length < 80, or ending in `?` with length < 15 |
| `details` | step-by-step answer | `""`, not required                                             | asking for help, e.g. `how to`                                                              |
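
Plugins are enabled per call through the `web_enhance_server` argument shown in the first example; each key is a plugin name and each value follows the value/server column above. A minimal sketch (the search endpoint is a placeholder):

```python
# Passed to receiver.get_chat_response(..., web_enhance_server=...) as in the examples above.
web_enhance_server = {
    "time": "",                                    # no extra value needed
    "week": "",                                    # no extra value needed
    "search": ["https://some.com?searchword={}"],  # required: search URL(s), {} marks the query slot
}
```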

## Plugin dev

A middleware layer sits between the memory pool and the analysis stage; it provides networked retrieval and other operational support, and it can be extended with services that interface with other APIs.

**Prompt Injection**

Use `""` `[]` to emphasise content.

### Example

First, create a file in `openai_kira/Chat/module/plugin`; the file name must not contain underscores (`_`).

**Template**

```python
from ..platform import ChatPlugin, PluginConfig
from ._plugin_tool import PromptTool
import os
from loguru import logger

# Register the plugin under the file name with the .py extension removed
modulename = os.path.splitext(os.path.basename(__file__))[0]


@ChatPlugin.plugin_register(modulename)
class Week(object):
    def __init__(self):
        self._server = None
        self._text = None
        self._time = ["time", "多少天", "几天", "时间", "几点", "今天", "昨天", "明天",
                      "几月", "几号", "几个月", "天前"]

    def requirements(self):
        # Extra pip requirements this plugin needs (none here)
        return []

    async def check(self, params: PluginConfig) -> bool:
        # Run this plugin only when the prompt contains one of the keywords
        if PromptTool.isStrIn(prompt=params.text, keywords=self._time):
            return True
        return False

    async def process(self, params: PluginConfig) -> list:
        _return = []
        self._text = params.text
        # Validate input
        if not all([self._text]):
            return []
        # Build the injected content: current time in UTC+8
        from datetime import datetime, timedelta, timezone
        utc_dt = datetime.utcnow().replace(tzinfo=timezone.utc)
        bj_dt = utc_dt.astimezone(timezone(timedelta(hours=8)))
        now = bj_dt.strftime("%Y-%m-%d %H:%M")
        _return.append(f"Current Time UTC8 {now}")
        # LOGGER
        logger.trace(_return)
        return _return
```

`openai_kira/Chat/module/plugin/_plugin_tool.py` provides some helper classes; PRs are welcome.

**Testing**

You cannot test a plugin directly from inside the module package; run `openai_kira/Chat/test_module.py` to test the module together with its prompt-matching check.

Alternatively, you can safely use `from loguru import logger` + `logger.trace(_return)` inside the module to inspect module variables; trace-level logs are not emitted in a production environment.
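
To actually see those trace messages during local development, replace loguru's default sink (which is at `DEBUG` level) with one at `TRACE` level; this uses only standard loguru calls and should stay out of production code:

```python
import sys
from loguru import logger

# Local debugging only: the default sink drops TRACE messages.
logger.remove()                        # remove the default DEBUG-level sink
logger.add(sys.stderr, level="TRACE")  # re-add a sink that shows TRACE and above

logger.trace("plugin module loaded")   # now visible
```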

## Structure

```markdown
.
└── openai_kira
    ├── api
    │   ├── api_url.json
    │   ├── api_utils.py
    │   └── network.py
    ├── Chat
    │   ├── __init__.py
    │   ├── module
    │   ├── Summer.py
    │   ├── test_module.py
    │   ├── text_analysis_tools
    │   └── vocab.json
    ├── __init__.py
    ├── requirements.txt
    ├── resouce
    │   ├── completion.py
    │   └── __init__.py
    └── utils
        ├── data.py
        ├── Network.py
        └── Talk.py
```

## EULA (End-User License Agreement)

1. You are responsible for any loss caused by your own improper operation.
2. This is not an official project.
3. I am not responsible for any loss caused by security incidents.
4. Unauthorized patent or software-copyright related use is not permitted.


            
