# openai-simple-chat
A standard chat interface for large language models. Supports template-based chat, JSON chat, and more.
## Installation
```shell
pip install openai-simple-chat
```
## Environment variables
- OPENAI_BASE_URL # for OpenAI-compatible services
- OPENAI_API_KEY
- OPENAI_CHAT_MODEL
- OLLAMA_BASE_URL # for Ollama-compatible services
- OLLAMA_API_KEY
- OLLAMA_CHAT_MODEL
- OPENAI_SIMPLE_CHAT_TEMPLATE_ENGINE # other settings
- OPENAI_SIMPLE_CHAT_LOGGER_NAME
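The chat services pick up their endpoint, key, and model settings from these variables. As a minimal sketch (placeholder values only, not working credentials), they can be exported in the shell or set from Python before a service is constructed:
```python
import os

# Placeholder values for illustration; point these at your own
# OpenAI-compatible endpoint, key, and model.
os.environ.setdefault("OPENAI_BASE_URL", "https://api.openai.com/v1")
os.environ.setdefault("OPENAI_API_KEY", "sk-...")
os.environ.setdefault("OPENAI_CHAT_MODEL", "gpt-4o-mini")

# The same pattern applies to the OLLAMA_* variables, e.g. a local Ollama server.
os.environ.setdefault("OLLAMA_BASE_URL", "http://localhost:11434")
os.environ.setdefault("OLLAMA_CHAT_MODEL", "qwen2.5")
```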
## Usage
*test_templates/calc.txt*
```text
Return the numeric result of the following calculation as standard JSON (output format: {"result": xx}): {{expression}}
```
*main1.py*
```python
import openai_simple_chat
# Use the Jinja2-based template engine and load prompt templates
# from the test_templates directory.
llm = openai_simple_chat.OpenAIChatService(
    template_engine=openai_simple_chat.get_template_prompt_by_jinjia2,
    template_root="test_templates",
)
# Render calc.txt with expression="1+1" and parse the model reply as JSON.
response, response_info = llm.jsonchat(
    template="calc.txt",
    expression="1+1",
)
assert response
assert response_info
assert isinstance(response, dict)
assert isinstance(response_info, dict)
assert "result" in response
assert response["result"] == 2
# Note: with stream_chat, response may be an empty string.
```
*main2.py*
```python
import openai_simple_chat
# Same flow, but against an Ollama-compatible service.
llm = openai_simple_chat.OllamaChatService(
    template_engine=openai_simple_chat.get_template_prompt_by_jinjia2,
    template_root="test_templates",
)
response, response_info = llm.jsonchat(template="calc.txt", expression="1+1")
assert response
assert response_info
assert isinstance(response, dict)
assert isinstance(response_info, dict)
assert "result" in response
assert response["result"] == 2
```
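For reference, the Jinja2-based template engine conceptually renders the named template from `template_root` with the extra keyword arguments as template variables, and the resulting text becomes the prompt. The sketch below only illustrates that rendering step with plain Jinja2; it is not the library's implementation.
```python
from jinja2 import Environment, FileSystemLoader

# Render calc.txt the same way jsonchat's keyword arguments would be applied.
env = Environment(loader=FileSystemLoader("test_templates"))
prompt = env.get_template("calc.txt").render(expression="1+1")
print(prompt)
# -> Return the numeric result of the following calculation as standard JSON
#    (output format: {"result": xx}): 1+1
```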
## Changelog
### v0.1.0
- Initial release.
- Template-based chat.
- JSON chat.
- Compatible with OpenAI and Ollama services.
- Compatible with the Django and Jinja2 template engines.
- jsonchat handles the `think` reasoning output produced by DeepSeek models.
### v0.1.1
- OpenAIService supports the max_tokens parameter.
### v0.1.2
- Added: the get_template_prompt_by_django_template_source_engine prompt template engine.
- Improved: more flexible service construction and invocation.
### v0.1.3
- Fixed: an error when generating the prompt directly from a template without a prompt argument.
### v0.1.4
- Added: a mechanism for computing max_tokens dynamically.
- Changed: removed the max_input_tokens and max_output_tokens parameters.
- Fixed: OpenAI streaming_chat issues.
- Fixed: Ollama chat issues.
### v0.1.5
- Improved: handling of non-standard JSON output, including parsing of half-open JSON blocks.
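As an illustration of the kind of cleanup described in v0.1.0 and v0.1.5 (stripping a DeepSeek-style `think` block and tolerating a half-open JSON code fence), a best-effort parser might look like the sketch below. This is an assumed example, not the library's actual code.
```python
import json
import re

def parse_loose_json(text: str) -> dict:
    """Best-effort extraction of a JSON object from an LLM reply (illustrative only)."""
    # Drop a DeepSeek-style <think>...</think> reasoning block if present.
    text = re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL)
    # Tolerate a ```json fence that was never closed.
    text = text.replace("```json", "").replace("```", "")
    # Parse from the first "{" to the last "}".
    start, end = text.find("{"), text.rfind("}")
    return json.loads(text[start : end + 1])

print(parse_loose_json('<think>1 plus 1 is 2.</think>```json\n{"result": 2}'))
# -> {'result': 2}
```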