# PyLMKit
[README(English version)](https://github.com/52phm/pylmkit/blob/main/README_English.md)
**pylmkit** is a project for building and integrating large model (LM) applications with practical value, designed to help users quickly build applications that fit their business needs.
## Quick Install
```bash
pip install -U pylmkit
```
## Documentation
- <a href="http://app.pylmkit.cn" target="_blank">PyLMKit online application</a>
- <a href="http://en.pylmkit.cn" target="_blank">English documentation</a>
- <a href="http://zh.pylmkit.cn" target="_blank">Chinese documentation</a>
## Features
- 1. RolePlay: by defining role templates and combining them with online search, memory, and knowledge-base features, it implements typical conversational applications. RolePlay is a basic yet important feature. Many features seen in the apps of major LLM vendors, such as `short-video copywriting, Xiaohongshu copywriting, storytelling`, are implemented under the hood by configuring different role templates in RolePlay.
  - Tutorials
    - [PyLMKit RolePlay: Using Tutorials (English version)](https://github.com/52phm/pylmkit/blob/main/examples/01-RolePlay-APP.md)
    - [PyLMKit RolePlay tutorial (Simplified Chinese)](https://github.com/52phm/pylmkit/blob/main/examples/01-角色扮演应用案例.ipynb)

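The role-template idea behind RolePlay can be sketched with a plain string template: the same user query produces a different prompt depending on which role it is wrapped in. This is a generic illustration with made-up template names, not pylmkit's actual API:

```python
# Minimal sketch of the role-template idea: features like short-video
# copywriting or storytelling are just different role prompts wrapped
# around the same user query. Hypothetical templates, not pylmkit's API.
ROLE_TEMPLATES = {
    "short_video": "You are a short-video copywriter. Write a catchy script about: {query}",
    "storyteller": "You are a storyteller for children. Tell a short story about: {query}",
}

def build_prompt(role: str, query: str) -> str:
    """Fill the chosen role template with the user's query."""
    return ROLE_TEMPLATES[role].format(query=query)

print(build_prompt("storyteller", "how to learn Python"))
```

The resulting prompt would then be sent to whichever chat model is configured.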
- 2. RAG (Retrieval-Augmented Generation): a method that retrieves content relevant to the user's query from a knowledge base, thereby improving the accuracy and specificity of the model's answers. RAG covers local knowledge bases, web-based knowledge bases, memory-based knowledge bases, and database-backed knowledge bases.
  - **PyLMKit provides four kinds of RAG**
    - DocRAG: knowledge base built from local documents
    - WebRAG: knowledge base built from web pages
    - DBRAG: knowledge base built from databases
    - MemoryRAG: knowledge base built from conversation memory
  - Tutorials
    - [PyLMKit RAG: Using Tutorials (English version)](https://github.com/52phm/pylmkit/blob/main/examples/02-RAG-Retrieval-Augmented-Generation.md)
    - [PyLMKit RAG tutorial (Simplified Chinese)](https://github.com/52phm/pylmkit/blob/main/examples/02-基于知识库检索增强生成RAG案例.ipynb)

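The core RAG loop (retrieve the passages most relevant to the query, then prepend them to the model prompt) can be sketched with a toy keyword-overlap retriever. This is a generic illustration of the idea only; pylmkit's DocRAG/WebRAG use real document loaders and retrieval backends:

```python
# Toy RAG sketch: score documents by keyword overlap with the query,
# then stuff the best matches into the prompt as context.
# Generic illustration only, not pylmkit's DocRAG implementation.
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    q_words = set(query.lower().split())
    return sorted(docs,
                  key=lambda d: len(q_words & set(d.lower().split())),
                  reverse=True)[:k]

def build_rag_prompt(query: str, docs: list[str]) -> str:
    """Assemble a prompt with retrieved context ahead of the question."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Python tutorials cover syntax and the standard library.",
    "RAG augments model answers with retrieved knowledge.",
    "Bread is baked at high temperature.",
]
print(build_rag_prompt("how does RAG help model answers", docs))
```

A real implementation would replace keyword overlap with embedding similarity over a vector store, but the prompt-assembly step is the same shape.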
- More features are on the way...
## Quick Start
[PyLMKit QuickStart (English version)](https://github.com/52phm/pylmkit/blob/main/examples/00-QuickStart.md)
[PyLMKit QuickStart (Simplified Chinese)](https://github.com/52phm/pylmkit/blob/main/examples/00-快速开始.md)
## Using LLM Models
The `LLM` models currently integrated into `PyLMKit` fall into two categories:
- paid `API`-based models
- open-source models downloaded and deployed locally
**API-based models**
(1) First configure your `API KEY` so the `LLM` models can be called.
A convenient approach is to create a `.env` file and put all API key settings in it, which makes it easy to switch between models. The `.env` file looks like this:
```env
openai_api_key = "" # OpenAI
QIANFAN_AK = "" # Baidu Qianfan
QIANFAN_SK = ""
DASHSCOPE_API_KEY = "" # Alibaba Tongyi
spark_appid = "" # iFlytek Spark
spark_apikey = ""
spark_apisecret = ""
spark_domain = "generalv3" # or generalv2
zhipu_apikey = "" # Zhipu AI (Tsinghua)
baichuan_api_key = "" # Baichuan
baichuan_secret_key = ""
hunyuan_app_id = "" # Tencent Hunyuan
hunyuan_secret_id = ""
hunyuan_secret_key = ""
```
(2) An `LLM` model can be called in two ways:
- standard mode: invoke(query)
- streaming mode: stream(query)
Here `query` is the user's input. A simple example:
```python
from pylmkit.llms import ChatQianfan  # Baidu Qianfan
from pylmkit.llms import ChatSpark  # iFlytek Spark
from pylmkit.llms import ChatZhipu  # Zhipu AI
from pylmkit.llms import ChatHunyuan  # Tencent Hunyuan
from pylmkit.llms import ChatBaichuan  # Baichuan
from pylmkit.llms import ChatTongyi  # Alibaba Tongyi
from pylmkit.llms import ChatOpenAI  # OpenAI
from dotenv import load_dotenv

# Load the .env file
load_dotenv()
# Instantiate a model
model = ChatQianfan()

# Standard mode
res = model.invoke(query="How do I learn Python?")
print(res)

# Streaming mode
res = model.stream(query="How do I learn Python?")
for i in res:
    print(i)
```
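The difference between the two call modes can be illustrated with a stand-in model class (hypothetical, not pylmkit's implementation): `invoke` blocks and returns the complete answer as one string, while `stream` is a generator that yields the answer piece by piece, letting the caller display partial output immediately:

```python
from collections.abc import Iterator

class FakeChatModel:
    """Stand-in for a chat model, illustrating the two call modes."""

    def invoke(self, query: str) -> str:
        # Blocking call: the full answer comes back in one string.
        return f"Answer to: {query}"

    def stream(self, query: str) -> Iterator[str]:
        # Streaming call: a generator yields chunks as they are "produced",
        # so the caller can print partial output before the answer completes.
        for word in f"Answer to: {query}".split():
            yield word

model = FakeChatModel()
print(model.invoke("How do I learn Python?"))
for chunk in model.stream("How do I learn Python?"):
    print(chunk, end=" ")
```

Streaming matters for chat UIs: users see the first tokens after a fraction of a second instead of waiting for the whole completion.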
**Deploying and calling local open-source models**
Install the `modelscope` dependency first:
```bash
pip install modelscope
```
- (1) Download an open-source model
This example uses a very small model so that most people can run the demo successfully, even in a CPU-only environment with a modest memory footprint. The model is `Qwen/Qwen-1_8B-Chat`; for more models, browse the ModelScope community: [ModelScope - Text Generation](https://www.modelscope.cn/models?page=1&tasks=text-generation&type=nlp)
```python
from pylmkit.llms import LocalLLMModel

LocalLLMModel(
    model_path='Qwen/Qwen-1_8B-Chat',  # model name
    tokenizer_kwargs={"revision": 'master'},
    model_kwargs={"revision": 'master'},
    language='zh',  # 'zh' downloads from the China mirror, which is faster there
)
```
A successful download looks like this:

- (2) Move the downloaded model files to a location of your choice
With the download method above, models are cached on the `C` drive, typically under `C:\Users\<your username>\.cache\modelscope\hub`.
Cut and paste the freshly downloaded `qwen/Qwen-1_8B-Chat` folder to a path such as `D:/mycode_llm/` (just an example; choose your own), mainly to keep large model files from filling up the `C` drive.
- (3) Install the open-source model's dependencies
This step matters. On one hand, make sure your `CUDA` environment is configured correctly and you have enough memory, i.e. sufficient hardware resources (some models also run on CPU); on the other hand, follow the model's GitHub page or other tutorials to install its dependency packages and any inference-acceleration extras.
The downloaded model folder usually contains a `README.md` describing the model and how to install it; if you are unsure about the dependencies, follow that guide closely. For example, the contents of the `Qwen/Qwen-1_8B-Chat` folder look like this:

Only once this step is done correctly will the next step run the model properly; otherwise you may hit many bugs that then have to be tracked down and fixed one by one.
- (4) Load and use the local model
```python
from pylmkit.llms import LocalLLMModel

model = LocalLLMModel(
    model_path='D:/mycode_llm/qwen/Qwen-1_8B-Chat',  # path where the model was saved
    tokenizer_kwargs={"revision": 'master'},
    model_kwargs={"revision": 'master'},
    language='zh',
)

# Standard mode
res = model.invoke(query="How do I learn Python?")
print(">>>invoke ", res)

# Streaming mode
res = model.stream(query="How do I learn Python?")
for i in res:
    print(">>>stream ", i)
```
The output looks like this:

## License
Apache License Version 2.0