lmclient-core


Name: lmclient-core
Version: 0.8.4
Home page: https://github.com/wangyuxinwhy/lmclient
Summary: LM Async Client, openai client, azure openai client ...
Upload time: 2023-10-30 04:02:10
Docs URL: None
Author: wangyuxin
Requires Python: >=3.8,<4.0
License: Apache-2.0
Keywords: lmclient, openai, azure, async
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# lmclient

A client designed for making large-scale asynchronous requests to the OpenAI API. Typical use cases include self-instruct data generation and large-scale translation.

## Features

1. Large-scale asynchronous requests to the openai API
2. Progress bar
3. Configurable limit on the maximum number of requests per minute
4. Configurable async capacity (analogous to a thread-pool size)
5. Disk caching
6. 100% type hints
7. Very easy to use
8. Supports OpenAI, Azure, Minimax, MinimaxPro, Zhipu, Baidu Wenxin (ERNIE), Tencent Hunyuan
9. Supports FunctionCall
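
The per-minute request limit and the async capacity mentioned above can be thought of as a rate limiter plus a semaphore. The sketch below is a conceptual illustration of the capacity half using only `asyncio.Semaphore`; it is not lmclient's actual internals, and `fake_request` is a hypothetical stand-in for a real API call.

```python
import asyncio

async def limited_gather(coros, capacity):
    # A semaphore caps how many coroutines are in flight at once,
    # analogous to the size of a thread pool.
    sem = asyncio.Semaphore(capacity)

    async def run_one(coro):
        async with sem:
            return await coro

    return await asyncio.gather(*(run_one(c) for c in coros))

async def fake_request(i):
    # Stand-in for a real chat-completion call.
    await asyncio.sleep(0.01)
    return f'reply-{i}'

results = asyncio.run(
    limited_gather([fake_request(i) for i in range(10)], capacity=3)
)
```

Results come back in input order because `asyncio.gather` preserves ordering regardless of when each coroutine finishes.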

## Installation

Requires Python 3.8 or later.
```shell
pip install lmclient-core
```

## Usage

1. CompletionEngine
```python
from lmclient import CompletionEngine, OpenAIChat, OpenAIChatParameters

model = OpenAIChat('gpt-3.5-turbo', parameters=OpenAIChatParameters(temperature=0))
# limit to at most 20 requests per minute, with an async capacity of 5
client = CompletionEngine(model, async_capacity=5, max_requests_per_minute=20)
prompts = [
    'Hello, my name is',
    'can you please tell me your name?',
    [{'role': 'user', 'content': 'hello, who are you?'}],
    'what is your name?',
]
outputs = client.async_run(prompts=prompts)
for output in outputs:
    print(output.reply)
```

2. ChatEngine
```python
from lmclient import ChatEngine, OpenAIChat

model = OpenAIChat('gpt-3.5-turbo')
chat_engine = ChatEngine(model)
print(chat_engine.chat('Hello, I am chat_engine'))
print(chat_engine.chat('What was my previous sentence?'))
```
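
The second call above works because `ChatEngine` is stateful: it keeps the conversation history and resends the whole transcript each turn. A minimal, library-independent sketch of that pattern, where `call_model` is a hypothetical stand-in for a model call:

```python
class MiniChatEngine:
    # Minimal illustration of stateful chat: accumulate the message
    # history and pass the full transcript to the model on every turn.
    def __init__(self, call_model):
        self.call_model = call_model  # function: list[dict] -> str
        self.history = []

    def chat(self, user_text):
        self.history.append({'role': 'user', 'content': user_text})
        reply = self.call_model(self.history)
        self.history.append({'role': 'assistant', 'content': reply})
        return reply

# Echo "model" for demonstration: replies with the last user message.
engine = MiniChatEngine(lambda msgs: f"you said: {msgs[-1]['content']}")
print(engine.chat('hello'))
```

Because each exchange is appended to `history`, the model sees earlier turns and can answer questions like "what was my previous sentence?".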

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/wangyuxinwhy/lmclient",
    "name": "lmclient-core",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.8,<4.0",
    "maintainer_email": "",
    "keywords": "lmclient,openai,azure,async",
    "author": "wangyuxin",
    "author_email": "wangyuxin@mokahr.com",
    "download_url": "https://files.pythonhosted.org/packages/1f/cc/73ca958d49bf913883eb65c58050f93526ff8479fe4d2eed50489fcb92f5/lmclient_core-0.8.4.tar.gz",
    "platform": null,
    "description": "# lmclient\n\n\u9762\u5411\u4e8e\u5927\u89c4\u6a21\u5f02\u6b65\u8bf7\u6c42 OpenAI \u63a5\u53e3\u8bbe\u8ba1\u7684\u5ba2\u6237\u7aef\uff0c\u4f7f\u7528\u573a\u666f self-instruct, \u5927\u89c4\u6a21\u7ffb\u8bd1\u7b49\n\n## Features\n\n1. \u652f\u6301\u5927\u89c4\u6a21\u5f02\u6b65\u8bf7\u6c42 openai \u63a5\u53e3\n2. \u652f\u6301\u8fdb\u5ea6\u6761\n3. \u652f\u6301\u9650\u5236\u6bcf\u5206\u949f\u6700\u5927\u8bf7\u6c42\u6b21\u6570\n4. \u652f\u6301\u9650\u5236\u5f02\u6b65\u5bb9\u91cf \uff08\u7c7b\u4f3c\u4e8e\u7ebf\u7a0b\u6c60\u7684\u5927\u5c0f\uff09\n5. \u652f\u6301\u78c1\u76d8\u7f13\u5b58\n6. 100% type hints\n7. \u975e\u5e38\u6613\u7528\n8. \u652f\u6301 OpenAI, Azure, Minimax, MinimaxPro, \u667a\u8c31, \u767e\u5ea6\u6587\u5fc3, \u817e\u8baf\u6df7\u5143\n9. \u652f\u6301 FunctionCall\n\n## \u5b89\u88c5\u65b9\u5f0f\n\u652f\u6301 python3.8 \u53ca\u4ee5\u4e0a\n```shell\npip install lmclient-core\n```\n\n## \u4f7f\u7528\u65b9\u6cd5\n\n1. CompletionEngine\n```python\nfrom lmclient import CompletionEngine, OpenAIChat, OpenAIChatParameters\n\nmodel = OpenAIChat('gpt-3.5-turbo',  parameters=OpenAIChatParameters(temperature=0))\n# \u63a7\u5236\u6bcf\u5206\u949f\u6700\u5927\u8bf7\u6c42\u6b21\u6570\u4e3a 20\uff0c \u5f02\u6b65\u5bb9\u91cf\u4e3a 5\nclient = CompletionEngine(model, async_capacity=5, max_requests_per_minute=20)\nprompts = [\n    'Hello, my name is',\n    'can you please tell me your name?',\n    [{'role': 'user', 'content': 'hello, who are you?'}],\n    'what is your name?',\n]\noutputs = client.async_run(prompts=prompts)\nfor output in outputs:\n    print(output.reply)\n```\n\n2. ChatEngine\n```python\nfrom lmclient import ChatEngine, OpenAIChat\n\nmodel = OpenAIChat('gpt-3.5-turbo')\nchat_engine = ChatEngine(model)\nprint(chat_engine.chat('\u4f60\u597d\uff0c\u6211\u662f chat_engine'))\nprint(chat_engine.chat('\u6211\u4e0a\u4e00\u53e5\u8bdd\u662f\u4ec0\u4e48\uff1f'))\n```\n",
    "bugtrack_url": null,
    "license": "Apache-2.0",
    "summary": "LM Async Client, openai client, azure openai client ...",
    "version": "0.8.4",
    "project_urls": {
        "Homepage": "https://github.com/wangyuxinwhy/lmclient",
        "Repository": "https://github.com/wangyuxinwhy/lmclient"
    },
    "split_keywords": [
        "lmclient",
        "openai",
        "azure",
        "async"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "545b2373263b03f39000f339214cce47023e0b3af24efef030ccde8c3af9ddd1",
                "md5": "79fb7de884898a35d664b27f54f95465",
                "sha256": "6bd726a51a2d413a905b26f9b8f1cb841a48e2521344e86cde69dea4aa4dec64"
            },
            "downloads": -1,
            "filename": "lmclient_core-0.8.4-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "79fb7de884898a35d664b27f54f95465",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8,<4.0",
            "size": 32266,
            "upload_time": "2023-10-30T04:02:08",
            "upload_time_iso_8601": "2023-10-30T04:02:08.892658Z",
            "url": "https://files.pythonhosted.org/packages/54/5b/2373263b03f39000f339214cce47023e0b3af24efef030ccde8c3af9ddd1/lmclient_core-0.8.4-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "1fcc73ca958d49bf913883eb65c58050f93526ff8479fe4d2eed50489fcb92f5",
                "md5": "15f11aeae59595cedeb07eb308f7a5f4",
                "sha256": "b687543dfb63da2674aae9a09a796944f28ed7d7636b927fb56533c999e6c620"
            },
            "downloads": -1,
            "filename": "lmclient_core-0.8.4.tar.gz",
            "has_sig": false,
            "md5_digest": "15f11aeae59595cedeb07eb308f7a5f4",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8,<4.0",
            "size": 21398,
            "upload_time": "2023-10-30T04:02:10",
            "upload_time_iso_8601": "2023-10-30T04:02:10.558010Z",
            "url": "https://files.pythonhosted.org/packages/1f/cc/73ca958d49bf913883eb65c58050f93526ff8479fe4d2eed50489fcb92f5/lmclient_core-0.8.4.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-10-30 04:02:10",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "wangyuxinwhy",
    "github_project": "lmclient",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "lmclient-core"
}
        