chatapi-toolkit

- Name: chatapi-toolkit
- Version: 2.0.0
- Home page: https://github.com/cubenlp/chatapi_toolkit
- Summary: A short wrapper of the Chatapi Toolkit.
- Author: Rex Wang
- Requires Python: >=3.7
- License: MIT license
- Keywords: chatapi_toolkit
- Upload time: 2023-09-19 08:01:27
- Requirements: none recorded
> **For the Chinese documentation, see [here](README_zh_CN.md).**

# Chatapi Toolkit
[![PyPI version](https://img.shields.io/pypi/v/chatapi_toolkit.svg)](https://pypi.python.org/pypi/chatapi_toolkit)
[![Tests](https://github.com/cubenlp/chatapi_toolkit/actions/workflows/test.yml/badge.svg)](https://github.com/cubenlp/chatapi_toolkit/actions/workflows/test.yml/)
[![Documentation Status](https://img.shields.io/badge/docs-github_pages-blue.svg)](https://apicall.wzhecnu.cn)
[![Coverage](https://codecov.io/gh/cubenlp/chatapi_toolkit/branch/master/graph/badge.svg)](https://codecov.io/gh/cubenlp/chatapi_toolkit)

<!-- 
[![Updates](https://pyup.io/repos/github/cubenlp/chatapi_toolkit/shield.svg)](https://pyup.io/repos/github/cubenlp/chatapi_toolkit/) 
-->

A Python wrapper for the ChatAPI Toolkit, with support for multi-turn dialogue, proxies, and asynchronous data processing.

## Installation

```bash
pip install chatapi-toolkit --upgrade
```
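
To confirm the installation, import the package from Python. Note that the `__version__` attribute is an assumption here, so the sketch falls back gracefully if it is absent:

```python
# Quick sanity check after installation.
import chatapi_toolkit

# __version__ is assumed to exist; print a placeholder if it does not.
print("chatapi_toolkit", getattr(chatapi_toolkit, "__version__", "(version attribute not found)"))
```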

## Usage

### Set API Key and Base URL

Method 1: set them directly in Python code:

```python
import chatapi_toolkit
chatapi_toolkit.api_key = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
chatapi_toolkit.base_url = "https://api.example.com"
```

Method 2: set environment variables in `~/.bashrc` or `~/.zshrc`:

```bash
export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_BASE_URL="https://api.example.com"
```
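
The two methods can also be combined: read the variables from the environment at runtime and assign them to the module-level attributes from Method 1. A minimal sketch, assuming only the `api_key` and `base_url` attributes shown above:

```python
import os
import chatapi_toolkit

# Pull credentials from the environment (as configured in ~/.bashrc or ~/.zshrc)
# and fall back to explicit placeholders when the variables are not set.
chatapi_toolkit.api_key = os.environ.get("OPENAI_API_KEY", "sk-xxxx")
chatapi_toolkit.base_url = os.environ.get("OPENAI_BASE_URL", "https://api.example.com")
```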

## Examples

Example 1: simulate a multi-turn dialogue:

```python
from chatapi_toolkit import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()

# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()

# add response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# save the chat history
chat.save("chat.json", mode="w") # default to "a"

# print the chat history
chat.print_log()
```
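
The same pattern extends to a scripted conversation: drive one `Chat` object over a list of follow-up questions. This sketch reuses only the methods shown above (`user`, `getresponse`, `print_log`):

```python
from chatapi_toolkit import Chat

# Drive a single conversation over several follow-up questions.
chat = Chat("Hello, GPT-3.5!")
chat.getresponse()
for question in ["How are you?", "What's your name?"]:
    chat.user(question)
    chat.getresponse()

chat.print_log()  # review the accumulated history
```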

Example 2: process data in batches, using a checkpoint file:

```python
from chatapi_toolkit import Chat, process_chats

# write a function that processes one message and returns the finished chat
def msg2chat(msg):
    chat = Chat(api_key=api_key)  # api_key is your API key string (set above)
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]
# process the data
chats = process_chats(msgs[:5], msg2chat, checkpoint, clearfile=True)
# process the remaining data, reusing the cached results from the previous run
continue_chats = process_chats(msgs, msg2chat, checkpoint)
```
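
The `.jsonl` suffix suggests the checkpoint is a JSON Lines file with one record per processed message; under that assumption, the cache can be inspected with the standard library alone:

```python
import json

# Count how many chats have already been cached in the checkpoint file.
# Assumes one JSON record per line (JSON Lines), as the .jsonl suffix suggests.
with open("chat.jsonl") as f:
    records = [json.loads(line) for line in f if line.strip()]
print(f"{len(records)} chats cached so far")
```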

Example 3: process data in batches asynchronously, printing "hello" in different languages with two coroutines:

```python
from chatapi_toolkit import async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
chatlogs = ["print hello using %s" % lang for lang in langs]
async_chat_completion(chatlogs, chkpoint="async_chat.jsonl", ncoroutines=2)
chats = load_chats("async_chat.jsonl")
```
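
A later session can reload the checkpoint and review the results. This assumes the objects returned by `load_chats` expose `print_log()` like the `Chat` objects in Example 1, which is not documented above:

```python
from chatapi_toolkit import load_chats

# Reload the checkpoint later and review each completed chat.
# Assumes the loaded objects expose print_log(), like Chat in Example 1.
chats = load_chats("async_chat.jsonl")
for chat in chats:
    chat.print_log()
```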

## License

This package is licensed under the MIT license. See the LICENSE file for more details.

## Update log

Version `1.0.0` is a stable release, with the redundant `function call` feature removed and an asynchronous processing tool added.

### Beta version
- Since version `0.2.0`, the `Chat` type is used to handle data.
- Since version `0.3.0`, different API keys can be used to send requests.
- Since version `0.4.0`, this package is maintained by [cubenlp](https://github.com/cubenlp).
- Since version `0.5.0`, `process_chats` can process data in batches, with a customized `msg2chat` function and a checkpoint file.
- Since version `0.6.0`, the feature [function call](https://platform.openai.com/docs/guides/gpt/function-calling) is added.


            
