openai-api-call 1.4.0

- **Homepage:** https://github.com/cubenlp/openai_api_call
- **Summary:** A short wrapper of the OpenAI API call.
- **Author:** Rex Wang
- **Requires Python:** >=3.7
- **License:** MIT
- **Keywords:** openai_api_call
- **Uploaded:** 2023-09-17 09:37:50
> **For the Chinese documentation, see [here](README_zh_CN.md).**

# OpenAI API Call
[![PyPI version](https://img.shields.io/pypi/v/openai_api_call.svg)](https://pypi.python.org/pypi/openai_api_call)
[![Tests](https://github.com/cubenlp/openai_api_call/actions/workflows/test.yml/badge.svg)](https://github.com/cubenlp/openai_api_call/actions/workflows/test.yml/)
[![Documentation Status](https://img.shields.io/badge/docs-github_pages-blue.svg)](https://apicall.wzhecnu.cn)
[![Coverage](https://codecov.io/gh/cubenlp/openai_api_call/branch/master/graph/badge.svg)](https://codecov.io/gh/cubenlp/openai_api_call)

<!-- 
[![Updates](https://pyup.io/repos/github/cubenlp/openai_api_call/shield.svg)](https://pyup.io/repos/github/cubenlp/openai_api_call/) 
-->

A Python wrapper for the OpenAI API, supporting multi-turn dialogue, proxies, and asynchronous data processing.

## Installation

```bash
pip install openai-api-call --upgrade
```

## Usage

### Set API Key and Base URL

Method 1: set them in Python code:

```python
import openai_api_call
openai_api_call.api_key = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
openai_api_call.base_url = "https://api.example.com"
```

Method 2: set environment variables in `~/.bashrc` or `~/.zshrc`:

```bash
export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_BASE_URL="https://api.example.com"
```
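When the variables are set in the shell, the conventional way for Python code to pick them up is `os.getenv`; a minimal sketch of that pattern (an illustration of the convention, not this package's internals):

```python
import os

# Read the key and base URL from the environment, with a fallback base URL
api_key = os.getenv("OPENAI_API_KEY", "")
base_url = os.getenv("OPENAI_BASE_URL", "https://api.openai.com")

if not api_key:
    print("Warning: OPENAI_API_KEY is not set")
```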

## Examples

Example 1: simulate a multi-turn dialogue:

```python
from openai_api_call import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()

# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()

# add response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# save the chat history
chat.save("chat.json", mode="w")  # mode defaults to "a" (append)

# print the chat history
chat.print_log()
```
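Under the hood, a multi-turn conversation like the one above corresponds to a list of role/content messages in the standard OpenAI chat format; a minimal, package-independent illustration:

```python
# The same conversation as a raw OpenAI-style message list
history = []

def add(role, content):
    history.append({"role": role, "content": content})

add("user", "Hello, GPT-3.5!")
add("assistant", "Hi there!")            # the model's first reply
add("user", "What's your name?")
add("assistant", "My name is GPT-3.5.")  # added manually, like chat.assistant()

# The full list is resent with every request; that is what makes the chat multi-turn.
```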

Example 2: process data in batch, using a checkpoint file:

```python
from openai_api_call import Chat, process_chats

# write a function that turns one message into a completed Chat
def msg2chat(msg):
    chat = Chat()  # uses the API key configured above
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]
# process the data
chats = process_chats(msgs[:5], msg2chat, checkpoint, clearfile=True)
# process the rest of the data, resuming from the checkpoint
continue_chats = process_chats(msgs, msg2chat, checkpoint)
```
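The checkpoint mechanism can be sketched independently of `process_chats`: completed items are appended to a JSONL file, and a rerun skips everything already recorded. A hypothetical illustration (`process_with_checkpoint` is not part of this package):

```python
import json
import os
import tempfile

def process_with_checkpoint(items, fn, path):
    """Append each result to `path` as JSON lines; a rerun skips cached lines."""
    done = 0
    if os.path.exists(path):
        with open(path) as f:
            done = sum(1 for _ in f)  # number of items already processed
    with open(path, "a") as f:
        for item in items[done:]:
            f.write(json.dumps({"input": item, "output": fn(item)}) + "\n")
    with open(path) as f:
        return [json.loads(line) for line in f]

path = os.path.join(tempfile.mkdtemp(), "demo.jsonl")
msgs = ["1", "2", "3"]
first = process_with_checkpoint(msgs[:2], lambda s: s * 2, path)  # processes "1", "2"
rest = process_with_checkpoint(msgs, lambda s: s * 2, path)       # resumes; only "3" is new
```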

Example 3: process data in batch asynchronously, printing hello in different languages with two coroutines:

```python
from openai_api_call import async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
chatlogs = ["print hello using %s" % lang for lang in langs]
async_chat_completion(chatlogs, chkpoint="async_chat.jsonl", ncoroutines=2)
chats = load_chats("async_chat.jsonl")
```
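Bounding the number of workers, as `ncoroutines` does, follows the standard `asyncio` semaphore pattern; a self-contained sketch with a stand-in for the API call (hypothetical, not the package's internals):

```python
import asyncio

async def run_all(inputs, worker, ncoroutines=2):
    """Run `worker` over all inputs with at most `ncoroutines` running at once."""
    sem = asyncio.Semaphore(ncoroutines)

    async def bounded(item):
        async with sem:  # blocks while ncoroutines workers are already active
            return await worker(item)

    return await asyncio.gather(*(bounded(i) for i in inputs))

async def fake_completion(prompt):
    await asyncio.sleep(0)  # stand-in for a real API call
    return f"response to: {prompt}"

langs = ["python", "java", "Julia", "C++"]
results = asyncio.run(run_all([f"print hello using {l}" for l in langs], fake_completion))
```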

## License

This package is licensed under the MIT license. See the LICENSE file for more details.

## Update log

Version `1.0.0` is a stable release, with the redundant `function call` feature removed and an asynchronous processing tool added.

### Beta version
- Since version `0.2.0`, the `Chat` type is used to handle data.
- Since version `0.3.0`, you can use different API keys to send requests.
- Since version `0.4.0`, this package is maintained by [cubenlp](https://github.com/cubenlp).
- Since version `0.5.0`, you can use `process_chats` to process data with a customized `msg2chat` function and a checkpoint file.
- Since version `0.6.0`, the [function call](https://platform.openai.com/docs/guides/gpt/function-calling) feature was added.


            
