OpenaiBatchAPI

Name: OpenaiBatchAPI
Version: 1.1.2
Home page: https://github.com/it-dainb/batch_api.git
Summary: OpenaiBatchAPI: A Python Library that supports OpenAI Batch API
Upload time: 2024-11-07 10:39:52
Author: IT.DAINB
Requires Python: >=3.8, <=3.12
License: Apache License 2.0
Keywords: openai, batch_api, batch, api
Requirements: none recorded
# OpenaiBatchAPI: A Python Library for OpenAI Batch API
[OpenAI Batch API Documentation](https://platform.openai.com/docs/guides/batch)

**Note:** *Currently supports only `gpt-4o` and `gpt-4o-mini` models.*
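Under the hood, the OpenAI Batch API consumes JSONL files in which each line describes one request. A minimal sketch of one such line (field names follow the OpenAI Batch API documentation; the `custom_id` and body values here mirror the README example and are illustrative only):

```python
import json

# One request line in the JSONL file submitted to the Batch API.
# The custom_id is caller-chosen and is echoed back in the output file.
request_line = {
    "custom_id": "calc_0",
    "method": "POST",
    "url": "/v1/chat/completions",
    "body": {
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Calculate: 1 + 0 = "}],
        "max_completion_tokens": 32,
    },
}

# Each line of the batch input file is one such JSON object.
print(json.dumps(request_line))
```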

## Installation

Install the package from PyPI using [pip](http://www.pip-installer.org):

```bash
$ pip install OpenaiBatchAPI
```

## Usage Example

### Basic Example

```python
from openai_batch_api import OpenaiBatchAPI

# Initialize the Batch API client
batch_client = OpenaiBatchAPI(
    api_key="API_TOKEN",
    max_retry=3,
    timeout=5 * 60  # Timeout in seconds
)

messages = []

# Add messages with custom IDs (e.g., "calc_{i}")
for i in range(7):
    messages.append({
        "id": f"calc_{i}",
        "content": [
            {
                "role": "user",
                "content": f"Calculate: 1 + {i} = "
            }
        ]
    })

# Add messages with auto-generated IDs (index-based)
for i in range(7):
    messages.append({
        "role": "user",
        "content": f"Calculate: 1 + {i} = "
    })

# Execute batch completion
batches = batch_client.batchs_completion(
    messages,
    max_completion_tokens=32,
    model="gpt-4o-mini",
    temperature=0,
    seed=42,
    batch_size=2
)

# Print batch outputs
for batch in batches:
    print("BATCH ID:", batch.id)
    print("FILE ID:", batch.output_file.id)
    for content in batch.output_file.contents:
        print("Message ID:", content["id"])
        print("Response:", content["choices"][0]["message"]["content"])
    print()
```
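The `batch_size` argument controls how many messages go into each submitted batch. A plain-Python sketch of that grouping (an assumption about the library's behavior, not its actual code):

```python
def chunk(items, batch_size):
    """Split a list into consecutive groups of at most batch_size items."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

# 7 messages with batch_size=2 yield 4 groups (the last holds a single item),
# which matches the 4 batches shown in the sample output.
groups = chunk([f"calc_{i}" for i in range(7)], 2)
print(len(groups))   # 4
print(groups[-1])    # ['calc_6']
```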

### Output Sample

The script will output results similar to this:

```plaintext
Preparing: 100%|███████████████████████| 4/4 [00:10<00:00,  2.74s/it]
Sending: 100%|█████████████████████████| 4/4 [00:02<00:00,  1.60it/s]
Processing: 100%|██████████████████████| 4/4 [00:12<00:00,  3.10s/it]

BATCH ID: batch_672b237c62f88190b0d69cbf789eb754
FILE ID: file-IlJUOenuE8p7okKB3TX2ANKW
Message ID: calc_0
Response: 1 + 0 = 1
Message ID: calc_1
Response: 1 + 1 = 2.

BATCH ID: batch_672b237ce4fc8190813e7b1239f0336d
FILE ID: file-hTFNk0uBC37plDaWtgYtD0Dr
Message ID: calc_2
Response: 1 + 2 = 3.
Message ID: calc_3
Response: 1 + 3 = 4.

BATCH ID: batch_672b237d937c8190a2ec942aa9401732
FILE ID: file-AbzBSJGl2pPU1B6KPmMuIDBB
Message ID: calc_4
Response: 1 + 4 = 5.
Message ID: calc_5
Response: 1 + 5 = 6.

BATCH ID: batch_672b237e076c8190807819fde4cdd093
FILE ID: file-jRAePmaheNSNAWYcawdmSgYF
Message ID: calc_6
Response: 1 + 6 = 7.
```
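Each FILE ID above refers to a JSONL output file in which every line pairs a `custom_id` with its chat completion. A simplified sketch of parsing one such line (the structure follows the OpenAI Batch API documentation; real lines carry additional fields such as error information):

```python
import json

# A simplified output-file line, modeled on the Batch API response format.
line = json.dumps({
    "custom_id": "calc_0",
    "response": {
        "status_code": 200,
        "body": {"choices": [{"message": {"role": "assistant", "content": "1 + 0 = 1"}}]},
    },
})

record = json.loads(line)
answer = record["response"]["body"]["choices"][0]["message"]["content"]
print(record["custom_id"], "->", answer)  # prints: calc_0 -> 1 + 0 = 1
```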

### Usage Statistics

The `print_usage_table` method outputs a table summarizing token usage and cost.

```python
batch_client.print_usage_table(digits=8)
```

This outputs:

```plaintext
+------------+--------+-------------+
| Category   | Tokens |       Price |
+------------+--------+-------------+
| Prompt     |    119 | $0.00000892 |
| Completion |     55 | $0.00001650 |
+------------+--------+-------------+
| Total      |    174 | $0.00002543 |
+------------+--------+-------------+
```
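These figures are consistent with gpt-4o-mini Batch API pricing at the time of writing ($0.075 per 1M input tokens and $0.300 per 1M output tokens, i.e. half the synchronous rates). A quick arithmetic check; the rates are an assumption and may change:

```python
# Assumed gpt-4o-mini Batch API rates, in dollars per token.
PROMPT_RATE = 0.075 / 1_000_000
COMPLETION_RATE = 0.300 / 1_000_000

prompt_cost = 119 * PROMPT_RATE          # ~$0.00000892
completion_cost = 55 * COMPLETION_RATE   # $0.00001650
total = prompt_cost + completion_cost    # ~$0.00002543, matching the table
```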

            
