openai-batch

Name: openai-batch
Version: 0.1.2
Summary: Make OpenAI batch easy to use.
Author: Parasail
Requires-Python: >=3.8
Keywords: openai, batch, chatgpt, gpt, llm, language model
Upload time: 2024-12-11 18:11:43
Requirements: none recorded
            [![Tests](https://github.com/parasail-ai/openai-batch/actions/workflows/tests.yml/badge.svg)](https://github.com/parasail-ai/openai-batch/actions/workflows/tests.yml)
![PyPI - Version](https://img.shields.io/pypi/v/openai-batch)

# openai-batch

Batch inference is an easy and inexpensive way to process thousands or millions of LLM requests.

The process is:
1. Write inference requests to an input file (see the example below)
2. Start a batch job
3. Wait for it to finish
4. Download the output
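
Each line of the input file is a standard OpenAI Batch API request object. A minimal sketch of writing such a file with plain Python (the model and prompts here are illustrative):

```python
import json

# Each request line targets the /v1/chat/completions endpoint and carries a
# unique custom_id so results can be matched back to the originating prompt.
requests = [
    {
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Hello!", "What is batch inference?"])
]

with open("input.jsonl", "w") as f:
    for request in requests:
        f.write(json.dumps(request) + "\n")
```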

This library aims to make these steps easier. The OpenAI Batch protocol is relatively simple, but it involves a lot of boilerplate; this library automates it.
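
For a sense of what is being automated, here is roughly what the same flow looks like against the raw `openai` Python SDK (a minimal sketch; the file names and polling interval are illustrative):

```python
import time

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Upload the input file
batch_file = client.files.create(file=open("input.jsonl", "rb"), purpose="batch")

# 2. Start the batch job
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)

# 3. Poll until the job reaches a terminal state
while batch.status not in ("completed", "failed", "expired", "cancelled"):
    time.sleep(60)
    batch = client.batches.retrieve(batch.id)

# 4. Download the output
if batch.output_file_id:
    client.files.content(batch.output_file_id).write_to_file("output.jsonl")
```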

#### Supported Providers

* [OpenAI](https://openai.com/) - ChatGPT, GPT-4o, etc.
* [Parasail](https://parasail.io/) - Most transformer models on Hugging Face, such as Llama, Qwen, LLaVA, etc.


## Command-Line Utilities

Use `openai_batch.run` to run a batch from an input file on disk:
```bash
python -m openai_batch.run input.jsonl
```

This will start the batch, wait for it to complete, then download the results.

Useful switches:
* `-c` Only create the batch, do not wait for it.
* `--resume` Attach to an existing batch job, wait for it to finish, then download the results.
* `--dry-run` Confirm your configuration without making an actual request.
* Full list: `python -m openai_batch.run --help`

### OpenAI Example
```bash
export OPENAI_API_KEY="<Your OpenAI API Key>"

# Create an example batch input file
python -m openai_batch.example_prompts | \
  python -m openai_batch.create_input --model 'gpt-4o-mini' > input.jsonl

# Run this batch (resumable with `--resume <BATCH_ID>`)
python -m openai_batch.run input.jsonl
```
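
Once the run finishes, each line of the downloaded results follows the OpenAI batch output schema. A small sketch for reading it (the output file name is an assumption; use whatever path the run produced):

```python
import json

with open("output.jsonl") as f:  # assumed output path
    for line in f:
        result = json.loads(line)
        custom_id = result["custom_id"]
        if result.get("error"):
            print(f"{custom_id}: failed -> {result['error']}")
            continue
        # The response body is a regular chat completion object
        completion = result["response"]["body"]
        print(f"{custom_id}: {completion['choices'][0]['message']['content']}")
```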

### Parasail Example

```bash
export PARASAIL_API_KEY="<Your Parasail API Key>"

# Create an example batch input file
python -m openai_batch.example_prompts | \
  python -m openai_batch.create_input --model 'meta-llama/Meta-Llama-3-8B-Instruct' > input.jsonl

# Run this batch (resumable with `--resume <BATCH_ID>`)
python -m openai_batch.run -p parasail input.jsonl
```

## Resources

* [OpenAI Batch Cookbook](https://cookbook.openai.com/examples/batch_processing)
* [OpenAI Batch API reference](https://platform.openai.com/docs/api-reference/batch)
* [OpenAI Files API reference](https://platform.openai.com/docs/api-reference/files)
* [Anthropic's Message Batches](https://www.anthropic.com/news/message-batches-api) - Uses a different API

            
