localopenai

Name: localopenai
Version: 0.27.6
Home page: https://github.com/lishoulong/openai-python
Summary: Python client library for the OpenAI API
Upload time: 2023-04-05 04:51:50
Author: OpenAI
Requires Python: >=3.7.1

# OpenAI Python Library

The OpenAI Python library provides convenient access to the OpenAI API
from applications written in the Python language. It includes a
pre-defined set of classes for API resources that initialize
themselves dynamically from API responses, which makes it compatible
with a wide range of versions of the OpenAI API.

You can find usage examples for the OpenAI Python library in our [API reference](https://beta.openai.com/docs/api-reference?lang=python) and the [OpenAI Cookbook](https://github.com/openai/openai-cookbook/).

## Installation

You don't need this source code unless you want to modify the package. If you just
want to use the package, run:

```sh
pip install --upgrade openai
```

Install from source with:

```sh
python setup.py install
```

### Optional dependencies

Install dependencies for [`openai.embeddings_utils`](openai/embeddings_utils.py):

```sh
pip install openai[embeddings]
```

Install support for [Weights & Biases](https://wandb.me/openai-docs):

```sh
pip install openai[wandb]
```

Data libraries like `numpy` and `pandas` are not installed by default due to their size. They’re needed for some functionality of this library, but generally not for talking to the API. If you encounter a `MissingDependencyError`, install them with:

```sh
pip install openai[datalib]
```

## Usage

The library needs to be configured with your account's secret key which is available on the [website](https://platform.openai.com/account/api-keys). Either set it as the `OPENAI_API_KEY` environment variable before using the library:

```bash
export OPENAI_API_KEY='sk-...'
```

Or set `openai.api_key` to its value:

```python
import openai
openai.api_key = "sk-..."

# list models
models = openai.Model.list()

# print the first model's id
print(models.data[0].id)

# create a completion
completion = openai.Completion.create(model="ada", prompt="Hello world")

# print the completion
print(completion.choices[0].text)
```


### Params
All endpoints have a `.create` method that supports a `request_timeout` param. This param accepts a `Union[float, Tuple[float, float]]`, and an `openai.error.Timeout` error is raised if the request exceeds that time in seconds (see: https://requests.readthedocs.io/en/latest/user/quickstart/#timeouts).
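
For example, a minimal sketch (reusing the `ada` completion call from above, with hypothetical timeout values):

```python
import openai
openai.api_key = "sk-..."  # supply your API key however you choose

# A single float is the overall timeout in seconds;
# openai.error.Timeout is raised if the request takes longer.
completion = openai.Completion.create(model="ada", prompt="Hello world", request_timeout=10)

# A (connect, read) tuple sets separate timeouts, as in the requests library.
completion = openai.Completion.create(model="ada", prompt="Hello world", request_timeout=(2.0, 10.0))
```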

### Microsoft Azure Endpoints

To use the library with Microsoft Azure endpoints, you need to set the `api_type`, `api_base` and `api_version` in addition to the `api_key`. The `api_type` must be set to `'azure'`, and the other values correspond to the properties of your endpoint.
In addition, the deployment name must be passed via the `deployment_id` parameter (as in the example below) or the equivalent `engine` parameter.

```python
import openai
openai.api_type = "azure"
openai.api_key = "..."
openai.api_base = "https://example-endpoint.openai.azure.com"
openai.api_version = "2023-03-15-preview"

# create a completion
completion = openai.Completion.create(deployment_id="deployment-name", prompt="Hello world")

# print the completion
print(completion.choices[0].text)
```

Please note that for the moment, the Microsoft Azure endpoints can only be used for completion, embedding, and fine-tuning operations.
For a detailed example of how to use fine-tuning and other operations using Azure endpoints, please check out the following Jupyter notebooks:
* [Using Azure completions](https://github.com/openai/openai-cookbook/tree/main/examples/azure/completions.ipynb)
* [Using Azure fine-tuning](https://github.com/openai/openai-cookbook/tree/main/examples/azure/finetuning.ipynb)
* [Using Azure embeddings](https://github.com/openai/openai-cookbook/blob/main/examples/azure/embeddings.ipynb)

### Microsoft Azure Active Directory Authentication

To use Microsoft Azure Active Directory to authenticate to your Azure endpoint, set the `api_type` to `"azure_ad"` and pass the acquired credential token as the `api_key`. The rest of the parameters need to be set as specified in the previous section.


```python
from azure.identity import DefaultAzureCredential
import openai

# Request credential
default_credential = DefaultAzureCredential()
token = default_credential.get_token("https://cognitiveservices.azure.com/.default")

# Setup parameters
openai.api_type = "azure_ad"
openai.api_key = token.token
openai.api_base = "https://example-endpoint.openai.azure.com/"
openai.api_version = "2023-03-15-preview"

# ...
```

### Command-line interface

This library additionally provides an `openai` command-line utility
which makes it easy to interact with the API from your terminal. Run
`openai api -h` for usage.

```sh
# list models
openai api models.list

# create a completion
openai api completions.create -m ada -p "Hello world"

# create a chat completion
openai api chat_completions.create -m gpt-3.5-turbo -g user "Hello world"

# generate images via DALL·E API
openai api image.create -p "two dogs playing chess, cartoon" -n 1
```

## Example code

Examples of how to use this Python library to accomplish various tasks can be found in the [OpenAI Cookbook](https://github.com/openai/openai-cookbook/). It contains code examples for:

* Classification using fine-tuning
* Clustering
* Code search
* Customizing embeddings
* Question answering from a corpus of documents
* Recommendations
* Visualization of embeddings
* And more

Prior to July 2022, this OpenAI Python library hosted code examples in its examples folder, but since then all examples have been migrated to the [OpenAI Cookbook](https://github.com/openai/openai-cookbook/).

### Chat

Conversational models such as `gpt-3.5-turbo` can be called using the chat completions endpoint.

```python
import openai
openai.api_key = "sk-..."  # supply your API key however you choose

completion = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hello world!"}])
print(completion.choices[0].message.content)
```

### Embeddings

In the OpenAI Python library, an embedding represents a text string as a fixed-length vector of floating point numbers. Embeddings are designed to measure the similarity or relevance between text strings.

To get an embedding for a text string, you can use the embeddings method as follows in Python:

```python
import openai
openai.api_key = "sk-..."  # supply your API key however you choose

# choose text to embed
text_string = "sample text"

# choose an embedding
model_id = "text-similarity-davinci-001"

# compute the embedding of the text
embedding = openai.Embedding.create(input=text_string, model=model_id)['data'][0]['embedding']
```
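
Since embeddings are designed for measuring similarity, a common next step is to compare two embedding vectors with cosine similarity. Below is a minimal sketch using `numpy` (see the `datalib` extra above); the sample strings are hypothetical:

```python
import numpy as np
import openai

openai.api_key = "sk-..."  # supply your API key however you choose

def embed(text, model_id="text-similarity-davinci-001"):
    # Return the embedding vector for a single text string as a numpy array
    resp = openai.Embedding.create(input=text, model=model_id)
    return np.array(resp["data"][0]["embedding"])

a = embed("The cat sat on the mat.")
b = embed("A feline rested on the rug.")

# Cosine similarity: dot product of the L2-normalized vectors
similarity = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(similarity)
```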

An example of how to call the embeddings method is shown in this [get embeddings notebook](https://github.com/openai/openai-cookbook/blob/main/examples/Get_embeddings.ipynb).

Examples of how to use embeddings are shared in the following Jupyter notebooks:

- [Classification using embeddings](https://github.com/openai/openai-cookbook/blob/main/examples/Classification_using_embeddings.ipynb)
- [Clustering using embeddings](https://github.com/openai/openai-cookbook/blob/main/examples/Clustering.ipynb)
- [Code search using embeddings](https://github.com/openai/openai-cookbook/blob/main/examples/Code_search.ipynb)
- [Semantic text search using embeddings](https://github.com/openai/openai-cookbook/blob/main/examples/Semantic_text_search_using_embeddings.ipynb)
- [User and product embeddings](https://github.com/openai/openai-cookbook/blob/main/examples/User_and_product_embeddings.ipynb)
- [Zero-shot classification using embeddings](https://github.com/openai/openai-cookbook/blob/main/examples/Zero-shot_classification_with_embeddings.ipynb)
- [Recommendation using embeddings](https://github.com/openai/openai-cookbook/blob/main/examples/Recommendation_using_embeddings.ipynb)

For more information on embeddings and the types of embeddings OpenAI offers, read the [embeddings guide](https://beta.openai.com/docs/guides/embeddings) in the OpenAI documentation.

### Fine-tuning

Fine-tuning a model on training data can both improve the results (by giving the model more examples to learn from) and reduce the cost/latency of API calls (chiefly through reducing the need to include training examples in prompts).
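
As a minimal sketch of this flow using the library's resource classes (assuming a prepared JSONL training file at the hypothetical path `training_data.jsonl`):

```python
import openai
openai.api_key = "sk-..."  # supply your API key however you choose

# Upload the prepared JSONL training file (hypothetical local path)
training_file = openai.File.create(file=open("training_data.jsonl", "rb"), purpose="fine-tune")

# Start a fine-tuning job on the uploaded file
fine_tune = openai.FineTune.create(training_file=training_file.id, model="ada")

# Check on the job later by id
print(openai.FineTune.retrieve(fine_tune.id).status)
```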

Examples of fine-tuning are shared in the following Jupyter notebooks:

- [Classification with fine-tuning](https://github.com/openai/openai-cookbook/blob/main/examples/Fine-tuned_classification.ipynb) (a simple notebook that shows the steps required for fine-tuning)
- Fine-tuning a model that answers questions about the 2020 Olympics
  - [Step 1: Collecting data](https://github.com/openai/openai-cookbook/blob/main/examples/fine-tuned_qa/olympics-1-collect-data.ipynb)
  - [Step 2: Creating a synthetic Q&A dataset](https://github.com/openai/openai-cookbook/blob/main/examples/fine-tuned_qa/olympics-2-create-qa.ipynb)
  - [Step 3: Train a fine-tuning model specialized for Q&A](https://github.com/openai/openai-cookbook/blob/main/examples/fine-tuned_qa/olympics-3-train-qa.ipynb)

Sync your fine-tunes to [Weights & Biases](https://wandb.me/openai-docs) to track experiments, models, and datasets in your central dashboard with:

```bash
openai wandb sync
```

For more information on fine-tuning, read the [fine-tuning guide](https://beta.openai.com/docs/guides/fine-tuning) in the OpenAI documentation.

### Moderation

OpenAI provides a Moderation endpoint that can be used to check whether content complies with the OpenAI [content policy](https://platform.openai.com/docs/usage-policies).

```python
import openai
openai.api_key = "sk-..."  # supply your API key however you choose

moderation_resp = openai.Moderation.create(input="Here is some perfectly innocuous text that follows all OpenAI content policies.")
```
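
The response contains a `results` list; a minimal sketch of inspecting it (assuming the standard moderation response shape):

```python
result = moderation_resp["results"][0]
print(result["flagged"])     # True if the input violates the content policy
print(result["categories"])  # per-category boolean flags
```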

See the [moderation guide](https://platform.openai.com/docs/guides/moderation) for more details.

## Image generation (DALL·E)

```python
import openai
openai.api_key = "sk-..."  # supply your API key however you choose

image_resp = openai.Image.create(prompt="two dogs playing chess, oil painting", n=4, size="512x512")
```
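
The response contains a `data` list with one entry per generated image; a minimal sketch of reading the hosted image URLs (assuming the default URL `response_format`):

```python
for image in image_resp["data"]:
    print(image["url"])  # temporary URL of the generated image
```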

## Audio transcription (Whisper)

```python
import openai
openai.api_key = "sk-..."  # supply your API key however you choose

# Open the audio file in binary mode and transcribe it with Whisper
with open("path/to/file.mp3", "rb") as f:
    transcript = openai.Audio.transcribe("whisper-1", f)
```

## Async API

Async support is available in the API by prepending `a` to a network-bound method:

```python
import asyncio

import openai
openai.api_key = "sk-..."  # supply your API key however you choose

async def create_completion():
    completion_resp = await openai.Completion.acreate(prompt="This is a test", model="davinci")
    print(completion_resp.choices[0].text)

asyncio.run(create_completion())
```

To make async requests more efficient, you can pass in your own
`aiohttp.ClientSession`, but you must manually close the client session at the end
of your program/event loop:

```python
import openai
from aiohttp import ClientSession

openai.aiosession.set(ClientSession())
# At the end of your program, close the http session
await openai.aiosession.get().close()
```

See the [usage guide](https://platform.openai.com/docs/guides/images) for more details.

## Requirements

- Python 3.7.1+

In general, we want to support the versions of Python that our
customers are using. If you run into problems with any version
issues, please let us know on our [support page](https://help.openai.com/en/).

## Credit

This library is forked from the [Stripe Python Library](https://github.com/stripe/stripe-python).

            
