# Azure OpenAI Limits
A Python package that provides static token limits for Azure OpenAI models, including context and output limits for various model versions.
## Installation
```bash
pip install azure-openai-limits
```
## Usage
### Python API
```python
from azure_openai_limits import get_limits, context_limit, output_limit
# Get limits for a model (uses default version)
limits = get_limits("gpt-4o")
print(f"Context: {limits.context}, Output: {limits.output}")
# Get limits for a specific version
limits = get_limits("gpt-4o", "2024-05-13")
print(f"Context: {limits.context}, Output: {limits.output}")
# Get limits for the o3-pro model
limits = get_limits("o3-pro")
print(f"O3 Pro - Context: {limits.context}, Output: {limits.output}")
# Check Codex Mini limits
limits = get_limits("codex-mini")
print(f"Codex Mini - Context: {limits.context}, Output: {limits.output}")
# Get just context or output limits
context = context_limit("gpt-4o")
output = output_limit("gpt-4o")
# Check if a model/version exists
from azure_openai_limits import model_exists
if model_exists("gpt-4o", "2024-08-06"):
    print("Model version exists!")
# List all available models
from azure_openai_limits import list_models
models = list_models()
print("Available models:", models)
# List all versions for a model
from azure_openai_limits import list_versions
versions = list_versions("gpt-4o")
print("Available versions:", versions)
```
### Command Line Interface
```bash
# List all available models and their versions
azure-openai-limits list
# Show limits for a specific model (default version)
azure-openai-limits show gpt-4o
# Show limits for a specific model version
azure-openai-limits show gpt-4o --version 2024-05-13
# Show limits for the o3-pro model
azure-openai-limits show o3-pro
```
## Supported Models
The package includes limits for the following Azure OpenAI models:
- **Codex**: codex-mini
- **GPT-3.5**: gpt-35-turbo, gpt-35-turbo-instruct
- **GPT-4**: gpt-4, gpt-4-32k
- **GPT-4.1**: gpt-4.1, gpt-4.1-mini, gpt-4.1-nano
- **GPT-4.5**: gpt-4.5-preview
- **GPT-4o**: gpt-4o, gpt-4o-mini
- **GPT-4o Audio**: gpt-4o-audio-preview, gpt-4o-mini-audio-preview
- **GPT-4o Realtime**: gpt-4o-realtime-preview, gpt-4o-mini-realtime-preview
- **GPT-5**: gpt-5, gpt-5-mini, gpt-5-nano, gpt-5-chat
- **GPT OSS**: gpt-oss-20b, gpt-oss-120b
- **O1 Models**: o1, o1-mini, o1-preview
- **O3 Models**: o3, o3-mini, o3-pro
- **O4 Models**: o4-mini
- **Model Router**: model-router
Each model may have multiple versions with different limits. Use `azure-openai-limits list` to see all available models and their versions.
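Because Azure OpenAI version strings are ISO-style dates (e.g. `2024-05-13`), they sort correctly as plain strings, so the newest version of a model can be picked without any date parsing. A minimal sketch, using a hand-rolled sample in place of the package's real data:

```python
# Sample stand-in for list_versions(); the real data comes from the package.
sample_versions = {
    "gpt-4o": ["2024-05-13", "2024-08-06", "2024-11-20"],
}

def latest_version(model: str, versions: dict[str, list[str]]) -> str:
    """Return the lexicographically greatest (i.e. newest) version string."""
    return max(versions[model])

print(latest_version("gpt-4o", sample_versions))  # → 2024-11-20
```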
## Data Source
The model limits data in this package is sourced from the official Microsoft documentation:
[Azure AI Foundry OpenAI Models](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/concepts/models)
**Last Updated:** August 19, 2025
## API Reference
### Classes
#### `Limits`
A dataclass representing token limits for a model.
**Attributes:**
- `context: int` - Maximum context tokens (input + conversation history)
- `output: int` - Maximum output tokens that can be generated
- `total_max: int` - Property returning context + output
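A plausible shape for this dataclass, given the attributes above (an illustrative sketch, not the package's actual source):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Limits:
    context: int  # maximum context tokens (input + conversation history)
    output: int   # maximum output tokens that can be generated

    @property
    def total_max(self) -> int:
        # Per the reference above: the sum of the context and output budgets.
        return self.context + self.output

limits = Limits(context=128_000, output=16_384)
print(limits.total_max)  # → 144384
```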
### Functions
#### `get_limits(model: str, version: str | None = None) -> Limits`
Get context and output limits for a specific model.
#### `context_limit(model: str, version: str | None = None) -> int`
Get just the context limit for a model.
#### `output_limit(model: str, version: str | None = None) -> int`
Get just the output limit for a model.
#### `model_exists(model: str, version: str | None = None) -> bool`
Check if a model (and optionally version) exists.
#### `list_models() -> list[str]`
Get a sorted list of all available model names.
#### `list_versions(model: str) -> list[str]`
Get a sorted list of all versions for a specific model.
#### `all_models() -> dict[str, dict[str, Limits]]`
Get the complete model data structure.
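The nested mapping returned by `all_models()` makes it easy to scan every model and version at once, e.g. to find the largest context window. A sketch against a small stand-in for that structure (the values are illustrative, not the package's actual numbers):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Limits:
    context: int
    output: int

# Stand-in for all_models(); the real call returns the full table.
data = {
    "gpt-4o": {"2024-08-06": Limits(128_000, 16_384)},
    "gpt-4": {"0613": Limits(8_192, 4_096)},
}

# Find the (model, version) pair with the largest context window.
best = max(
    ((m, v, lim.context) for m, versions in data.items()
     for v, lim in versions.items()),
    key=lambda t: t[2],
)
print(best)  # → ('gpt-4o', '2024-08-06', 128000)
```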
## Error Handling
The package raises appropriate exceptions:
- `KeyError`: When a model or version is not found
- `ValueError`: When invalid input is provided (e.g., empty model name)
All error messages include helpful information about available models/versions.
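Callers that prefer a fallback over an exception can wrap lookups accordingly. A sketch of that pattern with a toy table standing in for the package (`safe_context_limit` is a hypothetical helper, not part of the API):

```python
# Toy table standing in for the package's data; illustrative values only.
_LIMITS = {"gpt-4o": 128_000}

def context_limit(model: str) -> int:
    """Mimics the documented error behavior: KeyError lists known models."""
    if not model:
        raise ValueError("model name must be non-empty")
    try:
        return _LIMITS[model]
    except KeyError:
        raise KeyError(
            f"unknown model {model!r}; available: {sorted(_LIMITS)}"
        ) from None

def safe_context_limit(model: str, default: int = 0) -> int:
    """Return the context limit, or a default if the model is unknown."""
    try:
        return context_limit(model)
    except (KeyError, ValueError):
        return default

print(safe_context_limit("gpt-4o"))  # → 128000
print(safe_context_limit("gpt-99"))  # → 0
```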
## Development
To set up for development:
```bash
git clone https://github.com/taylorn-ai/azure-openai-limits.git
cd azure-openai-limits
pip install -e ".[dev]"
```
Run tests:
```bash
pytest
```
Run linting:
```bash
ruff check src/
mypy src/
```
## License
MIT License