Name | cmdrdata-openai
Version | 0.2.1
home_page | None
Summary | Customer tracking and usage-based billing for OpenAI APIs with arbitrary metadata support
upload_time | 2025-08-08 07:27:37
maintainer | None
docs_url | None
author | None
requires_python | >=3.9
license | MIT License
Copyright (c) 2025 cmdrdata-ai
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
keywords | openai, ai, customer-tracking, usage-based-billing, fine-grained-billing, tokens, metadata, sdk, wrapper
VCS |
bugtrack_url |
requirements | No requirements were recorded.
Travis-CI | No Travis.
coveralls test coverage | No coveralls.
# cmdrdata-openai
[CI](https://github.com/cmdrdata-ai/cmdrdata-openai/actions/workflows/ci.yml) · [Coverage](https://codecov.io/gh/cmdrdata-ai/cmdrdata-openai) · [PyPI version](https://badge.fury.io/py/cmdrdata-openai) · [License: MIT](https://opensource.org/licenses/MIT) · [PyPI project](https://pypi.org/project/cmdrdata-openai/) · [Downloads](https://pepy.tech/project/cmdrdata-openai) · [Code style: black](https://github.com/psf/black)
**Customer tracking and usage-based billing for OpenAI APIs**
Transform your OpenAI integration into a customer-aware, usage-based billing system. Track exactly what each customer consumes and bill them accordingly with fine-grained precision.
## 🛡️ Production Ready
**Extremely robust and reliable** - Built for production environments with:
- **Resilient Tracking:** OpenAI calls succeed even if tracking fails.
- **Non-blocking I/O:** Fire-and-forget tracking never slows down your application.
- **Automatic Retries:** Failed tracking attempts are automatically retried with exponential backoff (see the sketch after this list).
- **Thread-Safe Context:** Safely track usage across multi-threaded and async applications.
- **Enterprise Security:** API key sanitization and input validation.
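To make the resilience guarantees concrete, here is a conceptual sketch of a fire-and-forget tracking call with exponential-backoff retries. It illustrates the approach described above, not the library's actual internals; the `track_in_background` helper, endpoint, and auth header are assumptions.

```python
import logging
import threading
import time

import requests

logger = logging.getLogger(__name__)

def track_in_background(event: dict, endpoint: str, api_key: str,
                        max_retries: int = 3, base_delay: float = 0.5) -> None:
    """Fire-and-forget: send a usage event on a daemon thread, retrying with
    exponential backoff, and never raise back to the calling code."""
    def _send() -> None:
        for attempt in range(max_retries):
            try:
                resp = requests.post(
                    endpoint,
                    json=event,
                    headers={"Authorization": f"Bearer {api_key}"},  # assumed auth scheme
                    timeout=5.0,
                )
                resp.raise_for_status()
                return  # delivered successfully
            except Exception as exc:  # tracking failures must never break the caller
                logger.warning("Tracking attempt %d failed: %s", attempt + 1, exc)
                time.sleep(base_delay * (2 ** attempt))  # exponential backoff
        logger.error("Dropping usage event after %d failed attempts", max_retries)

    threading.Thread(target=_send, daemon=True).start()
```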
## 💰 Customer Tracking & Usage-Based Billing
`cmdrdata-openai` enables **fine-grained customer tracking** and **usage-based billing** for your AI application:
### **Customer-Level Visibility**
- **Per-customer token consumption** - Track exactly how much each customer uses
- **Usage attribution** - Every API call is attributed to a specific customer
- **Customer context management** - Automatic customer tracking across your application
### **Fine-Grained Billing Control**
- **Custom pricing models** - Set your own rates beyond simple token counts
- **Arbitrary metadata tracking** - Attach any billing-relevant data to each API call
- **Multi-dimensional billing** - Bill based on tokens, requests, models, or custom metrics
- **Real-time usage monitoring** - Track costs and usage as they happen
### **What Gets Tracked**
- **Token usage** (input/output tokens for accurate billing)
- **Model information** (gpt-5, gpt-4o, gpt-4, gpt-3.5-turbo, etc.)
- **Customer identification** (your customer IDs)
- **Custom metadata** (request types, feature usage, geographic data, etc.)
- **Performance metrics** (response times, error rates)
## 🚀 Quick Start
### 1. Install
```bash
pip install cmdrdata-openai
```
**Note**: This package wraps the official OpenAI SDK. If you already have `openai` installed, CmdrData will use your existing version. If not, it will install a compatible version automatically. [Learn more about dependency management →](docs/DEPENDENCY_MANAGEMENT.md)
### 2. Replace Your OpenAI Import
It's a drop-in replacement. All you need to do is change how you initialize the client and add the `customer_id` to your API calls.
**Before:**
```python
from openai import OpenAI

# This client is not tracked
client = OpenAI(api_key="sk-...")
```
**After:**
```python
from cmdrdata_openai import TrackedOpenAI

# This client automatically tracks usage
client = TrackedOpenAI(
    api_key="sk-...",
    tracker_key="tk-..."  # Get this from your cmdrdata dashboard
)

# Add customer_id to your calls to enable tracking
response = client.chat.completions.create(
    model="gpt-5",  # Supports GPT-5, GPT-4o, GPT-4, etc.
    messages=[{"role": "user", "content": "Hello!"}],
    customer_id="customer-123"
)
```
That's it! **Every API call now automatically tracks token usage, performance, and errors.**
## 📖 Usage Patterns
### Flask/FastAPI Integration
```python
from flask import Flask, request, jsonify
from cmdrdata_openai import TrackedOpenAI, set_customer_context, clear_customer_context

app = Flask(__name__)
client = TrackedOpenAI(
    api_key="your-openai-key",
    tracker_key="your-cmdrdata-key"
)

@app.route('/chat', methods=['POST'])
def chat():
    data = request.json
    customer_id = data['customer_id']

    # Set context for this request
    set_customer_context(customer_id)

    try:
        response = client.chat.completions.create(
            model="gpt-5",
            messages=[{"role": "user", "content": data['message']}]
        )
        return jsonify({"response": response.choices[0].message.content})
    finally:
        clear_customer_context()
```
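The same pattern carries over to FastAPI. The sketch below uses the same context helpers; the request model and route shape are illustrative, not prescribed by the library.

```python
from fastapi import FastAPI
from pydantic import BaseModel

from cmdrdata_openai import TrackedOpenAI, set_customer_context, clear_customer_context

app = FastAPI()
client = TrackedOpenAI(
    api_key="your-openai-key",
    tracker_key="your-cmdrdata-key"
)

class ChatRequest(BaseModel):
    customer_id: str
    message: str

@app.post("/chat")
def chat(req: ChatRequest):
    # Set context for this request, and always clear it afterwards
    set_customer_context(req.customer_id)
    try:
        response = client.chat.completions.create(
            model="gpt-5",
            messages=[{"role": "user", "content": req.message}]
        )
        return {"response": response.choices[0].message.content}
    finally:
        clear_customer_context()
```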
### Context Manager (Automatic Cleanup)
```python
from cmdrdata_openai import customer_context

with customer_context("customer-456"):
    response1 = client.chat.completions.create(...)
    response2 = client.chat.completions.create(...)
    # Both calls tracked for customer-456
# Context automatically cleared
```
### Async Support
```python
from cmdrdata_openai import AsyncTrackedOpenAI

client = AsyncTrackedOpenAI(
    api_key="your-openai-key",
    tracker_key="your-cmdrdata-key"
)

response = await client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Hello!"}],
    customer_id="customer-789"
)
```
### 💎 Fine-Grained Billing with Custom Metadata
Track arbitrary metadata with each API call to enable sophisticated billing models:
```python
# Example: SaaS application with feature-based billing
response = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Analyze this data..."}],
    customer_id="customer-123",
    # Custom metadata for fine-grained billing
    custom_metadata={
        "feature": "data_analysis",
        "plan_tier": "premium",
        "region": "us-east",
        "request_size": "large",
        "processing_type": "batch"
    }
)

# Example: Usage-based pricing by request complexity
response = client.chat.completions.create(
    model="gpt-5",
    messages=long_conversation_history,
    customer_id="customer-456",
    custom_metadata={
        "request_complexity": "high",
        "conversation_length": len(long_conversation_history),
        "business_unit": "sales",
        "priority": "high"
    }
)
```
**Billing Use Cases:**
- **Feature-based pricing**: Bill differently for different app features
- **Complexity-based pricing**: Higher rates for complex requests
- **Geographic pricing**: Different rates by customer region
- **Plan-tier pricing**: Premium customers pay different rates
- **Volume discounts**: Track cumulative usage for volume pricing
- **Department billing**: Track usage by business unit or team
## 🔧 Configuration
### Basic Configuration
```python
client = TrackedOpenAI(
    api_key="your-openai-key",                              # OpenAI API key
    tracker_key="your-cmdrdata-key",                        # cmdrdata API key
    tracker_endpoint="https://api.cmdrdata.ai/api/events",  # cmdrdata endpoint
    tracker_timeout=5.0                                     # Timeout for tracking requests
)
```
### Environment Variables
```bash
export OPENAI_API_KEY="your-openai-key"
export CMDRDATA_API_KEY="your-cmdrdata-key"
```
```python
import os

client = TrackedOpenAI(
    api_key=os.getenv("OPENAI_API_KEY"),
    tracker_key=os.getenv("CMDRDATA_API_KEY")
)
```
## 🎛️ Advanced Features
### Disable Tracking for Specific Calls
```python
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Internal query"}],
    track_usage=False  # This call won't be tracked
)
```
### Priority System
Customer ID resolution follows this priority:
1. **Explicit `customer_id` parameter** (highest priority)
2. **Customer ID from context**
3. **No tracking** (warning logged)
```python
set_customer_context("context-customer")

# This will be tracked for "explicit-customer"
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
    customer_id="explicit-customer"  # Overrides context
)
```
### Error Handling
cmdrdata-openai is designed to never break your OpenAI calls:
- **Tracking failures are logged but don't raise exceptions**
- **OpenAI calls proceed normally even if tracking fails**
- **Background tracking doesn't block your application**
```python
# Even if cmdrdata is down, this still works
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
    customer_id="customer-123"
)
# OpenAI call succeeds, tracking failure is logged
```
## 📊 What Gets Tracked
For each OpenAI API call, cmdrdata-openai automatically tracks:
- **Customer ID** (from parameter or context)
- **Model used** (e.g., gpt-4, gpt-3.5-turbo)
- **Token usage** (input tokens, output tokens, total tokens)
- **Provider** (openai)
- **Timestamp** (when the call was made)
- **Metadata** (response ID, finish reason, etc.)
Example tracked event:
```json
{
  "customer_id": "customer-123",
  "model": "gpt-4",
  "input_tokens": 15,
  "output_tokens": 25,
  "total_tokens": 40,
  "provider": "openai",
  "timestamp": "2025-07-04T10:30:00Z",
  "metadata": {
    "response_id": "chatcmpl-abc123",
    "finish_reason": "stop"
  }
}
```
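To illustrate how events in this shape can drive the usage-based billing described earlier, here is a hypothetical aggregation with made-up per-model rates. The rate table and the source of `events` are assumptions, not part of cmdrdata-openai.

```python
from collections import defaultdict

# Hypothetical rates (per 1K tokens) that you define for your own pricing model
RATES_PER_1K = {
    "gpt-4": {"input": 0.03, "output": 0.06},
    "gpt-3.5-turbo": {"input": 0.0005, "output": 0.0015},
}

def bill_per_customer(events: list[dict]) -> dict[str, float]:
    """Aggregate tracked events (shaped like the example above) into a
    per-customer charge using your own per-model rates."""
    totals: dict[str, float] = defaultdict(float)
    for event in events:
        rates = RATES_PER_1K.get(event["model"])
        if rates is None:
            continue  # unknown model: skip it, or apply a default rate
        cost = (
            event["input_tokens"] / 1000 * rates["input"]
            + event["output_tokens"] / 1000 * rates["output"]
        )
        totals[event["customer_id"]] += cost
    return dict(totals)

# With the example event above (15 input + 25 output tokens on gpt-4),
# this works out to roughly $0.00195 for customer-123.
```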
## 🔧 How It Works
CmdrData-OpenAI uses a **proxy pattern** to wrap your existing OpenAI client (a simplified sketch follows at the end of this section):
1. **You import CmdrData**: `from cmdrdata_openai import TrackedOpenAI`
2. **CmdrData imports OpenAI**: Uses your installed `openai` package
3. **Creates a wrapper**: Wraps the OpenAI client with tracking
4. **Forwards everything**: All OpenAI methods work exactly the same
5. **Tracks usage**: Intercepts responses to track token usage
**This means**:
- ✅ No conflicts with your OpenAI version
- ✅ All OpenAI features continue working
- ✅ You can upgrade OpenAI independently
- ✅ Negligible performance overhead (tracking runs asynchronously, off the request path)
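As a rough illustration of the forwarding idea, here is a toy sketch (deliberately simplified, not the actual cmdrdata implementation): wrap the real client, forward attribute access, and intercept `chat.completions.create` to record usage after the call returns.

```python
from openai import OpenAI

class MiniTrackedOpenAI:
    """Toy proxy: forwards everything to a real OpenAI client and records
    token usage from chat.completions.create responses."""

    def __init__(self, api_key: str, on_usage):
        self._client = OpenAI(api_key=api_key)
        self._on_usage = on_usage  # callback that ships an event to your tracker

    def __getattr__(self, name):
        # Anything not overridden here is forwarded to the real client,
        # so the rest of the OpenAI surface keeps working unchanged.
        return getattr(self._client, name)

    @property
    def chat(self):
        proxy, real_chat = self, self._client.chat

        class _Completions:
            def create(self, *, customer_id=None, **kwargs):
                response = real_chat.completions.create(**kwargs)
                if customer_id is not None and response.usage is not None:
                    proxy._on_usage({
                        "customer_id": customer_id,
                        "model": response.model,
                        "input_tokens": response.usage.prompt_tokens,
                        "output_tokens": response.usage.completion_tokens,
                    })
                return response

        class _Chat:
            completions = _Completions()

        return _Chat()
```

The real package layers context resolution, async clients, retries, and custom metadata on top, but the principle is the same: forward the full OpenAI surface and read token usage off the response.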
## 🔌 Compatibility
- **OpenAI Models**: Full support for GPT-5, GPT-4o, GPT-4, GPT-3.5, DALL-E, Whisper, and all OpenAI models
- **OpenAI SDK**: Compatible with OpenAI SDK v1.0.0+ (tested with 1.99.0+)
- **Python**: Supports Python 3.9, 3.10, 3.11, 3.12, and 3.13
- **Async**: Full support for both sync and async usage
- **Frameworks**: Works with Flask, FastAPI, Django, etc.
## 📦 Installation
```bash
# Basic installation
pip install cmdrdata-openai
# For development
git clone https://github.com/cmdrdata-ai/cmdrdata-openai.git
cd cmdrdata-openai
uv pip install -e .[dev]
```
## 🛠️ Development
### Setup
```bash
# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
# Install with dev dependencies
uv pip install -e .[dev]
```
### Running Tests
```bash
# Run all tests
uv run pytest
# Run with coverage reporting
uv run pytest --cov=cmdrdata_openai --cov-report=term-missing
# Run specific test file
uv run pytest tests/test_client.py -v
```
### Code Quality
```bash
# Format code
uv run black cmdrdata_openai/
# Sort imports
uv run isort cmdrdata_openai/
# Type checking
uv run mypy cmdrdata_openai/ --ignore-missing-imports
# Security check
uv run safety check
```
### CI/CD
The project uses GitHub Actions for:
- **Continuous Integration** - Tests across Python 3.9-3.13
- **Code Quality** - Black, isort, mypy, safety checks
- **Coverage Reporting** - >90% test coverage with Codecov
- **Automated Publishing** - PyPI releases on GitHub releases
## 🆘 Troubleshooting
### Common Issues
**"tracker_key is required" error:**
```python
# Make sure you provide the tracker_key
client = TrackedOpenAI(
    api_key="your-openai-key",
    tracker_key="your-cmdrdata-key"  # Don't forget this!
)
```
**No usage tracking:**
```python
# Make sure you provide customer_id or set context
set_customer_context("customer-123")
# OR
response = client.chat.completions.create(..., customer_id="customer-123")
```
**Tracking timeouts:**
```python
# Increase timeout for slow networks
client = TrackedOpenAI(
    api_key="your-openai-key",
    tracker_key="your-cmdrdata-key",
    tracker_timeout=10.0  # Increase from the default 5.0
)
```
### Get Help
- 📧 **Email**: hello@cmdrdata.ai
- 🐛 **Issues**: [GitHub Issues](https://github.com/cmdrdata-ai/cmdrdata-openai/issues)
- 📖 **Docs**: [Documentation](https://github.com/cmdrdata-ai/cmdrdata-openai#readme)
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🚨 Important Notes
- **Never commit API keys** to version control
- **Always clean up context** in web applications
- **Test with small limits** before production deployment
- **Monitor tracking errors** in your logs
## 🤝 Contributing
1. Fork the repository
2. Create a feature branch
3. Add tests for new functionality
4. Run the test suite
5. Submit a pull request
For more details, see [CONTRIBUTING.md](CONTRIBUTING.md).
Raw data
{
"_id": null,
"home_page": null,
"name": "cmdrdata-openai",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": "cmdrdata <terencenathan@gmail.com>",
"keywords": "openai, ai, customer-tracking, usage-based-billing, fine-grained-billing, tokens, metadata, sdk, wrapper",
"author": null,
"author_email": "cmdrdata <terencenathan@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/48/6d/d10c4ae699db120aa2ad3cf0238710bc151b86885930f93d90080f2307be/cmdrdata_openai-0.2.1.tar.gz",
"platform": null,
"description": "# cmdrdata-openai\n\n[](https://github.com/cmdrdata-ai/cmdrdata-openai/actions/workflows/ci.yml)\n[](https://codecov.io/gh/cmdrdata-ai/cmdrdata-openai)\n[](https://badge.fury.io/py/cmdrdata-openai)\n[](https://opensource.org/licenses/MIT)\n[](https://pypi.org/project/cmdrdata-openai/)\n[](https://pepy.tech/project/cmdrdata-openai)\n[](https://github.com/psf/black)\n\n**Customer tracking and usage-based billing for OpenAI APIs**\n\nTransform your OpenAI integration into a customer-aware, usage-based billing system. Track exactly what each customer consumes and bill them accordingly with fine-grained precision.\n\n## \ud83d\udee1\ufe0f Production Ready\n\n**Extremely robust and reliable** - Built for production environments with:\n\n- **Resilient Tracking:** OpenAI calls succeed even if tracking fails.\n- **Non-blocking I/O:** Fire-and-forget tracking never slows down your application.\n- **Automatic Retries:** Failed tracking attempts are automatically retried with exponential backoff.\n- **Thread-Safe Context:** Safely track usage across multi-threaded and async applications.\n- **Enterprise Security:** API key sanitization and input validation.\n\n## \ud83d\udcb0 Customer Tracking & Usage-Based Billing\n\n`cmdrdata-openai` enables **fine-grained customer tracking** and **usage-based billing** for your AI application:\n\n### **Customer-Level Visibility**\n- **Per-customer token consumption** - Track exactly how much each customer uses\n- **Usage attribution** - Every API call is attributed to a specific customer\n- **Customer context management** - Automatic customer tracking across your application\n\n### **Fine-Grained Billing Control**\n- **Custom pricing models** - Set your own rates beyond simple token counts\n- **Arbitrary metadata tracking** - Attach any billing-relevant data to each API call\n- **Multi-dimensional billing** - Bill based on tokens, requests, models, or custom metrics\n- **Real-time usage monitoring** - Track costs and usage as they happen\n\n### **What Gets Tracked**\n- **Token usage** (input/output tokens for accurate billing)\n- **Model information** (gpt-5, gpt-4o, gpt-4, gpt-3.5-turbo, etc.)\n- **Customer identification** (your customer IDs)\n- **Custom metadata** (request types, feature usage, geographic data, etc.)\n- **Performance metrics** (response times, error rates)\n\n## \ud83d\ude80 Quick Start\n\n### 1. Install\n\n```bash\npip install cmdrdata-openai\n```\n\n**Note**: This package wraps the official OpenAI SDK. If you already have `openai` installed, CmdrData will use your existing version. If not, it will install a compatible version automatically. [Learn more about dependency management \u2192](docs/DEPENDENCY_MANAGEMENT.md)\n\n### 2. Replace Your OpenAI Import\n\nIt's a drop-in replacement. All you need to do is change how you initialize the client and add the `customer_id` to your API calls.\n\n**Before:**\n```python\nfrom openai import OpenAI\n\n# This client is not tracked\nclient = OpenAI(api_key=\"sk-...\")\n```\n\n**After:**\n```python\nfrom cmdrdata_openai import TrackedOpenAI\n\n# This client automatically tracks usage\nclient = TrackedOpenAI(\n api_key=\"sk-...\",\n tracker_key=\"tk-...\" # Get this from your cmdrdata dashboard\n)\n\n# Add customer_id to your calls to enable tracking\nresponse = client.chat.completions.create(\n model=\"gpt-5\", # Supports GPT-5, GPT-4o, GPT-4, etc.\n messages=[{\"role\": \"user\", \"content\": \"Hello!\"}],\n customer_id=\"customer-123\"\n)\n```\n\nThat's it! 
**Every API call now automatically tracks token usage, performance, and errors.**\n\n## \ud83d\udcd6 Usage Patterns\n\n### Flask/FastAPI Integration\n\n```python\nfrom flask import Flask, request, jsonify\nfrom cmdrdata_openai import TrackedOpenAI, set_customer_context, clear_customer_context\n\napp = Flask(__name__)\nclient = TrackedOpenAI(\n api_key=\"your-openai-key\",\n tracker_key=\"your-cmdrdata-key\"\n)\n\n@app.route('/chat', methods=['POST'])\ndef chat():\n data = request.json\n customer_id = data['customer_id']\n \n # Set context for this request\n set_customer_context(customer_id)\n \n try:\n response = client.chat.completions.create(\n model=\"gpt-5\",\n messages=[{\"role\": \"user\", \"content\": data['message']}]\n )\n return jsonify({\"response\": response.choices[0].message.content})\n finally:\n clear_customer_context()\n```\n\n### Context Manager (Automatic Cleanup)\n\n```python\nfrom cmdrdata_openai import customer_context\n\nwith customer_context(\"customer-456\"):\n response1 = client.chat.completions.create(...)\n response2 = client.chat.completions.create(...)\n # Both calls tracked for customer-456\n# Context automatically cleared\n```\n\n### Async Support\n\n```python\nfrom cmdrdata_openai import AsyncTrackedOpenAI\n\nclient = AsyncTrackedOpenAI(\n api_key=\"your-openai-key\",\n tracker_key=\"your-cmdrdata-key\"\n)\n\nresponse = await client.chat.completions.create(\n model=\"gpt-5\",\n messages=[{\"role\": \"user\", \"content\": \"Hello!\"}],\n customer_id=\"customer-789\"\n)\n```\n\n### \ud83d\udc8e Fine-Grained Billing with Custom Metadata\n\nTrack arbitrary metadata with each API call to enable sophisticated billing models:\n\n```python\n# Example: SaaS application with feature-based billing\nresponse = client.chat.completions.create(\n model=\"gpt-5\",\n messages=[{\"role\": \"user\", \"content\": \"Analyze this data...\"}],\n customer_id=\"customer-123\",\n # Custom metadata for fine-grained billing\n custom_metadata={\n \"feature\": \"data_analysis\",\n \"plan_tier\": \"premium\", \n \"region\": \"us-east\",\n \"request_size\": \"large\",\n \"processing_type\": \"batch\"\n }\n)\n\n# Example: Usage-based pricing by request complexity\nresponse = client.chat.completions.create(\n model=\"gpt-5\",\n messages=long_conversation_history,\n customer_id=\"customer-456\",\n custom_metadata={\n \"request_complexity\": \"high\",\n \"conversation_length\": len(long_conversation_history),\n \"business_unit\": \"sales\",\n \"priority\": \"high\"\n }\n)\n```\n\n**Billing Use Cases:**\n- **Feature-based pricing**: Bill differently for different app features\n- **Complexity-based pricing**: Higher rates for complex requests\n- **Geographic pricing**: Different rates by customer region \n- **Plan-tier pricing**: Premium customers pay different rates\n- **Volume discounts**: Track cumulative usage for volume pricing\n- **Department billing**: Track usage by business unit or team\n\n## \ud83d\udd27 Configuration\n\n### Basic Configuration\n\n```python\nclient = TrackedOpenAI(\n api_key=\"your-openai-key\", # OpenAI API key\n tracker_key=\"your-cmdrdata-key\", # cmdrdata API key\n tracker_endpoint=\"https://api.cmdrdata.ai/api/events\", # cmdrdata endpoint\n tracker_timeout=5.0 # Timeout for tracking requests\n)\n```\n\n### Environment Variables\n\n```bash\nexport OPENAI_API_KEY=\"your-openai-key\"\nexport CMDRDATA_API_KEY=\"your-cmdrdata-key\"\n```\n\n```python\nimport os\nclient = TrackedOpenAI(\n api_key=os.getenv(\"OPENAI_API_KEY\"),\n 
tracker_key=os.getenv(\"CMDRDATA_API_KEY\")\n)\n```\n\n## \ud83c\udf9b\ufe0f Advanced Features\n\n### Disable Tracking for Specific Calls\n\n```python\nresponse = client.chat.completions.create(\n model=\"gpt-4\",\n messages=[{\"role\": \"user\", \"content\": \"Internal query\"}],\n track_usage=False # This call won't be tracked\n)\n```\n\n### Priority System\n\nCustomer ID resolution follows this priority:\n\n1. **Explicit `customer_id` parameter** (highest priority)\n2. **Customer ID from context**\n3. **No tracking** (warning logged)\n\n```python\nset_customer_context(\"context-customer\")\n\n# This will be tracked for \"explicit-customer\"\nresponse = client.chat.completions.create(\n model=\"gpt-4\",\n messages=[{\"role\": \"user\", \"content\": \"Hello\"}],\n customer_id=\"explicit-customer\" # Overrides context\n)\n```\n\n### Error Handling\n\ncmdrdata-openai is designed to never break your OpenAI calls:\n\n- **Tracking failures are logged but don't raise exceptions**\n- **OpenAI calls proceed normally even if tracking fails**\n- **Background tracking doesn't block your application**\n\n```python\n# Even if cmdrdata is down, this still works\nresponse = client.chat.completions.create(\n model=\"gpt-4\",\n messages=[{\"role\": \"user\", \"content\": \"Hello\"}],\n customer_id=\"customer-123\"\n)\n# OpenAI call succeeds, tracking failure is logged\n```\n\n## \ud83d\udcca What Gets Tracked\n\nFor each OpenAI API call, cmdrdata-openai automatically tracks:\n\n- **Customer ID** (from parameter or context)\n- **Model used** (e.g., gpt-4, gpt-3.5-turbo)\n- **Token usage** (input tokens, output tokens, total tokens)\n- **Provider** (openai)\n- **Timestamp** (when the call was made)\n- **Metadata** (response ID, finish reason, etc.)\n\nExample tracked event:\n```json\n{\n \"customer_id\": \"customer-123\",\n \"model\": \"gpt-4\",\n \"input_tokens\": 15,\n \"output_tokens\": 25,\n \"total_tokens\": 40,\n \"provider\": \"openai\",\n \"timestamp\": \"2025-07-04T10:30:00Z\",\n \"metadata\": {\n \"response_id\": \"chatcmpl-abc123\",\n \"finish_reason\": \"stop\"\n }\n}\n```\n\n## \ud83d\udd27 How It Works\n\nCmdrData-OpenAI uses a **proxy pattern** to wrap your existing OpenAI client:\n\n1. **You import CmdrData**: `from cmdrdata_openai import TrackedOpenAI`\n2. **CmdrData imports OpenAI**: Uses your installed `openai` package\n3. **Creates a wrapper**: Wraps the OpenAI client with tracking\n4. **Forwards everything**: All OpenAI methods work exactly the same\n5. 
**Tracks usage**: Intercepts responses to track token usage\n\n**This means**:\n- \u2705 No conflicts with your OpenAI version\n- \u2705 All OpenAI features continue working\n- \u2705 You can upgrade OpenAI independently\n- \u2705 Zero performance overhead (async tracking)\n\n## \ud83d\udd0c Compatibility\n\n- **OpenAI Models**: Full support for GPT-5, GPT-4o, GPT-4, GPT-3.5, DALL-E, Whisper, and all OpenAI models\n- **OpenAI SDK**: Compatible with OpenAI SDK v1.0.0+ (tested with 1.99.0+)\n- **Python**: Supports Python 3.9, 3.10, 3.11, 3.12, and 3.13\n- **Async**: Full support for both sync and async usage\n- **Frameworks**: Works with Flask, FastAPI, Django, etc.\n\n## \ud83d\udce6 Installation\n\n```bash\n# Basic installation\npip install cmdrdata-openai\n\n# For development\ngit clone https://github.com/cmdrdata-ai/cmdrdata-openai.git\ncd cmdrdata-openai\nuv pip install -e .[dev]\n```\n\n## \ud83d\udee0\ufe0f Development\n\n### Setup\n\n```bash\n# Install uv (if not already installed)\ncurl -LsSf https://astral.sh/uv/install.sh | sh\n\n# Install with dev dependencies\nuv pip install -e .[dev]\n```\n\n### Running Tests\n\n```bash\n# Run all tests \nuv run pytest\n\n# Run with coverage reporting\nuv run pytest --cov=cmdrdata_openai --cov-report=term-missing\n\n# Run specific test file\nuv run pytest tests/test_client.py -v\n```\n\n### Code Quality\n\n```bash\n# Format code\nuv run black cmdrdata_openai/\n\n# Sort imports\nuv run isort cmdrdata_openai/\n\n# Type checking\nuv run mypy cmdrdata_openai/ --ignore-missing-imports\n\n# Security check\nuv run safety check\n```\n\n### CI/CD\n\nThe project uses GitHub Actions for:\n\n- **Continuous Integration** - Tests across Python 3.9-3.13\n- **Code Quality** - Black, isort, mypy, safety checks \n- **Coverage Reporting** - >90% test coverage with Codecov\n- **Automated Publishing** - PyPI releases on GitHub releases\n\n## \ud83c\udd98 Troubleshooting\n\n### Common Issues\n\n**\"tracker_key is required\" error:**\n```python\n# Make sure you provide the tracker_key\nclient = TrackedOpenAI(\n api_key=\"your-openai-key\",\n tracker_key=\"your-cmdrdata-key\" # Don't forget this!\n)\n```\n\n**No usage tracking:**\n```python\n# Make sure you provide customer_id or set context\nset_customer_context(\"customer-123\")\n# OR\nresponse = client.chat.completions.create(..., customer_id=\"customer-123\")\n```\n\n**Tracking timeouts:**\n```python\n# Increase timeout for slow networks\nclient = TrackedOpenAI(\n api_key=\"your-openai-key\",\n tracker_key=\"your-cmdrdata-key\",\n tracker_timeout=10.0 # Increase from default 5.0\n)\n```\n\n### Get Help\n\n- \ud83d\udce7 **Email**: hello@cmdrdata.ai\n- \ud83d\udc1b **Issues**: [GitHub Issues](https://github.com/cmdrdata-ai/cmdrdata-openai/issues)\n- \ud83d\udcd6 **Docs**: [Documentation](https://github.com/cmdrdata-ai/cmdrdata-openai#readme)\n\n## \ud83d\udcc4 License\n\nThis project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.\n\n## \ud83d\udea8 Important Notes\n\n- **Never commit API keys** to version control\n- **Always clean up context** in web applications\n- **Test with small limits** before production deployment\n- **Monitor tracking errors** in your logs\n\n## \ud83e\udd1d Contributing\n\n1. Fork the repository\n2. Create a feature branch\n3. Add tests for new functionality\n4. Run the test suite\n5. Submit a pull request\n\nFor more details, see [CONTRIBUTING.md](CONTRIBUTING.md).\n",
"bugtrack_url": null,
"license": "MIT License\n \n Copyright (c) 2025 cmdrdata-ai\n \n Permission is hereby granted, free of charge, to any person obtaining a copy\n of this software and associated documentation files (the \"Software\"), to deal\n in the Software without restriction, including without limitation the rights\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n copies of the Software, and to permit persons to whom the Software is\n furnished to do so, subject to the following conditions:\n \n The above copyright notice and this permission notice shall be included in all\n copies or substantial portions of the Software.\n \n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n SOFTWARE.\n ",
"summary": "Customer tracking and usage-based billing for OpenAI APIs with arbitrary metadata support",
"version": "0.2.1",
"project_urls": {
"Changelog": "https://github.com/cmdrdata-ai/cmdrdata-openai/releases",
"Documentation": "https://github.com/cmdrdata-ai/cmdrdata-openai#readme",
"Homepage": "https://www.cmdrdata.ai",
"Issues": "https://github.com/cmdrdata-ai/cmdrdata-openai/issues",
"Repository": "https://github.com/cmdrdata-ai/cmdrdata-openai"
},
"split_keywords": [
"openai",
" ai",
" customer-tracking",
" usage-based-billing",
" fine-grained-billing",
" tokens",
" metadata",
" sdk",
" wrapper"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "f5dca3c32253b289a0fbe223a760832164b1d2b1f9b4ba85af6703ef97fdb09b",
"md5": "a051c2d12d89b2162230836297d7bb7f",
"sha256": "96d83cee25341ea81f38893d26255814b871a77b33b01b1e24269f947a09ca80"
},
"downloads": -1,
"filename": "cmdrdata_openai-0.2.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "a051c2d12d89b2162230836297d7bb7f",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 42211,
"upload_time": "2025-08-08T07:27:36",
"upload_time_iso_8601": "2025-08-08T07:27:36.437651Z",
"url": "https://files.pythonhosted.org/packages/f5/dc/a3c32253b289a0fbe223a760832164b1d2b1f9b4ba85af6703ef97fdb09b/cmdrdata_openai-0.2.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "486dd10c4ae699db120aa2ad3cf0238710bc151b86885930f93d90080f2307be",
"md5": "16d3c4e1ef818a22131af74ac4846dee",
"sha256": "f6c9793819ca2495e3a2071024c54fb0833ada21045f93aee9fea7f5a61b2ce7"
},
"downloads": -1,
"filename": "cmdrdata_openai-0.2.1.tar.gz",
"has_sig": false,
"md5_digest": "16d3c4e1ef818a22131af74ac4846dee",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 71566,
"upload_time": "2025-08-08T07:27:37",
"upload_time_iso_8601": "2025-08-08T07:27:37.452057Z",
"url": "https://files.pythonhosted.org/packages/48/6d/d10c4ae699db120aa2ad3cf0238710bc151b86885930f93d90080f2307be/cmdrdata_openai-0.2.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-08 07:27:37",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "cmdrdata-ai",
"github_project": "cmdrdata-openai",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "cmdrdata-openai"
}