# LLM Warehouse
🏠 **Auto-capture OpenAI and Anthropic LLM calls for warehousing**
A lightweight Python library that automatically logs all your OpenAI and Anthropic API calls to various storage backends, including your own Flask app, Supabase, or local files.
## 🚀 Quick Start
### Installation
```bash
pip install llm-warehouse
```
Or for the latest development version:
```bash
pip install git+https://github.com/sinanozdemir/llm-warehouse.git
```
### Basic Usage
For automatic patching on import, set environment variables:
```bash
export LLM_WAREHOUSE_API_KEY="your-warehouse-api-key"
export LLM_WAREHOUSE_URL="https://your-warehouse.com"
```
Then just import any LLM library AFTER importing this package - logging happens automatically:
```python
import llm_warehouse # BEFORE openai or anthropic
import openai # Automatically patched!
# or
import anthropic # Automatically patched!
```
Sample usage:
```python
import llm_warehouse
# Now use OpenAI/Anthropic normally - all calls are automatically logged!
import openai
client = openai.Client()
response = client.chat.completions.create(
model="gpt-4.1-mini",
messages=[{"role": "user", "content": "Hello!"}]
)
```
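Anthropic follows the same pattern. A minimal sketch using the standard `anthropic` SDK (the model name here is only an example):

```python
import llm_warehouse  # import first so the SDK gets patched

import anthropic

client = anthropic.Anthropic()
response = client.messages.create(
    model="claude-3-5-haiku-latest",  # example model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello!"}],
)
```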
## 📊 What Gets Logged
- **Request data**: Model, messages, parameters
- **Response data**: Completions, token usage, timing
- **Metadata**: Timestamps, SDK method, streaming info
- **Errors**: API errors and exceptions
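For a concrete picture, one captured record might look roughly like this. The field names below are illustrative guesses, not the library's exact schema:

```python
# Hypothetical shape of a single logged record (illustrative field names)
record = {
    "timestamp": "2025-09-18T16:20:44Z",      # metadata
    "sdk_method": "chat.completions.create",  # which SDK call was intercepted
    "stream": False,                          # streaming info
    "request": {                              # model, messages, parameters
        "model": "gpt-4.1-mini",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    "response": {                             # completion, token usage, timing
        "content": "Hi there!",
        "usage": {"prompt_tokens": 9, "completion_tokens": 10},
        "latency_ms": 412,
    },
    "error": None,                            # populated on API errors/exceptions
}
```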
## 🔧 Configuration Options

### 🛡️ Environment Variables
| Variable | Description |
|----------|-------------|
| `LLM_WAREHOUSE_API_KEY` | Your warehouse API token (enables auto-patching) |
| `LLM_WAREHOUSE_URL` | Your warehouse URL |
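If you prefer to configure this from Python (e.g. in a notebook), setting the variables before the import should have the same effect, since auto-patching is triggered at import time:

```python
import os

# Must be set before importing llm_warehouse, which reads them on import
os.environ["LLM_WAREHOUSE_API_KEY"] = "your-warehouse-api-key"
os.environ["LLM_WAREHOUSE_URL"] = "https://your-warehouse.com"

import llm_warehouse  # auto-patches because the API key is present
import openai         # patched on import
```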
### 🔄 Programmatic Control (for advanced users)
```python
import llm_warehouse
# Enable logging
llm_warehouse.patch(warehouse_url="...", api_key="...")
# Disable logging
llm_warehouse.unpatch()
# Check status
if llm_warehouse.is_patched():
print("LLM calls are being logged")
```
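Because `unpatch()` restores the original SDK methods, logging can be scoped to a specific block. A sketch using only the calls shown above:

```python
import llm_warehouse
import openai

llm_warehouse.patch(log_file="debug_calls.jsonl")  # log just this section locally

client = openai.Client()
client.chat.completions.create(
    model="gpt-4.1-mini",
    messages=[{"role": "user", "content": "This call is captured."}],
)

llm_warehouse.unpatch()  # later calls are no longer logged
assert not llm_warehouse.is_patched()
```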
## 🏗️ Backend Options
### API Warehouse Backend
Use with the included API (recommended):
```python
llm_warehouse.patch(
    warehouse_url="https://your-warehouse.com",
    api_key="your-warehouse-api-key"
)
```
### Supabase Backend
Direct integration with Supabase:
```python
llm_warehouse.patch(
    supabase_url="https://your-project.supabase.co",
    supabase_key="your-supabase-anon-key"
)
```
### Local File Backend
For development and testing:
```python
llm_warehouse.patch(log_file="llm_calls.jsonl")
```
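The `.jsonl` extension suggests one JSON object per line, so inspecting the log needs nothing beyond the standard library. A quick sketch (assuming the one-record-per-line format):

```python
import json

with open("llm_calls.jsonl") as f:
    for line in f:
        record = json.loads(line)
        # Print whichever top-level keys each record actually carries
        print(sorted(record.keys()))
```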
## 📦 Features

- ✅ **Zero-configuration**: Works out of the box with environment variables
- ✅ **Multiple backends**: Flask warehouse, Supabase, local files
- ✅ **Async support**: Full async/await compatibility (see the sketch below)
- ✅ **Streaming support**: Captures streaming responses
- ✅ **Error handling**: Logs API errors and exceptions
- ✅ **Minimal overhead**: Designed for production use
- ✅ **Thread-safe**: Works in multi-threaded applications
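Given the async support above, the async client should work unchanged as long as `llm_warehouse` is imported first. A minimal sketch:

```python
import asyncio

import llm_warehouse  # import before the SDK so patching applies

import openai


async def main() -> None:
    client = openai.AsyncOpenAI()
    response = await client.chat.completions.create(
        model="gpt-4.1-mini",
        messages=[{"role": "user", "content": "Hello, async!"}],
    )
    print(response.choices[0].message.content)


asyncio.run(main())
```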
## 🧪 Development
```bash
git clone https://github.com/sinanozdemir/llm-warehouse.git
cd llm-warehouse/llm-warehouse-package
pip install -e ".[dev]"
```
Run tests:
```bash
pytest
```
Format code:
```bash
black llm_warehouse/
isort llm_warehouse/
```
## 📝 License
MIT License - see [LICENSE](LICENSE) file for details.