<h1 align="center" style="border-bottom: none">
    <div>
        <a href="https://phoenix.arize.com/?utm_medium=github&utm_content=header_img&utm_campaign=phoenix-client">
            <picture>
                <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/Arize-ai/phoenix-assets/refs/heads/main/logos/Phoenix/phoenix.svg">
                <source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/Arize-ai/phoenix-assets/refs/heads/main/logos/Phoenix/phoenix-white.svg">
                <img alt="Arize Phoenix logo" src="https://raw.githubusercontent.com/Arize-ai/phoenix-assets/refs/heads/main/logos/Phoenix/phoenix.svg" width="100" />
            </picture>
        </a>
        <br>
        arize-phoenix-otel
    </div>
</h1>
<p align="center">
    <a href="https://pypi.org/project/arize-phoenix-otel/">
        <img src="https://img.shields.io/pypi/v/arize-phoenix-otel" alt="PyPI Version">
    </a>
    <a href="https://arize-phoenix.readthedocs.io/projects/otel/en/latest/index.html">
        <img src="https://img.shields.io/badge/docs-blue?logo=readthedocs&logoColor=white" alt="Documentation">
    </a>
</p>
`arize-phoenix-otel` provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults, plus tracing decorators for common GenAI patterns.
## Features
`arize-phoenix-otel` simplifies OpenTelemetry configuration for Phoenix users by providing:
- **Phoenix-aware defaults** for common OpenTelemetry primitives
- **Automatic configuration** from environment variables
- **Drop-in replacements** for OTel classes with enhanced functionality
- **Simplified tracing setup** with the `register()` function
- **Tracing decorators** for GenAI patterns
## Key Benefits
- **Zero Code Changes**: Enable `auto_instrument=True` to automatically instrument AI libraries
- **Production Ready**: Built-in batching and authentication
- **Phoenix Integration**: Seamless integration with Phoenix Cloud and self-hosted instances
- **OpenTelemetry Compatible**: Works with existing OpenTelemetry infrastructure
These defaults are aware of environment variables you may have set to configure Phoenix:
- `PHOENIX_COLLECTOR_ENDPOINT`
- `PHOENIX_PROJECT_NAME`
- `PHOENIX_CLIENT_HEADERS`
- `PHOENIX_API_KEY`
- `PHOENIX_GRPC_PORT`
## Installation
Install via `pip`:
```shell
pip install arize-phoenix-otel
```
## Quick Start
**Recommended**: Enable automatic instrumentation to trace your AI libraries with zero code changes:
```python
from phoenix.otel import register
# Recommended: Automatic instrumentation + production settings
tracer_provider = register(
    auto_instrument=True,  # Auto-trace OpenAI, LangChain, LlamaIndex, etc.
    batch=True,           # Production-ready batching
    project_name="my-app" # Organize your traces
)
```
That's it! Every AI library with a matching `openinference-*` instrumentation installed is now automatically traced, and its spans are sent to Phoenix.
**Note**: `auto_instrument=True` only works if the corresponding OpenInference instrumentation libraries are installed. For example, to automatically trace OpenAI calls, you need `openinference-instrumentation-openai` installed:
```bash
pip install openinference-instrumentation-openai
pip install openinference-instrumentation-langchain  # For LangChain
pip install openinference-instrumentation-llama-index  # For LlamaIndex
```
See the [OpenInference repository](https://github.com/Arize-ai/openinference) for the complete list of available instrumentation packages.
### Authentication
Connecting to an authenticated Phoenix instance requires an API key, which you can set as an environment variable:
```bash
export PHOENIX_API_KEY="your-api-key"
```
```python
# Or pass directly to register()
tracer_provider = register(api_key="your-api-key")
```
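If your deployment expects custom headers rather than an API key, the `PHOENIX_CLIENT_HEADERS` variable (covered in the table below) can carry them. A sketch — the header name and token here are illustrative; use whatever your collector expects:

```shell
# Alternative to PHOENIX_API_KEY: send arbitrary headers with each export.
# Format is comma-separated key=value pairs.
export PHOENIX_CLIENT_HEADERS="Authorization=Bearer your-token"
```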
### Endpoint Configuration
Configure where to send your traces:
**Environment Variables** (Recommended):
```bash
export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/s/your-space"
export PHOENIX_PROJECT_NAME="my-project"
```
**Direct Configuration**:
```python
tracer_provider = register(
    endpoint="http://localhost:6006/v1/traces",  # HTTP collector endpoint
    protocol="http/protobuf"  # Optional; "http/protobuf" or "grpc"
)
```
## Usage Examples
### Simple Setup
```python
from phoenix.otel import register
# Basic setup - sends to localhost
tracer_provider = register(auto_instrument=True)
```
### Production Configuration
```python
tracer_provider = register(
    project_name="my-production-app",
    auto_instrument=True,      # Auto-trace AI/ML libraries
    batch=True,               # Background batching for performance
    api_key="your-api-key",   # Authentication
    endpoint="https://app.phoenix.arize.com/s/your-space"
)
```
### Manual Configuration
For advanced use cases, use Phoenix OTEL components directly:
```python
from phoenix.otel import TracerProvider, BatchSpanProcessor, HTTPSpanExporter
tracer_provider = TracerProvider()
exporter = HTTPSpanExporter(endpoint="http://localhost:6006/v1/traces")
processor = BatchSpanProcessor(span_exporter=exporter)
tracer_provider.add_span_processor(processor)
```
### Using Decorators
```python
from phoenix.otel import register
tracer_provider = register()
# Get a tracer for manual instrumentation
tracer = tracer_provider.get_tracer(__name__)
@tracer.chain
def process_data(data):
    return data + " processed"
@tracer.tool
def weather(location):
    return "sunny"
```
## Environment Variables
| Variable                     | Description          | Example                                      |
| ---------------------------- | -------------------- | -------------------------------------------- |
| `PHOENIX_COLLECTOR_ENDPOINT` | Where to send traces | `https://app.phoenix.arize.com/s/your-space` |
| `PHOENIX_PROJECT_NAME`       | Project name         | `my-llm-app`                                 |
| `PHOENIX_API_KEY`            | Authentication key   | `your-api-key`                               |
| `PHOENIX_CLIENT_HEADERS`     | Custom headers       | `Authorization=Bearer token`                 |
| `PHOENIX_GRPC_PORT`          | gRPC port override   | `4317`                                       |
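With these variables exported, `register()` needs no arguments at all. A sketch for a self-hosted instance (`my_app.py` is a hypothetical script that calls `register()` with no arguments):

```shell
export PHOENIX_COLLECTOR_ENDPOINT="http://localhost:6006"
export PHOENIX_PROJECT_NAME="my-llm-app"
python my_app.py
```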
## Documentation
- **[Full Documentation](https://arize-phoenix.readthedocs.io/projects/otel/en/latest/index.html)** - Complete API reference and guides
- **[Phoenix Docs](https://arize.com/docs/phoenix)** - Detailed tracing examples and patterns
- **[OpenInference](https://github.com/Arize-ai/openinference)** - Auto-instrumentation libraries for frameworks
## Community
Join our community to connect with thousands of AI builders:
- 🌍 Join our [Slack community](https://arize-ai.slack.com/join/shared_invite/zt-11t1vbu4x-xkBIHmOREQnYnYDH1GDfCg).
- 💡 Ask questions and provide feedback in the _#phoenix-support_ channel.
- 🌟 Leave a star on our [GitHub](https://github.com/Arize-ai/phoenix).
- 🐞 Report bugs with [GitHub Issues](https://github.com/Arize-ai/phoenix/issues).
- 𝕏 Follow us on [𝕏](https://twitter.com/ArizePhoenix).
- 🗺️ Check out our [roadmap](https://github.com/orgs/Arize-ai/projects/45) to see where we're heading next.