# TCC OpenTelemetry SDK for Python
OpenTelemetry instrumentation for Python AI frameworks that sends traces to the [The Context Company](https://www.thecontext.company) platform.
## Features
- Zero-config setup - Just one function call to start tracing
- Framework-specific instrumentations - Currently supports LangChain, with more coming soon
- Automatic capture - LLM calls, tool executions, and workflow traces
- Custom metadata - Tag traces with your own business context (user IDs, service names, environments, etc.)
- Secure - API key-based authentication
- Production-ready - Built on OpenTelemetry standards
## Installation
```bash
# Install base package
pip install tcc-otel
# Install with LangChain support
pip install "tcc-otel[langchain]"
```
## Quick Start
### LangChain
```python
import os
from dotenv import load_dotenv
# Load environment variables
load_dotenv()
# Initialize TCC instrumentation BEFORE importing LangChain
from tcc_otel import instrument_langchain
instrument_langchain(
    api_key=os.getenv("TCC_API_KEY"),
)
# Now import and use LangChain - all operations will be automatically traced
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
# Your code here...
```
## Configuration
### Environment Variables
```bash
TCC_API_KEY=your_api_key_here
```
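If `TCC_API_KEY` is unset, `os.getenv` silently returns `None`, so it can help to fail fast at startup. A minimal sketch (`require_api_key` is a hypothetical helper, not part of the SDK):

```python
import os

def require_api_key(env_var: str = "TCC_API_KEY") -> str:
    """Return the API key from the environment, failing fast if it is missing."""
    key = os.getenv(env_var)
    if not key:
        raise RuntimeError(
            f"{env_var} is not set - export it or add it to your .env file"
        )
    return key
```

Call it once at startup and pass the result to `instrument_langchain(api_key=...)`.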
### Parameters
- `api_key` (required): Your TCC API key
- `trace_content` (optional): Whether to capture prompts and completions (default: True)
## Adding Custom Metadata
Custom metadata lets you tag your traces with your own business context, such as:
- Service names (e.g., `"customer-chatbot"`, `"api-backend"`)
- User IDs (e.g., `"user-123"`)
- Environments (e.g., `"production"`, `"staging"`)
- Feature flags, tenant IDs, or any other custom dimensions
Custom metadata is added using **LangChain's RunnableConfig** by passing a `metadata` dict as the second argument to `invoke()`:
```python
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
# Create your agent
model = ChatOpenAI(model="gpt-4")
agent = create_react_agent(model, tools=[])
# Add custom metadata via RunnableConfig
result = agent.invoke(
    {"messages": [("user", "Hello!")]},
    {
        "metadata": {
            "serviceName": "customer-chatbot",
            "userId": "user_123",
            "environment": "production"
        }
    }
)
```
All metadata passed via RunnableConfig will be automatically extracted and stored in the TCC platform, allowing you to filter and analyze traces by your custom dimensions.
### LangGraph Example with Custom Metadata
```python
import os
from tcc_otel import instrument_langchain

# Initialize instrumentation BEFORE importing LangChain (api_key is required)
instrument_langchain(api_key=os.getenv("TCC_API_KEY"))

from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
from langchain.tools import tool
# Define a simple tool
@tool
def get_weather(location: str) -> str:
    """Get weather for a location."""
    return f"The weather in {location} is sunny"
# Create agent
model = ChatOpenAI(model="gpt-4")
agent = create_react_agent(model, tools=[get_weather])
# Run agent with custom metadata via RunnableConfig
result = agent.invoke(
    {"messages": [("user", "What's the weather in NYC?")]},
    {
        "metadata": {
            "serviceName": "support-agent",
            "userId": "user_456",
            "tier": "premium"
        }
    }
)
```
## Requirements
Supports Python 3.9+
### Dependencies
- `opentelemetry-api>=1.29.0`
- `opentelemetry-sdk>=1.29.0`
- `opentelemetry-exporter-otlp>=1.29.0`
### LangChain Support
- `opentelemetry-instrumentation-langchain>=0.47.3`
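When debugging dependency issues, it can help to check installed versions against the minimums above. A sketch using the standard library (`meets_minimum` is a hypothetical helper, not shipped with the SDK; the naive numeric comparison assumes plain `X.Y.Z` version strings):

```python
from importlib import metadata

def meets_minimum(package: str, minimum: str) -> bool:
    """Return True if `package` is installed at a version >= `minimum`."""
    try:
        installed = metadata.version(package)
    except metadata.PackageNotFoundError:
        return False

    def parse(version: str) -> tuple:
        # Naive parse: compare the first three numeric components only
        return tuple(int(part) for part in version.split(".")[:3])

    return parse(installed) >= parse(minimum)
```

For example, `meets_minimum("opentelemetry-api", "1.29.0")` should be `True` in a working environment.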
## Troubleshooting
### Traces not appearing in TCC dashboard
1. **Check API key**: Ensure `TCC_API_KEY` is set correctly
2. **Instrumentation order**: Call `instrument_langchain()` BEFORE importing LangChain
3. **Network**: Ensure your application has internet connectivity
### Import errors
Make sure you've installed the framework-specific extras:
```bash
pip install "tcc-otel[langchain]"
```
### Custom metadata not showing up
- Ensure you're passing metadata via RunnableConfig as the second argument to `invoke()`
- Format: `agent.invoke(input, {"metadata": {"key": "value"}})`
- Metadata is stored in the `traceloop.entity.input` JSON structure
- Check the TCC dashboard's run details to verify metadata appears in the `run_metadata` table
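Since metadata is stored as JSON, non-serializable values (sets, custom objects) are a likely culprit when metadata fails to show up. One quick local check is to validate the dict before invoking; a sketch (`check_metadata` is a hypothetical helper, not part of the SDK):

```python
import json

def check_metadata(metadata: dict) -> dict:
    """Raise ValueError if the metadata dict is not JSON-serializable."""
    try:
        json.dumps(metadata)
    except (TypeError, ValueError) as exc:
        raise ValueError(f"metadata is not JSON-serializable: {exc}") from exc
    return metadata
```

Usage: `agent.invoke(input, {"metadata": check_metadata({"userId": "user_123"})})`.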
## License
MIT License - see [LICENSE](LICENSE) for details.
## Resources
- Documentation: https://docs.thecontext.company
- Website: https://www.thecontext.company
- Contact: founders@thecontext.company