ntt-ai-observability-exporter

Name: ntt-ai-observability-exporter
Version: 0.1.4
Summary: NTT AI Observability Exporter for Azure Monitor OpenTelemetry in AI Foundry projects
Upload time: 2025-08-19 16:22:56
Home page: None
Maintainer: None
Docs URL: None
Author: None
Requires Python: >=3.8
License: MIT
Keywords: ntt, azure, telemetry, opentelemetry, monitoring, ai, observability
Requirements: none recorded
# NTT AI Observability Exporter

A specialized telemetry exporter for NTT AI Foundry projects using Azure Monitor OpenTelemetry. This package simplifies telemetry setup for AI applications built with Azure services.

## Features

- Automatic instrumentation of Azure SDK libraries
- Simplified configuration for Azure Monitor OpenTelemetry
- Specialized support for Semantic Kernel telemetry
- Compatible with Azure AI Foundry projects

## Installation

```bash
# Using pip
pip install ntt-ai-observability-exporter

# Using uv
uv pip install ntt-ai-observability-exporter
```

## Dependencies

This package depends on:

- azure-monitor-opentelemetry (>=1.0.0)
- opentelemetry-sdk (>=1.15.0)

Both dependencies are installed automatically when you install ntt-ai-observability-exporter with pip or uv.

## Usage

### Simple Usage - One Line Setup

```python
from ntt_ai_observability_exporter import configure_telemetry

# That's it! This single line configures all telemetry
configure_telemetry()

# Now you can use your AI components normally - telemetry is automatic
```

### Configuration Options

```python
# Explicit configuration
configure_telemetry(
    connection_string="InstrumentationKey=your-key;IngestionEndpoint=your-endpoint",
    customer_name="your-customer",
    agent_name="your-agent"
)

```

## What Gets Instrumented Automatically

The Azure Monitor OpenTelemetry package automatically instruments:

- **Azure SDK libraries** (including azure.ai.openai)
- **HTTP client libraries** (requests, aiohttp)

This means when you use Azure AI Foundry components, telemetry is captured without any additional code.
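
For example, once telemetry is configured, an ordinary outbound HTTP call should show up as a dependency in Azure Monitor. A minimal sketch (the connection string and URL are placeholders):

```python
import requests

from ntt_ai_observability_exporter import configure_telemetry

# Configure telemetry once at startup (placeholder connection string).
configure_telemetry(
    connection_string="InstrumentationKey=your-key;IngestionEndpoint=your-endpoint",
    customer_name="your-customer",
    agent_name="your-agent",
)

# Assuming the requests instrumentation is active, this outbound call
# is recorded as a dependency span without any extra code.
response = requests.get("https://example.com/health")
print(response.status_code)
```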

## Configuration Parameters

- `connection_string`: Azure Monitor connection string
- `customer_name`: Maps to `service.name` in OpenTelemetry resource
- `agent_name`: Maps to `service.instance.id` in OpenTelemetry resource
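
In OpenTelemetry terms this is the standard resource mapping: Azure Monitor reports `service.name` as the cloud role name and `service.instance.id` as the cloud role instance. The sketch below shows the equivalent setup done directly against `azure-monitor-opentelemetry`; it is an assumption about what `configure_telemetry` does internally, not a description of its actual implementation:

```python
import os

from azure.monitor.opentelemetry import configure_azure_monitor

# customer_name -> service.name, agent_name -> service.instance.id.
# The OpenTelemetry SDK reads these standard environment variables
# when it builds the resource.
os.environ["OTEL_SERVICE_NAME"] = "your-customer"
os.environ["OTEL_RESOURCE_ATTRIBUTES"] = "service.instance.id=your-agent"

configure_azure_monitor(
    connection_string="InstrumentationKey=your-key;IngestionEndpoint=your-endpoint",
)
```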

## Environment Variables

You can set these environment variables:

- `AZURE_MONITOR_CONNECTION_STRING`: The connection string for Azure Monitor
- `CUSTOMER_NAME`: Maps to `service.name` in OpenTelemetry resource
- `AGENT_NAME`: Maps to `service.instance.id` in OpenTelemetry resource
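
With those variables set, the zero-argument call from the quick start above can be used. A small sketch, assuming `configure_telemetry()` falls back to these environment variables when no arguments are passed (the values are placeholders):

```python
import os

from ntt_ai_observability_exporter import configure_telemetry

# Placeholder values; in a real deployment these would be set by the
# environment rather than in code.
os.environ["AZURE_MONITOR_CONNECTION_STRING"] = "InstrumentationKey=your-key;IngestionEndpoint=your-endpoint"
os.environ["CUSTOMER_NAME"] = "your-customer"
os.environ["AGENT_NAME"] = "your-agent"

configure_telemetry()
```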



## Telemetry Types Captured

The configuration captures:

- **Traces**: Request flows and operations
- **Metrics**: Performance measurements 
- **Logs**: When integrated with Python logging
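
For the log signal, plain Python `logging` is enough once telemetry is configured. A short sketch, assuming the underlying Azure Monitor distro attaches its logging handler during `configure_telemetry()`:

```python
import logging

from ntt_ai_observability_exporter import configure_telemetry

configure_telemetry()  # settings taken from environment variables

# Ordinary logging calls; assuming the configured handler is attached,
# these records are exported to Azure Monitor alongside traces and metrics.
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
logger.info("AI agent started")
logger.warning("Model latency above threshold: %s ms", 850)
```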

## Example in Azure AI Foundry Project

```python
# Import the NTT AI Observability Exporter
from ntt_ai_observability_exporter import configure_telemetry

# Configure telemetry with your project details
configure_telemetry(
    connection_string="InstrumentationKey=xxx;IngestionEndpoint=https://westeurope-5.in.applicationinsights.azure.com/",
    customer_name="customer-name-foundry",
    agent_name="ai-foundry-agent"
)

# Now use Azure AI components as normal - telemetry is automatic
from azure.ai.assistant import AssistantClient

client = AssistantClient(...)
# All operations are automatically instrumented
```


## Semantic Kernel Telemetry Support

For applications using Semantic Kernel, use the specialized configuration function:

```python
from ntt_ai_observability_exporter import configure_semantic_kernel_telemetry

# Configure Semantic Kernel telemetry BEFORE creating any Kernel instances
configure_semantic_kernel_telemetry(
    connection_string="your_connection_string",
    customer_name="your_customer_name",
    agent_name="your_agent_name"
)

# Then create and use your Semantic Kernel
from semantic_kernel import Kernel
kernel = Kernel()
# ... rest of your code
```
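
A slightly fuller sketch of the same ordering, with an Azure OpenAI chat service added so the kernel has something to emit telemetry for. The Semantic Kernel calls shown here (`AzureChatCompletion`, `kernel.add_service`) reflect the 1.x Python package and are an illustration, not something this exporter requires; all credentials are placeholders:

```python
from ntt_ai_observability_exporter import configure_semantic_kernel_telemetry

# Telemetry first, before any Kernel instances exist.
configure_semantic_kernel_telemetry(
    connection_string="your_connection_string",
    customer_name="your_customer_name",
    agent_name="your_agent_name",
)

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

kernel = Kernel()
kernel.add_service(
    AzureChatCompletion(
        deployment_name="your-deployment",  # placeholder Azure OpenAI values
        endpoint="https://your-resource.openai.azure.com/",
        api_key="your-api-key",
    )
)
# Subsequent kernel invocations are traced through the configured exporter.
```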

            

Raw data

{
    "_id": null,
    "home_page": null,
    "name": "ntt-ai-observability-exporter",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "ntt, azure, telemetry, opentelemetry, monitoring, ai, observability",
    "author": null,
    "author_email": "Anand Vaibhav Singh <anandvaibhav-singh_nttltd@example.com>",
    "download_url": "https://files.pythonhosted.org/packages/1c/6b/8ce2475bd5a49b1118ed0d4db02b5f31c2334507ff93b350a95bf0e4ecac/ntt_ai_observability_exporter-0.1.4.tar.gz",
    "platform": null,
    "description": "# NTT AI Observability Exporter\r\n\r\nA specialized telemetry exporter for NTT AI Foundry projects using Azure Monitor OpenTelemetry. This package simplifies telemetry setup for AI applications built with Azure services.\r\n\r\n## Features\r\n\r\n- Automatic instrumentation of Azure SDK libraries\r\n- Simplified configuration for Azure Monitor OpenTelemetry\r\n- Specialized support for Semantic Kernel telemetry\r\n- Compatible with Azure AI Foundry projects\r\n\r\n## Installation\r\n\r\n```bash\r\n# Using pip\r\npip install ntt-ai-observability-exporter\r\n\r\n# Using uv\r\nuv pip install ntt-ai-observability-exporter\r\n```\r\n## Updating Your Package Documentation\r\n\r\nMake sure to add a note in your package documentation (such as README.md) about the dependencies:\r\n\r\n```markdown\r\n## Dependencies\r\n\r\nThis package depends on:\r\n- azure-monitor-opentelemetry (>=1.0.0)\r\n- opentelemetry-sdk (>=1.15.0)\r\n\r\nThese dependencies will be automatically installed when you install the package via pip.\r\n\r\n```bash\r\n# Using pip\r\npip install ntt-ai-observability-exporter\r\n\r\n# Using uv\r\nuv pip install ntt-ai-observability-exporter\r\n```\r\n\r\n## Usage\r\n\r\n### Simple Usage - One Line Setup\r\n\r\n```python\r\nfrom ntt_ai_observability_exporter import configure_telemetry\r\n\r\n# That's it! This single line configures all telemetry\r\nconfigure_telemetry()\r\n\r\n# Now you can use your AI components normally - telemetry is automatic\r\n```\r\n\r\n### Configuration Options\r\n\r\n```python\r\n# Explicit configuration\r\nconfigure_telemetry(\r\n    connection_string=\"InstrumentationKey=your-key;IngestionEndpoint=your-endpoint\",\r\n    customer_name=\"your-customer\",\r\n    agent_name=\"your-agent\"\r\n)\r\n\r\n```\r\n\r\n## What Gets Instrumented Automatically\r\n\r\nThe Azure Monitor OpenTelemetry package automatically instruments:\r\n\r\n- **Azure SDK libraries** (including azure.ai.openai)\r\n- **HTTP client libraries** (requests, aiohttp)\r\n\r\nThis means when you use Azure AI Foundry components, telemetry is captured without any additional code.\r\n\r\n## Configuration Parameters\r\n\r\n- `connection_string`: Azure Monitor connection string\r\n- `customer_name`: Maps to `service.name` in OpenTelemetry resource\r\n- `agent_name`: Maps to `service.instance.id` in OpenTelemetry resource\r\n\r\n## Environment Variables\r\n\r\nYou can set these environment variables:\r\n\r\n- `AZURE_MONITOR_CONNECTION_STRING`: The connection string for Azure Monitor\r\n- `CUSTOMER_NAME`: Maps to `service.name` in OpenTelemetry resource\r\n- `AGENT_NAME`: Maps to `service.instance.id` in OpenTelemetry resource\r\n\r\n\r\n\r\n## Telemetry Types Captured\r\n\r\nThe configuration captures:\r\n\r\n- **Traces**: Request flows and operations\r\n- **Metrics**: Performance measurements \r\n- **Logs**: When integrated with Python logging\r\n\r\n## Example in Azure AI Foundry Project\r\n\r\n```python\r\n# Import the NTT AI Observability Exporter\r\nfrom ntt_ai_observability_exporter import configure_telemetry\r\n\r\n# Configure telemetry with your project details\r\nconfigure_telemetry(\r\n    connection_string=\"InstrumentationKey=xxx;IngestionEndpoint=https://westeurope-5.in.applicationinsights.azure.com/\",\r\n    customer_name=\"customer-name-foundry\",\r\n    agent_name=\"ai-foundry-agent\"\r\n)\r\n\r\n# Now use Azure AI components as normal - telemetry is automatic\r\nfrom azure.ai.assistant import AssistantClient\r\n\r\nclient = AssistantClient(...)\r\n# All operations 
are automatically instrumented\r\n```\r\n\r\n\r\n## Semantic Kernel Telemetry Support\r\n\r\nFor applications using Semantic Kernel, use the specialized configuration function:\r\n\r\n```python\r\nfrom ntt_ai_observability_exporter import configure_semantic_kernel_telemetry\r\n\r\n# Configure Semantic Kernel telemetry BEFORE creating any Kernel instances\r\nconfigure_semantic_kernel_telemetry(\r\n    connection_string=\"your_connection_string\",\r\n    customer_name=\"your_customer_name\",\r\n    agent_name=\"your_agent_name\"\r\n)\r\n\r\n# Then create and use your Semantic Kernel\r\nfrom semantic_kernel import Kernel\r\nkernel = Kernel()\r\n# ... rest of your code\r\n```\r\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "NTT AI Observability Exporter for Azure Monitor OpenTelemetry in AI Foundry projects",
    "version": "0.1.4",
    "project_urls": {
        "Bug Tracker": "https://github.com/nttlimited/ntt-ai-observability-exporter/issues",
        "Homepage": "https://github.com/nttlimited/ntt-ai-observability-exporter"
    },
    "split_keywords": [
        "ntt",
        " azure",
        " telemetry",
        " opentelemetry",
        " monitoring",
        " ai",
        " observability"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "6d7a2b81e32ca367e503abd9418fae67405c504771749a2d44c8d0ffcc8f7457",
                "md5": "eb28433079ea577556ad48dc9e88d22c",
                "sha256": "8791927ba5061b1e1227ab939b2e09616cfa62c39be5cac2ad187ffcf1a3a62e"
            },
            "downloads": -1,
            "filename": "ntt_ai_observability_exporter-0.1.4-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "eb28433079ea577556ad48dc9e88d22c",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 7769572,
            "upload_time": "2025-08-19T16:22:51",
            "upload_time_iso_8601": "2025-08-19T16:22:51.982177Z",
            "url": "https://files.pythonhosted.org/packages/6d/7a/2b81e32ca367e503abd9418fae67405c504771749a2d44c8d0ffcc8f7457/ntt_ai_observability_exporter-0.1.4-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "1c6b8ce2475bd5a49b1118ed0d4db02b5f31c2334507ff93b350a95bf0e4ecac",
                "md5": "b2091a8bfcdb684e7bc2520737c8b690",
                "sha256": "bd1a305328555ec0ebb50e68eb19a300a389c8b855b7a914f0e9ec16080b697e"
            },
            "downloads": -1,
            "filename": "ntt_ai_observability_exporter-0.1.4.tar.gz",
            "has_sig": false,
            "md5_digest": "b2091a8bfcdb684e7bc2520737c8b690",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 6130072,
            "upload_time": "2025-08-19T16:22:56",
            "upload_time_iso_8601": "2025-08-19T16:22:56.376942Z",
            "url": "https://files.pythonhosted.org/packages/1c/6b/8ce2475bd5a49b1118ed0d4db02b5f31c2334507ff93b350a95bf0e4ecac/ntt_ai_observability_exporter-0.1.4.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-08-19 16:22:56",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "nttlimited",
    "github_project": "ntt-ai-observability-exporter",
    "github_not_found": true,
    "lcname": "ntt-ai-observability-exporter"
}
        