spyglass-ai

Name: spyglass-ai
Version: 0.1.6
Summary: A client library for shipping telemetry to the Spyglass AI platform
Author: Spyglass AI Team <team@spyglass-ai.com>
Upload time: 2025-10-07 04:56:01
Requires Python: >=3.12
Keywords: ai, llm, monitoring, observability, openai, opentelemetry, tracing
Homepage: https://spyglass-ai.com
Repository: https://github.com/spyglass-ai/spyglass-sdk
Requirements: none recorded
# Spyglass SDK
The Spyglass SDK provides client code for shipping telemetry data to the Spyglass AI platform.

## Installation
```bash
pip install spyglass-ai
```

## Configuration

Set the following environment variables to configure the SDK:

### Required
- `SPYGLASS_API_KEY`: Your Spyglass API key
- `SPYGLASS_DEPLOYMENT_ID`: Unique identifier for your deployment
  - **Note**: Used for both `service.name` and `deployment.id` attributes

### Optional
- `SPYGLASS_OTEL_EXPORTER_OTLP_ENDPOINT`: Custom endpoint for development

### Example Configuration
```bash
export SPYGLASS_API_KEY="your-api-key"
export SPYGLASS_DEPLOYMENT_ID="user-service-v1.2.0"  # Required - used for both service.name and deployment.id
```
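
If you are developing against a local or self-hosted collector, the optional endpoint variable can be set the same way. The URL below is only a placeholder for wherever your collector listens:

```bash
# Optional: only needed when exporting to a non-default endpoint during development.
# The URL is a placeholder; substitute the address of your own collector.
export SPYGLASS_OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
```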

**Note**: `SPYGLASS_DEPLOYMENT_ID` is required and will be used for both the OpenTelemetry `service.name` and `deployment.id` resource attributes. This ensures consistency and simplifies dashboard queries.

## Usage

### Basic Function Tracing

Use the `@spyglass_trace` decorator to automatically trace function calls:

```python
from spyglass_ai import spyglass_trace

@spyglass_trace()
def calculate_total(price, tax_rate):
    return price * (1 + tax_rate)

# Usage
result = calculate_total(100, 0.08)  # This call will be traced
```

You can also provide a custom span name:

```python
@spyglass_trace(name="payment_processing")
def process_payment(amount, card_info):
    # Payment processing logic
    return {"status": "success", "transaction_id": "tx_123"}
```

### OpenAI Integration

Wrap your OpenAI client to automatically trace API calls:

```python
from openai import OpenAI
from spyglass_ai import spyglass_openai

# Create your OpenAI client
client = OpenAI(api_key="your-api-key")

# Wrap it with Spyglass tracing
client = spyglass_openai(client)

# Now all chat completions will be traced
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=100
)
```

### Complete Example

```python
from openai import OpenAI
from spyglass_ai import spyglass_trace, spyglass_openai

# Set up OpenAI client with tracing
client = OpenAI(api_key="your-api-key")
client = spyglass_openai(client)

@spyglass_trace(name="ai_conversation")
def have_conversation(user_message):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": user_message}],
        max_tokens=150
    )
    return response.choices[0].message.content

# This will create traces for both the function and the OpenAI API call
answer = have_conversation("What is the capital of France?")
print(answer)
```

## What Gets Traced

### Function Tracing (`@spyglass_trace`)
- Function name, module, and qualified name
- Input arguments (excluding `self` and `cls`)
- Return values
- Execution time
- Any exceptions that occur (see the sketch below)
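
Because exceptions are recorded, a failing call still produces a span. A minimal sketch using the decorator documented above (the `divide` function is purely illustrative):

```python
from spyglass_ai import spyglass_trace

@spyglass_trace(name="risky_division")
def divide(a, b):
    return a / b

try:
    divide(10, 0)  # raises ZeroDivisionError; the exception is captured on the span
except ZeroDivisionError:
    pass  # the error has already been recorded as part of the trace
```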

### OpenAI Integration (`spyglass_openai`)
- Model used
- Number of messages
- Request parameters (`max_tokens`, `temperature`, etc.)
- Token usage (prompt, completion, total)
- Response model
- Any API errors (see the example below)
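
Assuming a client wrapped with `spyglass_openai` as shown earlier, the request parameters you pass and the usage block OpenAI returns are the values that end up on the span:

```python
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize OpenTelemetry in one sentence."}],
    max_tokens=100,
    temperature=0.2,  # request parameters like this are recorded on the span
)

# The same usage numbers captured by the trace are also available on the response object.
usage = response.usage
print(usage.prompt_tokens, usage.completion_tokens, usage.total_tokens)
```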

## Development
### Install Dependencies
```bash
uv sync --extra test
```

### Run Tests
```bash
# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=src --cov-report=term-missing

# Run specific test file
uv run pytest tests/test_trace.py -v
```
            
