skimly

Name: skimly
Version: 2.0.3
Summary: Production-grade Python SDK for Skimly - AI token optimization with async streaming, tools, and full API coverage
Homepage: https://skimly.dev
Issues: https://github.com/skimly/skimly/issues
Upload time: 2025-09-07 07:02:31
Requires Python: >=3.8
License: MIT
Keywords: ai, anthropic, async, claude, compression, gateway, gpt, openai, sdk, skimly, streaming, tools

# Skimly Python SDK

A production-grade Python SDK for Skimly - the drop-in gateway for AI-powered coding tools that reduces output token usage through smart compression.

## Features

- 🚀 **Full Type Hints** - Complete type annotations with dataclasses and TypedDict
- 🌊 **Async Streaming** - Real-time streaming with AsyncIterator support
- 🛠️ **Tool Calling** - Complete tool calling interface with helper functions
- 📦 **Blob Management** - Large content handling with automatic deduplication
- ⚡ **Performance** - Built-in retry logic, timeouts, and connection pooling
- 🔄 **Provider Agnostic** - Works with OpenAI, Anthropic, and other providers
- 💾 **Smart Caching** - Automatic blob deduplication to reduce costs
- 🔀 **Sync & Async** - Both synchronous and asynchronous clients

## Installation

```bash
pip install skimly
```

## Quick Start

```python
from skimly import AsyncSkimlyClient

async def main():
    client = AsyncSkimlyClient(
        api_key="sk-your-api-key",
        base_url="https://api.skimly.dev"
    )
    
    async with client:
        response = await client.messages.create({
            "provider": "anthropic",
            "model": "claude-3-5-sonnet-20241022",
            "max_tokens": 1024,
            "messages": [{
                "role": "user",
                "content": "Hello, world!"
            }]
        })
        
        print(response["content"][0]["text"])
        print("Tokens saved:", response["skimly_meta"]["tokens_saved"])

import asyncio
asyncio.run(main())
```

## Streaming

```python
async def streaming_example():
    client = AsyncSkimlyClient.from_env()
    
    async with client:
        stream = client.messages.stream({
            "provider": "openai",
            "model": "gpt-4",
            "messages": [{"role": "user", "content": "Write a story"}],
            "stream": True
        })
        
        async for chunk in stream:
            if chunk.get("type") == "content_block_delta":
                if text := chunk.get("delta", {}).get("text"):
                    print(text, end="", flush=True)
```

## Tool Calling

```python
async def tool_calling_example():
    client = AsyncSkimlyClient.from_env()
    
    async with client:
        response = await client.messages.create({
            "provider": "anthropic",
            "model": "claude-3-5-sonnet-20241022",
            "messages": [{"role": "user", "content": "What's the weather in SF?"}],
            "tools": [{
                "type": "function",
                "function": {
                    "name": "get_weather",
                    "description": "Get weather for a location",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "location": {"type": "string"}
                        },
                        "required": ["location"]
                    }
                }
            }]
        })
        
        # Check for tool uses
        tool_uses = [
            block for block in response["content"]
            if block.get("type") == "tool_use"
        ]
        print("Tool uses:", tool_uses)
```
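
Continuing the example above, the block below is a minimal sketch of executing the requested tool locally. The `name` and `input` field names follow Anthropic's `tool_use` block convention, which this example assumes Skimly passes through unchanged; `get_weather` here is a hypothetical local implementation, not part of the SDK.

```python
# Hypothetical local tool implementation (not part of the SDK).
def get_weather(location: str) -> str:
    return f"Sunny and 18C in {location}"

# Dispatch each tool_use block to the matching local function.
# Assumes Anthropic-style blocks with "name" and "input" fields.
for block in tool_uses:
    if block["name"] == "get_weather":
        print(get_weather(**block["input"]))
```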

## Blob Management

```python
async def blob_example():
    client = AsyncSkimlyClient.from_env()
    
    # Large document
    large_doc = "Very large document content..." * 1000
    
    async with client:
        # Upload blob
        blob_response = await client.create_blob(large_doc)
        blob_id = blob_response["blob_id"]
        
        # Use in chat with pointer
        response = await client.messages.create({
            "provider": "anthropic",
            "model": "claude-3-5-sonnet-20241022",
            "messages": [{
                "role": "user",
                "content": [
                    {"type": "text", "text": "Summarize this:"},
                    {"type": "pointer", "blob_id": blob_id}
                ]
            }]
        })
        
        print("Summary:", response["content"][0]["text"])
        print("Tokens saved:", response["skimly_meta"]["tokens_saved"])
        
        # Automatic deduplication
        deduped = await client.create_blob_if_new(large_doc)
        print("Same blob ID:", deduped["blob_id"] == blob_id)
```
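
As an illustration of when a pointer is worth it, the helper below (not part of the SDK) inlines short text and falls back to a deduplicated blob pointer only for large content; the 4,000-character threshold is an arbitrary example.

```python
# Illustrative helper: inline small text, use a deduplicated blob pointer
# for large text. The threshold is an arbitrary example value.
async def as_content_block(client, text: str, threshold: int = 4000) -> dict:
    if len(text) < threshold:
        return {"type": "text", "text": text}
    blob = await client.create_blob_if_new(text)
    return {"type": "pointer", "blob_id": blob["blob_id"]}
```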

## Environment Setup

```bash
export SKIMLY_KEY=sk-your-api-key
export SKIMLY_BASE=https://api.skimly.dev
```

```python
from skimly import AsyncSkimlyClient
client = AsyncSkimlyClient.from_env()
```

## Error Handling

```python
from skimly import (
    SkimlyError,
    SkimlyAPIError,
    SkimlyAuthenticationError,
    SkimlyRateLimitError
)

try:
    response = await client.messages.create(params)
except SkimlyAuthenticationError:
    print("Invalid API key")
except SkimlyRateLimitError:
    print("Rate limit exceeded")
except SkimlyAPIError as e:
    print(f"API error {e.status}: {e.message}")
```
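
The client already retries failed requests internally (see `max_retries` under Configuration below); if you also want application-level backoff on rate limits, a minimal sketch with illustrative attempt counts and delays:

```python
import asyncio

# Retry with exponential backoff when the gateway reports a rate limit.
# The attempt count and delays (1s, 2s, 4s) are illustrative.
async def create_with_backoff(client, params, attempts: int = 3):
    for attempt in range(attempts):
        try:
            return await client.messages.create(params)
        except SkimlyRateLimitError:
            if attempt == attempts - 1:
                raise
            await asyncio.sleep(2 ** attempt)
```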

## Configuration

```python
client = AsyncSkimlyClient(
    api_key="sk-your-key",
    base_url="https://api.skimly.dev",
    timeout=60000,        # 60 seconds
    max_retries=3,        # Retry failed requests
    default_headers={
        "User-Agent": "MyApp/1.0"
    }
)
```

## Synchronous Client

For non-async environments:

```python
from skimly import SkimlyClient

client = SkimlyClient.from_env()

response = client.create_message({
    "provider": "anthropic",
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [{
        "role": "user",
        "content": "Hello from sync client!"
    }]
})

print(response["content"][0]["text"])
```

## Advanced Usage

### Streaming Collection

```python
from skimly import collect_stream

stream = client.messages.stream(params)  # params as in the Streaming example above
message = await collect_stream(stream)
print(message["content"][0]["text"])
```

### Streaming Message Helper

```python
from skimly import StreamingMessage

streaming_msg = StreamingMessage()

async for chunk in stream:  # stream from client.messages.stream(...)
    streaming_msg.add_chunk(chunk)
    
    if streaming_msg.is_complete():
        break

print("Final text:", streaming_msg.get_text())
print("Tool uses:", streaming_msg.get_tool_uses())
```

### Transform Tool Results

```python
import json  # needed for json.dumps below

# tool_output is your tool's raw result (any JSON-serializable object)
compressed = await client.transform(
    result=json.dumps(tool_output),
    tool_name="code_analysis",
    command="analyze_files",
    model="claude-3-5-sonnet-20241022"
)
```

### Fetch with Range

```python
content = await client.fetch_blob(
    blob_id, 
    range_params={"start": 0, "end": 1000}
)
```
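
If `start` and `end` are offsets into the blob and `fetch_blob` returns the requested slice (both assumptions, not confirmed by the docs above), large blobs can be read in fixed-size windows:

```python
# Illustrative paging loop; assumes fetch_blob returns the slice for the
# requested range and an empty result once the end of the blob is reached.
async def iter_blob(client, blob_id: str, window: int = 1000):
    start = 0
    while True:
        chunk = await client.fetch_blob(
            blob_id, range_params={"start": start, "end": start + window}
        )
        if not chunk:
            break
        yield chunk
        start += window
```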

## Type Hints

Complete type definitions are included:

```python
from skimly.types_enhanced import (
    MessageParams,
    MessageResponse,
    StreamingChunk,
    ContentBlock,
    Tool,
    SkimlyClientOptions
)
```
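
Assuming `MessageParams` and `MessageResponse` mirror the dicts passed to and returned from `messages.create()`, they can be used to get static checking on request payloads:

```python
# Hypothetical usage: annotate the payload so a type checker can validate it.
params: MessageParams = {
    "provider": "anthropic",
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello"}],
}
response: MessageResponse = await client.messages.create(params)
```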

## Examples

See the [examples](./examples/) directory for complete usage examples:

- [Basic Usage](./examples/basic_usage.py) - Simple chat, streaming, and tools
- [Advanced Streaming](./examples/advanced_streaming.py) - Complex streaming scenarios

## Requirements

- Python 3.8+
- httpx
- typing_extensions (Python < 3.11)

## License

MIT

            
