open-skills

Name: open-skills
Version: 0.2.0
Summary: Framework-agnostic skills subsystem for Python AI agents - embed directly or deploy as a service
Upload time: 2025-10-19 22:08:11
Author / Maintainer: Richard Scheiwe
Requires Python: >=3.11
License: MIT
Download URL: https://files.pythonhosted.org/packages/96/0a/150bfe0a4c2aaf61c9fa8c93f07250dd342ddb0a629fe6dd4c8e1e7819d6/open_skills-0.2.0.tar.gz
Keywords: ai, agent, agents, skills, llm, openai, anthropic, langchain, tools, function-calling, fastapi, async, plugin-system, automation, versioning, embeddings
# open-skills

**A framework-agnostic Skills subsystem for Python agents.** Build, version, and execute reusable agent capabilities as code bundles — embed directly in your app or deploy as a service.

> Inspired by [Anthropic's Skills](https://www.anthropic.com/) feature for Claude.

## Overview

`open-skills` provides a complete system for managing executable code bundles (skills) that AI agents can discover and invoke. Think of it as a plugin system for LLM applications with version control, auto-discovery, and execution tracking.

**Version 0.2.0** introduces **library mode** — embed open-skills directly into any Python application without running a separate service.

### Key Features

✅ **Framework-Agnostic** — Works with OpenAI, Anthropic, LangChain, LlamaIndex, or custom agents

✅ **Two Deployment Modes** — Library (embedded) or Service (microservice)

✅ **Auto-Discovery** — Skills registered from folder structure at startup

✅ **Context-Aware Prompts** — Automatic skill injection into system prompts

✅ **Versioned Bundles** — Skills as folders with metadata, scripts, and resources

✅ **Embedding-Based Search** — Automatic skill selection via vector similarity

✅ **Tool Manifest** — Standard `.well-known/skills.json` for any LLM framework

✅ **Real-Time Streaming** — SSE for execution updates

✅ **Artifact Generation** — File outputs with S3-compatible storage

✅ **Multi-Skill Composition** — Chain or parallelize execution

## Quick Start

### Library Mode (Embed in Your App)

**Install:**

```bash
pip install open-skills
```

**Integrate into FastAPI:**

```python
from fastapi import FastAPI
from open_skills import mount_open_skills

app = FastAPI()

# One-line integration (await from an async context, e.g. a startup hook)
await mount_open_skills(
    app,
    skills_dir="./skills",              # Auto-discover from this folder
    database_url="postgresql+asyncpg://localhost/mydb",
    openai_api_key="sk-...",
)

# Skills are now:
# - Auto-registered from ./skills folder
# - Discoverable at /.well-known/skills.json
# - Executable via /skills/api/runs
```

**Use with any agent framework:**

```python
from open_skills import as_agent_tools, to_openai_tool
import openai

# Get available tools
tools = await as_agent_tools(published_only=True)
openai_tools = [to_openai_tool(t) for t in tools]

# Use with OpenAI
client = openai.AsyncOpenAI()
response = await client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize this document..."}],
    tools=openai_tools,
)
```

### Service Mode (Microservice)

**Run as standalone service:**

```bash
# Using Docker Compose
docker-compose up -d

# Or directly
python -m open_skills.service.main
```

**Access from any language:**

```bash
curl http://localhost:8000/.well-known/skills.json  # Discover tools
curl -X POST http://localhost:8000/api/runs \
  -H 'Content-Type: application/json' \
  -d '{"skill_version_ids": ["..."], "input": {...}}'
```
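The same request can be made from Python with nothing but the standard library. This is a sketch, not the library's client API: `build_run_request` and `post_run` are illustrative names, and the endpoint and field names are taken from the curl example above.

```python
import json
import urllib.request


def build_run_request(skill_version_ids: list[str], input_payload: dict) -> dict:
    """Assemble the JSON body shown in the curl example above."""
    return {"skill_version_ids": skill_version_ids, "input": input_payload}


def post_run(base_url: str, payload: dict) -> dict:
    """POST a run request to a running open-skills service and return its JSON reply."""
    req = urllib.request.Request(
        f"{base_url}/api/runs",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Any HTTP client (httpx, requests, fetch) works the same way — the service exposes plain REST.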

## Two Ways to Use

| Mode        | Best For                     | Pros                                 | Cons             |
| ----------- | ---------------------------- | ------------------------------------ | ---------------- |
| **Library** | Monolithic apps, low latency | In-process, zero network overhead    | Shares resources |
| **Service** | Microservices, polyglot apps | Process isolation, language-agnostic | Network overhead |

See [INTEGRATION_GUIDE.md](INTEGRATION_GUIDE.md) for complete integration patterns.

## Skill Bundle Format

A skill is a directory containing:

```
my-skill/
├── SKILL.md          # Metadata (YAML frontmatter + description)
├── scripts/
│   └── main.py       # Entrypoint function
├── resources/        # Optional: templates, data files
│   └── template.txt
└── tests/            # Optional: test inputs
    └── sample.json
```
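The CLI's `open-skills init` scaffolds this layout for you; doing it by hand takes only a few lines. A minimal sketch (the `scaffold_skill` helper is ours, not part of the package):

```python
from pathlib import Path


def scaffold_skill(root: str, name: str) -> Path:
    """Create the bundle layout shown above: SKILL.md plus scripts/, resources/, tests/."""
    bundle = Path(root) / name
    (bundle / "scripts").mkdir(parents=True, exist_ok=True)
    (bundle / "resources").mkdir(exist_ok=True)
    (bundle / "tests").mkdir(exist_ok=True)
    (bundle / "SKILL.md").touch()
    (bundle / "scripts" / "main.py").touch()
    return bundle
```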

**SKILL.md Example:**

```markdown
---
name: text_summarizer
version: 1.0.0
entrypoint: scripts/main.py
description: Summarizes long text into key points
inputs:
  - type: text
outputs:
  - type: text
tags: [nlp, summarization, text]
---

# Text Summarizer

This skill takes long text and produces a concise summary.
```

**scripts/main.py Example:**

```python
async def run(input_payload: dict) -> dict:
    text = input_payload.get("text", "")
    summary = text[:200] + "..."  # Simple truncation

    return {
        "outputs": {"summary": summary},
        "artifacts": []
    }
```

## Common Use Cases

### 1. Embed in Existing FastAPI App

```python
from fastapi import FastAPI
from open_skills import mount_open_skills

app = FastAPI()

# Your existing routes
@app.get("/")
async def root():
    return {"app": "my-app"}

# Add skills
@app.on_event("startup")
async def startup():
    await mount_open_skills(
        app,
        prefix="/skills",
        skills_dir="./skills",
        auto_register=True,
    )
```

### 2. Use with OpenAI Tool Calling

```python
from open_skills import configure, as_agent_tools, to_openai_tool
from open_skills.core.executor import SkillExecutor
from open_skills.core.manager import SkillManager
import openai

# Configure library
configure(database_url="postgresql+asyncpg://...", openai_api_key="sk-...")

# Get tools
tools = await as_agent_tools()
openai_tools = [to_openai_tool(t) for t in tools]

# Call OpenAI
client = openai.AsyncOpenAI()
response = await client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Help me summarize this..."}],
    tools=openai_tools,
)

# Execute skill if tool called
if response.choices[0].message.tool_calls:
    for tool_call in response.choices[0].message.tool_calls:
        function_name = tool_call.function.name
        tool = next(t for t in tools if t["name"] == function_name)

        # Execute the skill
        # ... (see examples/openai_agents_sdk_example.py for full example)
```
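The execution step elided above follows the standard OpenAI tool-calling loop. A framework-neutral sketch — `dispatch_tool_calls` and the injected `execute` callable are our inventions (in practice you would wrap `SkillExecutor`; see the example file for the real API):

```python
import json


def dispatch_tool_calls(tool_calls, tools, execute):
    """Match each tool call to its skill, run it via the supplied `execute`
    callable, and return OpenAI-style role="tool" messages."""
    messages = []
    for tool_call in tool_calls:
        name = tool_call.function.name
        tool = next(t for t in tools if t["name"] == name)
        args = json.loads(tool_call.function.arguments)
        result = execute(tool, args)  # hypothetical executor hook
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": json.dumps(result),
        })
    return messages
```

Append the returned messages to the conversation and call the model again to let it incorporate the skill's output.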

### 3. Context-Aware Prompts (Skill Injection)

```python
from open_skills import configure, inject_skills_context

configure(database_url="postgresql+asyncpg://...", openai_api_key="sk-...")

# Create a context-aware system prompt
base_prompt = "You are a helpful AI assistant."

# Inject available skills into the prompt
system_prompt = await inject_skills_context(
    base_prompt,
    format="detailed"  # or "compact", "numbered"
)

# Now the agent knows what skills are available
agent = Agent(system_prompt=system_prompt)
```

### 4. Auto-Discovery from Folder

```python
from open_skills import configure, register_skills_from_folder

configure(database_url="postgresql+asyncpg://...", openai_api_key="sk-...")

# Auto-register all skills in ./skills folder
versions = await register_skills_from_folder(
    "./skills",
    auto_publish=True,
    visibility="org",
)

print(f"Registered {len(versions)} skills")
```

### 5. Real-Time Execution Streaming

```python
# Backend (Python)
import httpx

async with httpx.AsyncClient(base_url="http://localhost:8000") as client:
    async with client.stream("GET", f"/api/runs/{run_id}/stream") as response:
        async for line in response.aiter_lines():
            # Process Server-Sent Events
            print(line)
```

```javascript
// Frontend (JavaScript)
const eventSource = new EventSource(`/api/runs/${runId}/stream`);

eventSource.addEventListener("status", (e) => {
  console.log("Status:", JSON.parse(e.data).status);
});

eventSource.addEventListener("complete", (e) => {
  console.log("Done:", JSON.parse(e.data));
  eventSource.close();
});
```
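On the Python side, the raw lines from `aiter_lines()` still need to be grouped into events. A minimal SSE parser sketch (ignores optional `id:`/`retry:` fields; `parse_sse` is our helper, not part of the package):

```python
def parse_sse(lines):
    """Group raw SSE lines into (event, data) pairs.

    Per the SSE format, `event:` names the event type, `data:` lines carry
    the payload, and a blank line terminates each event.
    """
    event, data = "message", []
    for line in lines:
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "":
            if data:
                yield event, "\n".join(data)
            event, data = "message", []
```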

## Architecture

```
┌─────────────────────────────────────────────────────────┐
│                    Your Application                     │
├─────────────────────────────────────────────────────────┤
│  Library Mode                 │  Service Mode           │
│  ┌─────────────────────┐      │  ┌──────────────────┐  │
│  │ mount_open_skills() │      │  │ HTTP Client      │  │
│  │  • Auto-register    │      │  │  • REST API      │  │
│  │  • Tool discovery   │      │  │  • Language-     │  │
│  │  • In-process exec  │      │  │    agnostic      │  │
│  └─────────────────────┘      │  └──────────────────┘  │
└─────────────────────────────────────────────────────────┘
                      │
                      ▼
        ┌──────────────────────────┐
        │   open-skills Core       │
        ├──────────────────────────┤
        │  • Skill Manager         │
        │  • Skill Router          │
        │  • Skill Executor        │
        │  • Auto-Discovery        │
        │  • Tool Manifest         │
        └────┬─────────────────┬───┘
             │                 │
             ▼                 ▼
        ┌─────────┐      ┌──────────┐
        │Postgres │      │    S3    │
        │+pgvector│      │Artifacts │
        └─────────┘      └──────────┘
```

## Installation

### Prerequisites

- Python 3.11+
- PostgreSQL 14+ with pgvector extension
- OpenAI API key (for embeddings)

### Install Package

```bash
pip install open-skills

# Or for development
git clone https://github.com/rscheiwe/open-skills.git
cd open-skills
pip install -e ".[dev]"
```

### Database Setup

```bash
# Using Docker (recommended)
docker run -d \
  --name openskills-postgres \
  -e POSTGRES_PASSWORD=postgres \
  -e POSTGRES_DB=openskills \
  -p 5432:5432 \
  pgvector/pgvector:pg16

# Run migrations
alembic upgrade head
```

## Configuration

### Library Mode

```python
from open_skills import configure

configure(
    database_url="postgresql+asyncpg://localhost/mydb",
    openai_api_key="sk-...",
    storage_root="./skills",
    artifacts_root="./artifacts",
    # Optional S3 configuration
    s3_endpoint="https://s3.amazonaws.com",
    s3_bucket="my-bucket",
)
```

### Service Mode

Create `.env` file:

```env
POSTGRES_URL=postgresql+asyncpg://user:password@localhost:5432/openskills
OPENAI_API_KEY=sk-...
JWT_SECRET=your-secret-key-here
STORAGE_ROOT=./storage
ARTIFACTS_ROOT=./artifacts

# Optional
S3_ENDPOINT=https://s3.amazonaws.com
S3_BUCKET=open-skills-artifacts
LANGFUSE_API_KEY=  # Telemetry
```

## API Endpoints

When using `mount_open_skills()` or service mode:

| Endpoint                    | Method    | Description             |
| --------------------------- | --------- | ----------------------- |
| `/.well-known/skills.json`  | GET       | Tool discovery manifest |
| `/api/health`               | GET       | Health check            |
| `/api/skills`               | GET, POST | List/create skills      |
| `/api/skills/{id}/versions` | GET, POST | Manage versions         |
| `/api/skills/search`        | POST      | Embedding-based search  |
| `/api/runs`                 | POST      | Execute skills          |
| `/api/runs/{id}`            | GET       | Get run details         |
| `/api/runs/{id}/stream`     | GET       | Real-time SSE stream    |

See [INTEGRATION_GUIDE.md](INTEGRATION_GUIDE.md) for complete API reference.

## CLI Tools

```bash
# Create a new skill
open-skills init my-skill

# Validate skill bundle
open-skills validate ./my-skill

# Test locally
open-skills run-local ./my-skill input.json

# Publish to service
open-skills publish ./my-skill

# Start service
open-skills serve --port 8000
```

## Examples

- [`examples/integration_example.py`](examples/integration_example.py) - Simple FastAPI integration
- [`examples/prompt_injection_example.py`](examples/prompt_injection_example.py) - **Context-aware prompt injection**
- [`examples/openai_agents_sdk_example.py`](examples/openai_agents_sdk_example.py) - **OpenAI Agents SDK integration**
- [`examples/library_mode_complete.py`](examples/library_mode_complete.py) - Full example with OpenAI
- [`examples/streaming_example.py`](examples/streaming_example.py) - SSE streaming client
- [`examples/streaming_frontend_example.html`](examples/streaming_frontend_example.html) - Browser UI
- [`examples/hello-world/`](examples/hello-world/) - Sample skill bundle
- [`examples/text-summarizer/`](examples/text-summarizer/) - Advanced skill example

## Documentation

- **[QUICKSTART.md](QUICKSTART.md)** - Get started in 5 minutes
- **[INTEGRATION_GUIDE.md](INTEGRATION_GUIDE.md)** - Complete integration reference
- **[MIGRATION_GUIDE.md](MIGRATION_GUIDE.md)** - Upgrade from v0.1.0
- **[REFACTOR_SUMMARY.md](REFACTOR_SUMMARY.md)** - What's new in v0.2.0

## Framework Compatibility

Open-skills provides tool converters for:

- **OpenAI** - Function calling format
- **Anthropic** - Tool use format
- **LangChain** - Tool format
- **Custom** - Generic tool contract

```python
from open_skills import as_agent_tools, to_openai_tool, to_anthropic_tool, to_langchain_tool

tools = await as_agent_tools()

# Convert to framework-specific formats
openai_tools = [to_openai_tool(t) for t in tools]
anthropic_tools = [to_anthropic_tool(t) for t in tools]
langchain_tools = [to_langchain_tool(t) for t in tools]
```
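If your framework isn't covered, writing a converter against the generic contract is straightforward. An illustrative sketch — the field names (`name`, `description`, `input_schema`) are assumptions about the generic tool dict, not the library's documented keys, and the target shape here happens to be OpenAI's function-calling format:

```python
def to_custom_tool(tool: dict) -> dict:
    """Adapt a generic tool dict to an OpenAI-style function-calling entry.

    Assumed input keys: name (required), description, input_schema (JSON Schema).
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get("input_schema", {"type": "object", "properties": {}}),
        },
    }
```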

## Development

### Run Tests

```bash
pytest                    # All tests
pytest -m unit            # Unit tests only
pytest -m integration     # Integration tests
pytest --cov=open_skills  # With coverage
```

### Code Quality

```bash
black open_skills tests   # Format
ruff check open_skills    # Lint
mypy open_skills          # Type check
```

### Database Migrations

```bash
alembic revision --autogenerate -m "description"  # Create migration
alembic upgrade head                              # Apply
alembic downgrade -1                              # Rollback
```

## Deployment

### Docker (Service Mode)

```bash
docker build -t open-skills:latest .
docker run -p 8000:8000 --env-file .env open-skills:latest
```

### Kubernetes

```bash
kubectl apply -f k8s/
```

### Library Mode (Embedded)

Deploy as part of your application — no separate deployment needed!

See [docs/deployment.md](docs/deployment.md) for production setup.

## Troubleshooting

### Skills not appearing in manifest

```python
from open_skills.core.manager import SkillManager

# db_session: your application's async session factory
async with db_session() as db:
    manager = SkillManager(db)
    skills = await manager.list_skills()
    print(f"Found {len(skills)} skills")
```

### Database connection issues

```bash
# Verify pgvector extension
psql -d openskills -c "\dx"

# Test connection
psql postgresql://postgres:postgres@localhost:5432/openskills
```

See [INTEGRATION_GUIDE.md](INTEGRATION_GUIDE.md#troubleshooting) for more.

## Contributing

Contributions welcome! Please read [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.

## License

MIT License - see [LICENSE](LICENSE) file for details.

## Acknowledgments

Inspired by [Anthropic's Skills](https://www.anthropic.com/) feature for Claude, designed to work with any LLM framework.

---

**Current Version:** 0.2.0 (Framework-Agnostic Release)
**Status:** Production-ready for library mode, service mode, and hybrid deployments

            

    "summary": "Framework-agnostic skills subsystem for Python AI agents - embed directly or deploy as a service",
    "version": "0.2.0",
    "project_urls": {
        "Bug Tracker": "https://github.com/rscheiwe/open-skills/issues",
        "Changelog": "https://github.com/rscheiwe/open-skills/blob/main/CHANGELOG.md",
        "Documentation": "https://github.com/rscheiwe/open-skills#readme",
        "Homepage": "https://github.com/rscheiwe/open-skills",
        "Repository": "https://github.com/rscheiwe/open-skills",
        "Source Code": "https://github.com/rscheiwe/open-skills"
    },
    "split_keywords": [
        "ai",
        " agent",
        " agents",
        " skills",
        " llm",
        " openai",
        " anthropic",
        " langchain",
        " tools",
        " function-calling",
        " fastapi",
        " async",
        " plugin-system",
        " automation",
        " versioning",
        " embeddings"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "74023169c06265519743b77e2500a38d32372c320de91974827ff32c2d175d1e",
                "md5": "354f76d78403787cef2bfc6969dc6853",
                "sha256": "d13fc9ff0d2eeb8dece2be092f513289bf564e717e387b6463306af47c58fe50"
            },
            "downloads": -1,
            "filename": "open_skills-0.2.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "354f76d78403787cef2bfc6969dc6853",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.11",
            "size": 74996,
            "upload_time": "2025-10-19T22:08:09",
            "upload_time_iso_8601": "2025-10-19T22:08:09.680241Z",
            "url": "https://files.pythonhosted.org/packages/74/02/3169c06265519743b77e2500a38d32372c320de91974827ff32c2d175d1e/open_skills-0.2.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "960a150bfe0a4c2aaf61c9fa8c93f07250dd342ddb0a629fe6dd4c8e1e7819d6",
                "md5": "c954731e556503fd49791f0d20a7fd76",
                "sha256": "c0c7432af2ff912ad26caa8a092f2e0ea6c20c038559436d8b38c9aff10b289e"
            },
            "downloads": -1,
            "filename": "open_skills-0.2.0.tar.gz",
            "has_sig": false,
            "md5_digest": "c954731e556503fd49791f0d20a7fd76",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.11",
            "size": 91106,
            "upload_time": "2025-10-19T22:08:11",
            "upload_time_iso_8601": "2025-10-19T22:08:11.362924Z",
            "url": "https://files.pythonhosted.org/packages/96/0a/150bfe0a4c2aaf61c9fa8c93f07250dd342ddb0a629fe6dd4c8e1e7819d6/open_skills-0.2.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-10-19 22:08:11",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "rscheiwe",
    "github_project": "open-skills",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "open-skills"
}
        