# Claude Worker 🚀
**Fire-and-forget task execution for the Claude Code SDK, with dependency orchestration.**
Run complex AI-powered workflows without blocking your terminal. Define task dependencies, run tasks in parallel, and get instant audio feedback.
[Python 3.10+](https://www.python.org/downloads/) · [PyPI](https://pypi.org/project/claude-worker/) · [License: MIT](LICENSE)
## 🎯 Key Features
- **🔗 Task Orchestration** - Define complex workflows with dependencies, automatic parallel execution, and failure propagation
- **🚀 Fire & Forget** - Submit tasks and continue working while Claude handles them in the background
- **💾 Persistent** - SQLite storage survives crashes and restarts
- **🔊 Audio Feedback** - Get notified with sounds when tasks start, complete, or fail
- **📊 Rich Logging** - Structured, emoji-enhanced logs with multiple verbosity levels
- **🔄 Auto-Retry** - Intelligent retry logic with exponential backoff for transient failures
- **🎛️ Multiple Interfaces** - CLI, REST API, and MCP (Claude Desktop) support
## 📦 Installation
```bash
# Everything (recommended)
pip install "claude-worker[full]"
# Just CLI and REST API
pip install "claude-worker[server]"
# Just MCP for Claude Desktop
pip install "claude-worker[mcp]"
```
**Prerequisites:**
- Python 3.10+
- Node.js 16+ (for Claude CLI)
- `ANTHROPIC_API_KEY` or Claude CLI OAuth
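A quick sanity check for the prerequisites above (a minimal sketch; the checks are illustrative, not exhaustive):

```bash
# Verify interpreter and runtime versions
python3 --version   # should report 3.10 or newer
node --version      # should report v16 or newer

# Confirm authentication is available: either an API key...
test -n "$ANTHROPIC_API_KEY" && echo "ANTHROPIC_API_KEY is set"
# ...or the Claude CLI on PATH (for OAuth-based login)
command -v claude && echo "Claude CLI found"
```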
## 🚀 Quick Start
### 1. Basic Task Execution
```bash
# Single task - server auto-starts
claude-worker run "Create a Python web scraper for news articles"
# With options
claude-worker run "Refactor auth.py" --dir ./src --model opus --watch
# From file or stdin
claude-worker run instructions.txt
git diff | claude-worker run "Review these changes"
```
### 2. Task Orchestration (NEW!)
Create `workflow.json`:
```json
{
  "tasks": [
    {
      "identifier": "fetch_data",
      "execution_prompt": "Download dataset from S3 to /data/raw/"
    },
    {
      "identifier": "process_data",
      "execution_prompt": "Clean and transform data in /data/raw/",
      "depends_on": ["fetch_data"],
      "initial_delay": 2.0
    },
    {
      "identifier": "train_model",
      "execution_prompt": "Train ML model on processed data",
      "depends_on": ["process_data"],
      "model": "opus"
    }
  ]
}
```
Execute workflow:
```bash
# Submit and continue
claude-worker orchestrate workflow.json
# Submit and wait for completion
claude-worker orchestrate workflow.json --wait
# Check progress
claude-worker orchestration-status 1
```
### 3. MCP Integration (Claude Desktop)
Setup once:
```bash
pip install "claude-worker[mcp]"
claude mcp add claude-worker -s user -- python -m claude_worker.mcp.factory
```
Now Claude can create dependent tasks:
```python
# Claude automatically manages orchestrations
await create_task(
    task_identifier="setup_db",
    execution_prompt="Initialize PostgreSQL database"
)

await create_task(
    task_identifier="migrate_schema",
    execution_prompt="Run database migrations",
    depends_on=["setup_db"],
    wait_after_dependencies=2.0
)

await create_task(
    task_identifier="seed_data",
    execution_prompt="Populate test data",
    depends_on=["migrate_schema"]
)
```
## 📊 Real-World Examples
### Example 1: Full-Stack App Development
```json
{
  "tasks": [
    {
      "identifier": "design_api",
      "execution_prompt": "Design REST API specification for todo app",
      "model": "opus"
    },
    {
      "identifier": "create_backend",
      "execution_prompt": "Implement FastAPI backend based on API spec",
      "depends_on": ["design_api"],
      "working_directory": "./backend"
    },
    {
      "identifier": "create_frontend",
      "execution_prompt": "Build React frontend with TypeScript",
      "depends_on": ["design_api"],
      "working_directory": "./frontend"
    },
    {
      "identifier": "add_tests",
      "execution_prompt": "Write comprehensive test suites",
      "depends_on": ["create_backend", "create_frontend"]
    },
    {
      "identifier": "create_docker",
      "execution_prompt": "Create Docker Compose configuration",
      "depends_on": ["add_tests"]
    }
  ]
}
```
### Example 2: Codebase Refactoring
```bash
# Create refactoring pipeline
cat > refactor.json << EOF
{
  "tasks": [
    {
      "identifier": "analyze",
      "execution_prompt": "Analyze codebase for code smells and technical debt",
      "model": "opus"
    },
    {
      "identifier": "plan_refactor",
      "execution_prompt": "Create detailed refactoring plan",
      "depends_on": ["analyze"],
      "model": "opus"
    },
    {
      "identifier": "refactor_auth",
      "execution_prompt": "Refactor authentication module",
      "depends_on": ["plan_refactor"]
    },
    {
      "identifier": "refactor_api",
      "execution_prompt": "Refactor API endpoints",
      "depends_on": ["plan_refactor"]
    },
    {
      "identifier": "refactor_db",
      "execution_prompt": "Refactor database layer",
      "depends_on": ["plan_refactor"]
    },
    {
      "identifier": "run_tests",
      "execution_prompt": "Run all tests and fix any issues",
      "depends_on": ["refactor_auth", "refactor_api", "refactor_db"]
    }
  ]
}
EOF
# Execute with progress monitoring
claude-worker orchestrate refactor.json --wait
```
### Example 3: Data Pipeline
```python
# Programmatic orchestration via REST API
import time

import httpx

client = httpx.Client(base_url="http://localhost:8000")

# Define ETL pipeline
pipeline = {
    "tasks": [
        # Extract (parallel)
        {"identifier": "extract_sales", "execution_prompt": "Extract sales data from PostgreSQL"},
        {"identifier": "extract_inventory", "execution_prompt": "Extract inventory from MongoDB"},
        {"identifier": "extract_customers", "execution_prompt": "Extract customer data from API"},
        # Transform (depends on all extracts)
        {
            "identifier": "transform",
            "execution_prompt": "Clean, normalize, and join all datasets",
            "depends_on": ["extract_sales", "extract_inventory", "extract_customers"],
            "initial_delay": 2.0
        },
        # Load
        {
            "identifier": "load_warehouse",
            "execution_prompt": "Load transformed data into Snowflake",
            "depends_on": ["transform"]
        },
        # Report
        {
            "identifier": "generate_report",
            "execution_prompt": "Create executive dashboard",
            "depends_on": ["load_warehouse"],
            "model": "opus"
        }
    ]
}

# Submit pipeline
response = client.post("/api/v1/orchestrations", json=pipeline)
orch_id = response.json()["orchestration_id"]

# Monitor progress
while True:
    status = client.get(f"/api/v1/orchestrations/{orch_id}").json()
    print(f"Progress: {status['completed_tasks']}/{status['total_tasks']}")
    if status['status'] in ['completed', 'failed']:
        break
    time.sleep(5)
```
## 🎯 Model Selection Strategy
```bash
# Fast, simple tasks (file operations, basic analysis)
--model haiku # ~$0.001 per task
# Balanced performance (default)
--model sonnet # ~$0.01 per task
# Complex reasoning (architecture, refactoring)
--model opus # ~$0.05 per task
```
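Models can be mixed within a single orchestration by setting `model` per task, as in the workflows above. A short sketch pairing a cheap listing step with an expensive planning step (task identifiers and prompts here are illustrative):

```json
{
  "tasks": [
    {
      "identifier": "list_modules",
      "execution_prompt": "List all Python modules and their sizes",
      "model": "haiku"
    },
    {
      "identifier": "propose_architecture",
      "execution_prompt": "Propose a refactoring architecture based on the module listing",
      "depends_on": ["list_modules"],
      "model": "opus"
    }
  ]
}
```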
## 🔊 Sound Notifications
Get audio feedback for task events:
```bash
# Customize sounds
export CLAUDE_WORKER_START_SOUND=/path/to/start.wav
export CLAUDE_WORKER_SUCCESS_SOUND=/path/to/success.wav
export CLAUDE_WORKER_FAILURE_SOUND=/path/to/failure.wav
# Disable if needed
export CLAUDE_WORKER_ENABLE_SOUNDS=false
```
**Platform Support:**
- **macOS**: Native `afplay`
- **Linux**: PulseAudio or ALSA
- **Windows**: PowerShell SoundPlayer
- **Universal**: `mpv` or `ffplay`
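Before pointing the sound variables at a custom file, it can help to confirm the file plays with your platform's tool (paths below are placeholders):

```bash
# macOS
afplay /path/to/success.wav
# Linux (PulseAudio, falling back to ALSA)
paplay /path/to/success.wav || aplay /path/to/success.wav
# Any platform with mpv or ffplay installed
mpv --no-video /path/to/success.wav || ffplay -nodisp -autoexit /path/to/success.wav
```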
## 📁 Logging & Monitoring
```bash
# View all tasks
claude-worker list
# Check specific task
claude-worker status 42
# Monitor logs
tail -f ~/.claude-worker/claude-worker.log # Global
tail -f ~/.claude-worker/tasks/task_*_summary.log # Task summary
tail -f ~/.claude-worker/tasks/task_*_detailed.log # Full details
# List orchestrations
claude-worker list-orchestrations --status running
```
## 🔧 Configuration
### Environment Variables
```bash
# Authentication
export ANTHROPIC_API_KEY="sk-ant-..."
# Server
export CLAUDE_WORKER_HOST="0.0.0.0"
export CLAUDE_WORKER_PORT="8000"
# Storage
export CLAUDE_WORKER_DB="~/.claude-worker/tasks.db"
export CLAUDE_WORKER_LOG_DIR="~/.claude-worker/logs"
# Orchestration limits
export MAX_TASKS_PER_ORCHESTRATION="100"
export ORCHESTRATION_TIMEOUT="3600"
# Sound notifications
export CLAUDE_WORKER_ENABLE_SOUNDS="true"
```
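If the server is bound to a non-default host or port, an API client can read the same variables; a minimal sketch using the `httpx` client from Example 3 (the fallback-to-localhost logic is an assumption, not part of the package):

```python
import os

import httpx

# Fall back to the documented defaults when the variables are unset
host = os.environ.get("CLAUDE_WORKER_HOST", "0.0.0.0")
port = os.environ.get("CLAUDE_WORKER_PORT", "8000")

# 0.0.0.0 is a bind address, not a destination; connect via localhost in that case
connect_host = "localhost" if host == "0.0.0.0" else host
client = httpx.Client(base_url=f"http://{connect_host}:{port}")

# /health is the endpoint shown in the Troubleshooting section
print(client.get("/health").status_code)
```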
## 🚀 Production Deployment
### Docker Compose
```yaml
version: '3.8'
services:
  claude-worker:
    image: claude-worker:latest
    ports:
      - "8000:8000"
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
    volumes:
      - ./data:/data
    restart: unless-stopped
```
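No official image is published, so `claude-worker:latest` must be built locally. A minimal Dockerfile sketch (base image and Node.js installation are assumptions; adjust for your environment):

```dockerfile
# Hypothetical Dockerfile for building claude-worker:latest locally
FROM python:3.11-slim

# Node.js 16+ is a prerequisite for the Claude CLI
RUN apt-get update && apt-get install -y --no-install-recommends nodejs npm \
    && rm -rf /var/lib/apt/lists/*

# Install the Claude CLI (npm package name assumed; verify against Anthropic's docs)
RUN npm install -g @anthropic-ai/claude-code

RUN pip install --no-cache-dir "claude-worker[full]"

EXPOSE 8000
CMD ["claude-worker", "server", "start"]
```

Build with `docker build -t claude-worker:latest .`, then bring the stack up with `docker compose up -d`.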
### Systemd Service
```ini
# /etc/systemd/system/claude-worker.service
[Unit]
Description=Claude Worker Server
After=network.target
[Service]
Type=simple
User=claude
ExecStart=/usr/local/bin/claude-worker server start
Restart=on-failure
Environment="ANTHROPIC_API_KEY=sk-ant-..."
[Install]
WantedBy=multi-user.target
```
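After placing the unit file, reload systemd and enable the service:

```bash
sudo systemctl daemon-reload
sudo systemctl enable --now claude-worker
systemctl status claude-worker
```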
## 🆙 Upgrading
### From Pre-Orchestration Versions
```bash
# 1. Upgrade package
pip install --upgrade "claude-worker[full]"
# 2. Backup database
cp ~/.claude-worker/tasks.db ~/.claude-worker/tasks.db.backup
# 3. Run migration
python -m claude_worker.scripts.migrate_db --migrate
# 4. Verify
python -m claude_worker.scripts.migrate_db
# Should show: ✓ Orchestration support: Yes
```
## 🐛 Troubleshooting
### Quick Fixes
```bash
# Check health
curl http://localhost:8000/health
# Reset stuck orchestration
claude-worker cancel-orchestration <id>
# Fix "database locked"
pkill -f claude-worker
rm ~/.claude-worker/tasks.db-journal
# Test sounds
python -c "from claude_worker.server.notification import test_sounds; test_sounds()"
# Full reset
pkill -f claude-worker
rm -rf ~/.claude-worker
claude-worker server start
```
### Common Issues
| Issue | Solution |
|-------|----------|
| Circular dependency | Review `depends_on` edges; the dependency graph must be acyclic (a DAG) — see the check below |
| Tasks not parallel | Remove unnecessary `depends_on` |
| Database locked | Kill stuck processes, remove lock files |
| No orchestration support | Run migration script |
| Sound not working | Check platform tools (afplay/paplay/mpv) |
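For the circular-dependency case, a workflow file can be validated locally before submission. A minimal, standalone sketch that assumes only the `identifier`/`depends_on` fields from the workflow JSON shown above:

```python
import json
from graphlib import CycleError, TopologicalSorter  # stdlib, Python 3.9+


def check_workflow(path: str) -> None:
    """Report a valid execution order, or the cycle that prevents one."""
    with open(path) as f:
        tasks = json.load(f)["tasks"]

    # Map each task to the set of tasks it depends on
    graph = {t["identifier"]: set(t.get("depends_on", [])) for t in tasks}
    try:
        order = list(TopologicalSorter(graph).static_order())
        print("OK, one valid execution order:", " -> ".join(order))
    except CycleError as exc:
        print("Circular dependency detected:", exc.args[1])


check_workflow("workflow.json")
```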
## 🛠️ Development
```bash
# Setup
git clone https://github.com/yigitkonur/claude-worker
cd claude-worker
poetry install
poetry shell
# Run with auto-reload
claude-worker server start --reload
# Tests
pytest
black claude_worker/
ruff check claude_worker/
# Build
poetry build
```
## 📚 Documentation
- [Orchestration Guide](HANDOFF_DOCS_TO_NEXT_DEVELOPER/DOC_PACK_02-TASK_ORCHESTRATION_AND_DEPENDENCIES/01_ORCHESTRATION_OVERVIEW.md)
- [CLI Reference](HANDOFF_DOCS_TO_NEXT_DEVELOPER/DOC_PACK_02-TASK_ORCHESTRATION_AND_DEPENDENCIES/03_CLI_USAGE_GUIDE.md)
- [MCP Integration](HANDOFF_DOCS_TO_NEXT_DEVELOPER/DOC_PACK_02-TASK_ORCHESTRATION_AND_DEPENDENCIES/04_MCP_INTEGRATION.md)
- [API Documentation](docs/api.md)
- [Architecture](HANDOFF_DOCS_TO_NEXT_DEVELOPER/DOC_PACK_02-TASK_ORCHESTRATION_AND_DEPENDENCIES/02_IMPLEMENTATION_ARCHITECTURE.md)
## 🤝 Contributing
Contributions welcome! Please read [CONTRIBUTING.md](CONTRIBUTING.md) and follow the SOLE principle:
- Each module has a Single, Overarching, Lucidly-stated Expertise
- Maintain backward compatibility
- Add tests for new features
- Update documentation
## 📄 License
MIT - see [LICENSE](LICENSE) file.
## 🔗 Links
- [PyPI Package](https://pypi.org/project/claude-worker/)
- [GitHub Repository](https://github.com/yigitkonur/claude-worker)
- [Issue Tracker](https://github.com/yigitkonur/claude-worker/issues)
- [Discussions](https://github.com/yigitkonur/claude-worker/discussions)
---
**Built with ❤️ for developers who value their time.**
Stop waiting for Claude. Start orchestrating.