# AI-MCP Terminal
[](https://pypi.org/project/ai-mcp-terminal/)
[](https://www.python.org/downloads/)
[](LICENSE)
> 🚀 Multi-threaded terminal management for AI assistants with real-time web monitoring
**Solves terminal blocking issues** - commands run asynchronously, so they never block AI operations. Monitor up to **100 concurrent terminals** with intelligent cleanup and system tracking.
---
## ✨ Key Features
### Core Capabilities
* 🚀 **Async Execution** - Commands never block AI operations
* 🔢 **Multi-Threading** - 100 concurrent terminals with ThreadPoolExecutor
* 🧹 **Auto Cleanup** - Smart idle session detection & memory management
* ⚡ **Batch Operations** - Execute across multiple terminals simultaneously
* 📊 **Web Monitor** - Real-time xterm.js interface with system stats
### Smart Execution (v1.0.52+)
* 🔗 **Workflow Engine** - Execute tasks with dependencies (DAG support)
* ⏳ **Smart Waiting** - Block until specific tasks complete
* 📝 **Sequential Execution** - Run commands in strict order
* 🔄 **Auto Retry** - Automatic retry on transient failures
* 📂 **Project Lock** - Terminals always start in project directory
### Platform Support
* 🐧 **WSL Priority** - Auto-detect WSL bash on Windows (preferred)
* 🌐 **UTF-8 Support** - Proper encoding, no garbled text
* 🛑 **Anti-Loop Protection** - Prevents AI from getting stuck in query loops
---
## 🚀 Quick Start (1 Minute)
### Step 1: Add MCP Configuration
Add to your Cursor/Cline MCP settings:
```json
{
  "mcpServers": {
    "ai-mcp-terminal": {
      "command": "uvx",
      "args": ["ai-mcp-terminal"],
      "env": {}
    }
  }
}
```
### Step 2: Restart IDE
### Step 3: Start Using
In Cursor:
```
Create 3 terminals and run system checks in parallel
```
**AI will use `create_batch` for true concurrency!**
Browser auto-opens → `http://localhost:8000` → View all terminals in real-time!
---
## 📊 Web Interface
Auto-opens at `http://localhost:8000`
**Features**:
- 📺 Real-time xterm.js terminals
- 📊 CPU/Memory/System stats
- 🔄 Live output streaming
- 🎯 Click to expand terminals
- 🛑 Shutdown server button
---
## 🛠️ Available MCP Tools
### Batch Tools (Recommended)
| Tool | Description | Concurrency |
|------|-------------|-------------|
| `create_batch` | Create multiple terminals + execute | ✅ 100 threads |
| `execute_batch` | Execute across terminals | ✅ 100 threads |
| `get_batch_output` | Get all outputs | ✅ 100 threads |
| `check_completion` | Check status | ✅ 100 threads |
| `broadcast_command` | Send to all terminals | ✅ Async |
### Smart Execution Tools (v1.0.52+)
| Tool | Description | Use Case |
|------|-------------|----------|
| `execute_workflow` | DAG-based task execution | Build → Test → Deploy pipeline |
| `wait_until_complete` | Block until tasks finish | Wait for build before deploy |
| `execute_sequence` | Run commands in order | Step-by-step setup scripts |
| `execute_with_retry` | Auto-retry on failure | Network requests, downloads |
### Single Tools (Use batch tools instead!)
| Tool | Use Instead |
|------|-------------|
| `create_session` | → `create_batch` |
| `execute_command` | → `execute_batch` |
| `get_output` | → `get_batch_output` |
**Why batch tools?**
- 10x faster (parallel execution)
- 1 call instead of 10 calls
- Non-blocking design
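The speedup comes from fanning commands out to worker threads instead of issuing one blocking call per command. A minimal sketch of the idea using plain `ThreadPoolExecutor` and `subprocess` (illustrative only, not the package's internal code):

```python
from concurrent.futures import ThreadPoolExecutor
import subprocess

def run(cmd: str) -> str:
    # Each command runs in its own worker thread, so one slow
    # command never blocks the others.
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout

commands = ["echo frontend", "echo backend", "echo db"]

# One batch call fans out to all commands at once.
with ThreadPoolExecutor(max_workers=100) as pool:
    results = list(pool.map(run, commands))

print(results)  # all outputs arrive together instead of one round-trip each
```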
---
## 🎯 Use Cases
### Multi-Service Development
```
User: "Start frontend, backend, and database"
AI calls:
create_batch(sessions=[
  {name: "frontend", cwd: "./web", initial_command: "npm run dev"},
  {name: "backend", cwd: "./api", initial_command: "python app.py"},
  {name: "db", cwd: "./", initial_command: "docker-compose up"}
])
Result: 3 services start simultaneously, web interface shows all
```
### System Information Gathering
```
User: "Check system info"
AI calls:
create_batch(sessions=[
  {name: "cpu", cwd: ".", initial_command: "wmic cpu get name"},
  {name: "mem", cwd: ".", initial_command: "wmic memorychip get capacity"},
  {name: "disk", cwd: ".", initial_command: "wmic logicaldisk get size,freespace"},
  {name: "os", cwd: ".", initial_command: "systeminfo"}
])
Later:
get_batch_output(session_ids=["cpu", "mem", "disk", "os"])
Result: All info gathered in parallel, 4x faster than serial
```
### PyPI Release Workflow (v1.0.52+)
```
User: "Release to PyPI"
AI calls:
execute_workflow(tasks=[
  {
    name: "clean",
    session_id: "build",
    command: "rm -rf dist build *.egg-info"
  },
  {
    name: "build",
    session_id: "build",
    command: "python -m build",
    depends_on: ["clean"]
  },
  {
    name: "upload",
    session_id: "upload",
    command: "twine upload dist/*",
    depends_on: ["build"],
    retry: true,
    max_retries: 3
  }
])
Result:
- Clean executes first
- Build waits for clean to complete
- Upload waits for build, retries on failure
- Tasks run in parallel when possible
```
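The dependency scheduling described above can be sketched with Python's standard `graphlib`: at each step, every task whose dependencies have finished is ready and could run concurrently on different terminals. This illustrates the DAG idea only, not the package's actual workflow engine:

```python
from graphlib import TopologicalSorter

# Dependency graph mirroring the workflow above:
# each task maps to the set of tasks it depends on.
graph = {
    "clean": set(),
    "build": {"clean"},
    "upload": {"build"},
}

ts = TopologicalSorter(graph)
ts.prepare()

order = []
while ts.is_active():
    # Every task in `ready` has no unfinished dependencies,
    # so this whole batch could execute in parallel.
    ready = list(ts.get_ready())
    order.append(ready)
    ts.done(*ready)

print(order)  # [['clean'], ['build'], ['upload']]
```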
### Smart Retry for Network Operations
```
User: "Download and install dependencies"
AI calls:
execute_with_retry(
  session_id: "npm_install",
  command: "npm install",
  max_retries: 3,
  retry_delay: 2.0
)
Result:
- Attempt 1 fails (network error)
- Wait 2 seconds
- Attempt 2 fails
- Wait 2 seconds
- Attempt 3 succeeds ✓
```
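The retry behaviour fits in a few lines. A sketch (illustrative only; the real tool runs the command inside a terminal session rather than calling a Python function):

```python
import time

def execute_with_retry(run, max_retries=3, retry_delay=2.0):
    # Retry `run` until it succeeds or retries are exhausted,
    # sleeping `retry_delay` seconds between attempts.
    for attempt in range(1, max_retries + 1):
        try:
            return run()
        except Exception:
            if attempt == max_retries:
                raise
            time.sleep(retry_delay)

calls = []
def flaky():
    # Fails twice (simulated network errors), succeeds on the third try.
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("network error")
    return "ok"

result = execute_with_retry(flaky, max_retries=3, retry_delay=0.01)
print(result, len(calls))
```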
---
## ⚙️ Configuration
Optional environment variables:
```json
{
  "mcpServers": {
    "ai-mcp-terminal": {
      "command": "uvx",
      "args": ["ai-mcp-terminal"],
      "env": {
        "AI_MCP_PREFERRED_SHELL": "bash"
      }
    }
  }
}
```
**Shell Priority**:
- Windows: `WSL bash` (🐧) → `Git Bash` (🐚) → `powershell` → `cmd`
- macOS: `zsh` → `bash` → `sh`
- Linux: `bash` → `zsh` → `sh`
**v1.0.52**: WSL now displays with penguin icon (🐧) in web interface, Git Bash with shell icon (🐚)
---
## 🔧 Installation Options
### Option 1: UVX (Recommended)
```json
{
  "command": "uvx",
  "args": ["ai-mcp-terminal"]
}
```
**No installation needed!** UV handles everything.
### Option 2: PIPX
```bash
pipx install ai-mcp-terminal
```
```json
{
  "command": "ai-mcp-terminal"
}
```
### Option 3: PIP
```bash
pip install ai-mcp-terminal
```
```json
{
  "command": "python",
  "args": ["-m", "src.main"]
}
```
---
## 🛡️ Anti-Loop Protection
**Problem**: AI gets stuck querying terminal repeatedly
**Solution**: Built-in query counter
- Query 1-2: Normal
- Query 3-4: ⚠️ Warning + stop instruction
- Query ≥5: 🔪 Auto-terminate process
**Result**: AI never loops, always proceeds with tasks
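A minimal sketch of such a per-session query counter (illustrative, not the package's internal implementation; thresholds match the tiers above):

```python
class QueryGuard:
    """Count repeated output queries per session and escalate."""

    def __init__(self, warn_at=3, kill_at=5):
        self.counts = {}
        self.warn_at = warn_at
        self.kill_at = kill_at

    def record(self, session_id: str) -> str:
        n = self.counts.get(session_id, 0) + 1
        self.counts[session_id] = n
        if n >= self.kill_at:
            return "kill"   # terminate the process, return an error to the AI
        if n >= self.warn_at:
            return "warn"   # attach a stop instruction to the response
        return "ok"

guard = QueryGuard()
statuses = [guard.record("deploy") for _ in range(5)]
print(statuses)  # ['ok', 'ok', 'warn', 'warn', 'kill']
```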
---
## 🚦 How AI Should Use This
### ✅ Correct Pattern
```
Dialog 1:
User: "Deploy React app"
AI:
1. create_batch(...)
2. Reply: "Deploying in background..."
3. END conversation
Dialog 2 (later):
User: "Is it done?"
AI:
1. check_completion(...)
2. Reply: "Still running..." or "Done!"
3. END conversation
```
### ❌ Wrong Pattern (Fixed by protection)
```
Dialog 1:
User: "Deploy React app"
AI:
1. execute_command(...)
2. get_output(...) → running
3. get_output(...) → running [Query 2]
4. get_output(...) → running [Query 3 - WARNING]
5. get_output(...) → running [Query 4]
6. get_output(...) → AUTO-KILLED [Query 5]
7. Error: "Loop detected, process terminated"
```
---
## 📁 Project Structure
```
ai-mcp-terminal/
├── src/
│   ├── main.py              # Entry point
│   ├── mcp_server.py        # MCP protocol handler (30+ tools)
│   ├── terminal_manager.py  # Terminal management (3400+ lines)
│   ├── web_server.py        # FastAPI + WebSocket
│   ├── key_mapper.py        # Keyboard interaction support
│   └── static/              # Web UI (xterm.js)
├── docs/                    # Documentation (15+ guides)
├── examples/                # Usage examples
├── CHANGELOG.md             # Detailed version history
├── README.md
├── LICENSE
└── pyproject.toml
```
---
## 🔧 Troubleshooting
### Web Interface Not Opening
**Solution**: Visit `http://localhost:8000` manually
### Port Already in Use
**Solution**:
1. The server automatically falls back to the next available port
2. Or click the shutdown button in the existing web interface
### AI Keeps Using Single Tools
**Solution**:
1. Restart IDE (MCP caches tool definitions)
2. Verify that the updated tool descriptions loaded correctly
---
## 📄 License
MIT License - see [LICENSE](LICENSE)
---
## 🤝 Contributing
Contributions welcome! See [CONTRIBUTING.md](CONTRIBUTING.md)
---
## 🔗 Links
- **PyPI**: https://pypi.org/project/ai-mcp-terminal/
- **GitHub**: https://github.com/kanniganfan/ai-mcp-terminal
- **Issues**: https://github.com/kanniganfan/ai-mcp-terminal/issues
- **Changelog**: [CHANGELOG.md](CHANGELOG.md)
---
## 🆕 What's New in v1.0.53
### 🎯 Production-Ready Improvements
Based on real PyPI release testing, v1.0.53 brings **battle-tested improvements** that solve actual production issues:
#### 🔍 Enhanced Debugging
- **Detailed Statistics**: Every command returns `output_bytes`, `output_lines`, `execution_time`, `encoding_used`
- **Clear Status**: Explicit `success: true/false` instead of ambiguous `exit_code: null`
- **No More Guessing**: Know exactly what happened with every command
#### 🛡️ Smart Error Prevention
- **Shell Type Detection**: Warns when PowerShell command sent to Bash terminal (and vice versa)
- **Quick Fix Suggestions**: Provides exact commands to fix common errors
- **7 Error Categories**: PyPI duplicates, encoding errors, permissions, network, syntax, etc.
#### 🌐 Zero-Config UTF-8 (Windows)
- **Auto Setup**: Sets `PYTHONIOENCODING=utf-8` and `PYTHONUTF8=1` automatically
- **No More Encoding Errors**: twine, pip, and other Python tools just work
- **80% Fewer Errors**: Eliminates common `UnicodeEncodeError` issues
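The auto-setup amounts to injecting the two variables into the child environment before spawning the process. A sketch of the idea:

```python
import os
import subprocess
import sys

# Inject UTF-8 settings into the child environment, without
# overriding values the user has already set.
env = os.environ.copy()
env.setdefault("PYTHONIOENCODING", "utf-8")
env.setdefault("PYTHONUTF8", "1")

# The spawned Python process now reports UTF-8 stdout, even on
# Windows consoles that would otherwise default to a legacy code page.
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.stdout.encoding)"],
    env=env, capture_output=True, text=True,
).stdout.strip()
print(out)
```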
#### 🔄 Intelligent Batch Execution
- **Smart Queueing**: Same terminal → sequential, different terminals → concurrent
- **Zero Race Conditions**: No more "upload before build finishes" issues
- **Maximum Efficiency**: Still fully concurrent across different terminals
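The queueing rule can be sketched as one `asyncio.Lock` per terminal: commands targeting the same session serialize on its lock, while other sessions proceed concurrently (illustrative only, not the package's implementation):

```python
import asyncio

# One lock per terminal session, created on first use.
locks: dict[str, asyncio.Lock] = {}
log: list[str] = []

async def execute(session_id: str, command: str) -> None:
    lock = locks.setdefault(session_id, asyncio.Lock())
    async with lock:            # same terminal -> strictly sequential
        log.append(f"{session_id}:{command}")
        await asyncio.sleep(0)  # yield; other terminals keep running

async def main() -> None:
    await asyncio.gather(
        execute("build", "python -m build"),
        execute("build", "twine upload dist/*"),  # queued behind the build
        execute("test", "pytest"),                # different terminal, concurrent
    )

asyncio.run(main())
print(log)
```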
### Previous Features (v1.0.52)
- ✨ **execute_workflow()** - DAG-based task orchestration
- ⏳ **wait_until_complete()** - Smart blocking wait
- 📝 **execute_sequence()** - Sequential execution with error handling
- 🔄 **execute_with_retry()** - Automatic retry mechanism
See [CHANGELOG.md](CHANGELOG.md) for complete details.
---
**Made with ❤️ for AI Assistants**
If this helps you, please give it a ⭐ star!