taskbeacon-mcp

Name: taskbeacon-mcp
Version: 0.1.1
Summary: A model context protocol (MCP) for TaskBeacon
Homepage: https://github.com/TaskBeacon/taskbeacon-mcp
Requires Python: >=3.10
Keywords: taskbeacon, mcp, uv
Upload time: 2025-07-31 03:07:46
# taskbeacon-mcp

A model context protocol (MCP) for taskbeacon.

---

## Overview

`taskbeacon-mcp` is a lightweight **FastMCP** server that lets a language model clone, transform, download, and localize taskbeacon task templates through a single entry-point tool.

This README provides instructions for setting up and using `taskbeacon-mcp` in different environments.

---

## 1 · Quick Start (Recommended)

The easiest way to use `taskbeacon-mcp` is with `uvx`. This tool automatically downloads the package from PyPI, installs it and its dependencies into a temporary virtual environment, and runs it in a single step. No manual cloning or setup is required.

### 1.1 · Prerequisites

Ensure you have `uv` installed; the `uvx` command ships with it. If not, you can install it with `pip`:

```bash
pip install uv
```

### 1.2 · LLM Tool Configuration (JSON)

To integrate `taskbeacon-mcp` with your LLM tool (like Gemini CLI or Cursor), use the following JSON configuration. This tells the tool how to run the server using `uvx`.

```json
{
  "name": "taskbeacon-mcp",
  "type": "stdio",
  "description": "Local FastMCP server for taskbeacon task operations. Uses uvx for automatic setup.",
  "isActive": true,
  "command": "uvx",
  "args": [
    "taskbeacon-mcp"
  ]
}
```

With this setup, the LLM can now use the `taskbeacon-mcp` tools.
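If you prefer to generate this configuration programmatically (for example, when scripting the setup of several MCP servers at once), a minimal sketch follows. The output path `mcp_servers.json` is an assumption: each LLM tool keeps its MCP server list in its own settings file, so adjust the location and surrounding structure to match your tool.

```python
import json
from pathlib import Path

# Hypothetical output location -- each LLM tool stores its MCP server
# configuration differently, so adapt this path to your tool.
CONFIG_PATH = Path("mcp_servers.json")

def make_stdio_entry(name: str, command: str, args: list[str],
                     description: str = "") -> dict:
    """Build a stdio MCP server entry matching the JSON shape shown above."""
    return {
        "name": name,
        "type": "stdio",
        "description": description,
        "isActive": True,
        "command": command,
        "args": args,
    }

entry = make_stdio_entry(
    "taskbeacon-mcp",
    "uvx",
    ["taskbeacon-mcp"],
    description="Local FastMCP server for taskbeacon task operations.",
)
CONFIG_PATH.write_text(json.dumps([entry], indent=2))
```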

---

## 2 · Manual Setup (For Developers)

This method is for developers who want to modify or contribute to the `taskbeacon-mcp` source code.

### 2.1 · Environment Setup

1.  **Create a virtual environment and install dependencies:**
    This project uses `uv`. Make sure you are in the project root directory.
    ```bash
    # Create and activate the virtual environment
    python -m venv .venv
    source .venv/bin/activate  # On Windows, use: .venv\Scripts\activate

    # Install dependencies in editable mode
    pip install -e .
    ```

### 2.2 · Running Locally (StdIO)

This is the standard mode for local development, where the server communicates over `STDIN/STDOUT`.

1.  **Launch the server:**
    ```bash
    python taskbeacon-mcp/main.py
    ```

2.  **LLM Tool Configuration (JSON):**
    To use your local development server with an LLM tool, use the following configuration. Note that you should replace the example path in `args` with the absolute path to the `main.py` file on your machine.

    ```json
    {
      "name": "taskbeacon-mcp_dev",
      "type": "stdio",
      "description": "Local development server for taskbeacon task operations.",
      "isActive": true,
      "command": "python",
      "args": [
        "path\\to\\taskbeacon-mcp\\main.py"
      ]
    }
    ```

### 2.3 · Running as a Persistent Server (SSE)

For a persistent, stateful server, you can run `taskbeacon-mcp` using Server-Sent Events (SSE). This is ideal for production or when multiple clients need to interact with the same server instance.

1.  **Modify `main.py`:**
    In `taskbeacon-mcp/main.py`, change the last line from `mcp.run(transport="stdio")` to:
    ```python
    mcp.run(transport="sse", port=8000)
    ```

2.  **Run the server:**
    ```bash
    python taskbeacon-mcp/main.py
    ```
    The server will now be accessible at `http://localhost:8000/mcp`.

3.  **LLM Tool Configuration (JSON):**
    To connect an LLM tool to the running SSE server, use a configuration like this:
    ```json
    {
      "name": "taskbeacon-mcp_sse",
      "type": "http",
      "description": "Persistent SSE server for taskbeacon task operations.",
      "isActive": true,
      "endpoint": "http://localhost:8000/mcp"
    }
    ```

---

## 3 · Conceptual Workflow

1.  **User** describes the task they want (e.g. “Make a Stroop out of Flanker”).
2.  **LLM** calls the `build_task` tool:
    *   If the model already knows the best starting template it passes `source_task`.
    *   Otherwise it omits `source_task`, receives a menu created by `choose_template_prompt`, picks a repo, then calls `build_task` again with that repo.
3.  The server clones the chosen template, returns a Stage 0→5 instruction prompt (`transform_prompt`) plus the local template path.
4.  The LLM edits files locally, optionally invokes `localize` to translate and adapt `config.yaml`, then zips / commits the new task.

---

## 4 · Exposed Tools

| Tool | Arguments | Purpose / Return |
| :--- | :--- | :--- |
| `build_task` | `target_task:str`, `source_task?:str` | **Main entry-point.** • With `source_task` → clones repo and returns: `prompt` (Stage 0→5) **+** `template_path` (local clone). • Without `source_task` → returns `prompt_messages` from `choose_template_prompt` so the LLM can pick the best starting template, then call `build_task` again. |
| `list_tasks` | *none* | Returns an array of objects: `{ repo, readme_snippet, branches }`, where `branches` lists up to 20 branch names for that repo. |
| `download_task` | `repo:str` | Clones any template repo from the registry and returns its local path. |
| `localize` | `task_path:str`, `target_language:str`, `voice?:str` | Reads `config.yaml`, wraps it in `localize_prompt`, and returns `prompt_messages`. If a `voice` is not provided, it first calls `list_voices` to find suitable options. Also deletes old `_voice.mp3` files. |
| `list_voices` | `filter_lang?:str` | Returns a human-readable string of available text-to-speech voices from `taskbeacon`, optionally filtered by language (e.g., "ja", "en"). |
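As a rough illustration of consuming the `list_tasks` return shape described in the table, the snippet below filters repos by branch name. The sample data is made up; only the `{ repo, readme_snippet, branches }` structure comes from the table above.

```python
# Hypothetical sample matching the list_tasks return shape.
tasks = [
    {"repo": "flanker", "readme_snippet": "Eriksen flanker task...", "branches": ["main", "zh"]},
    {"repo": "stroop", "readme_snippet": "Color-word Stroop...", "branches": ["main"]},
]

def find_repos_with_branch(tasks: list[dict], branch: str) -> list[str]:
    """Return the repos that already carry a branch of the given name,
    e.g. an existing localization branch."""
    return [t["repo"] for t in tasks if branch in t["branches"]]

find_repos_with_branch(tasks, "zh")  # only templates with a "zh" branch
```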

---

## 5 · Exposed Prompts

| Prompt | Parameters | Description |
| :--- | :--- | :--- |
| `transform_prompt` | `source_task`, `target_task` | Single **User** message containing the full Stage 0→5 instructions to convert `source_task` into `target_task`. |
| `choose_template_prompt` | `desc`, `candidates:list[{repo,readme_snippet}]` | Three **User** messages: task description, template list, and selection criteria. The LLM must reply with **one repo name** or the literal word `NONE`. |
| `localize_prompt` | `yaml_text`, `target_language`, `voice_options?` | Two-message sequence: strict translation instruction + raw YAML. The LLM must return the fully-translated YAML body, adding the `voice: <short_name>` if suitable options were provided. |

---

            
