| Field | Value |
| :--- | :--- |
| Name | psyflow-mcp |
| Version | 0.1.12 |
| Summary | A minimal complete project (MCP) for psyflow |
| Home page | None |
| Author | None |
| Maintainer | None |
| License | None |
| Requires Python | >=3.10 |
| Keywords | psyflow, mcp, uv |
| Upload time | 2025-07-28 08:41:54 |
| Docs URL | None |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | No coveralls. |

# psyflow-mcp
A minimal complete project (MCP) for psyflow.
## Installation
To install this project using `uv`:
```bash
uv pip install psyflow-mcp
```
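To confirm the install resolved correctly, you can query the installed version from Python. This is a minimal check that uses only the standard library:
```python
# Sanity check: confirm psyflow-mcp is installed and report its version.
from importlib.metadata import version

print(version("psyflow-mcp"))  # e.g. "0.1.12"
```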
## Usage with uvx
`uvx` is the `uv tool run` shortcut that ships with `uv`; it runs a command from a PyPI package in an isolated, cached environment without requiring you to activate anything. If you do not already have `uv`, install it first:
```bash
pip install uv
```
Then run the published package directly (assuming `psyflow-mcp` exposes a console-script entry point, e.g. a `main` function in `main.py`):
```bash
uvx psyflow-mcp
```
`uvx` resolves `psyflow-mcp` from PyPI, builds a temporary environment for it, and executes its entry point, which starts the MCP server described below.
psyflow-mcp is a lightweight **FastMCP** server that lets a language model clone, transform, download, and localize PsyFlow task templates through a single entry-point tool.
---
## 1 · Setup & Run
This project uses `uv` for fast and reliable dependency management.
### 1.1 · Local Setup (StdIO)
This is the standard mode for local development and testing, where the server communicates over `STDIN/STDOUT`.
```bash
# 1. Clone the repository
git clone https://github.com/TaskBeacon/psyflow-mcp.git
cd psyflow-mcp
# 2. Create a virtual environment and install dependencies
uv venv
uv pip install "mcp[cli]>=1.12.2" psyflow gitpython httpx ruamel.yaml
# 3. Launch the std-IO server
uv run python main.py
```
The process stays in the foreground and communicates with the LLM over the Model Context Protocol (MCP).
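For orientation, the sketch below shows the general shape of a FastMCP stdio server like `main.py`. The tool name and body are illustrative placeholders, not the actual contents of `main.py`:
```python
# Minimal FastMCP stdio server sketch (illustrative; not the real main.py).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("psyflow-mcp")

@mcp.tool()
def ping(message: str) -> str:
    """Echo a message back, just to prove the tool plumbing works."""
    return f"pong: {message}"

if __name__ == "__main__":
    # Communicate over STDIN/STDOUT, as described above.
    mcp.run(transport="stdio")
```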
### 1.2 · Server Setup (SSE)
For a persistent, stateful server, you can use Server-Sent Events (SSE). This is ideal for production or when multiple clients need to interact with the same server instance.
1. **Modify `main.py`:**
Change the last line from `mcp.run(transport="stdio")` to:
```python
mcp.run(transport="sse", port=8000)
```
2. **Run the server:**
```bash
uv run python main.py
```
The server will now be accessible at `http://localhost:8000/mcp`.
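   Once the SSE server is up, a client can connect to it programmatically. The sketch below uses the MCP Python SDK's SSE client; the endpoint URL follows the note above and may differ depending on how `main.py` configures the SSE route:
   ```python
   # Minimal SSE client sketch using the MCP Python SDK (endpoint path is an assumption).
   import asyncio

   from mcp import ClientSession
   from mcp.client.sse import sse_client

   async def main() -> None:
       async with sse_client("http://localhost:8000/mcp") as (read, write):
           async with ClientSession(read, write) as session:
               await session.initialize()
               tools = await session.list_tools()
               print([tool.name for tool in tools.tools])

   asyncio.run(main())
   ```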
---
## 2 · MCP/LLM Setup
To connect this server to a Large Language Model (LLM) via a command-line interface (CLI) like Gemini CLI or a tool-integrated environment like Cursor, you'll need to provide a JSON configuration. Below are templates for both `StdIO` and `SSE` modes.
### 2.1 · StdIO Mode (Local Tool)
This configuration tells the CLI how to launch and communicate with the MCP server directly. Create a `psyflow-mcp.json` file with the following content, making sure to replace `/path/to/your/project/psyflow-mcp` with the actual absolute path to the cloned repository on your machine.
```json
{
  "tool": {
    "name": "psyflow_mcp_stdio",
    "description": "A lightweight server to clone, transform, and download PsyFlow task templates.",
    "command": ["uv", "run", "python", "main.py"],
    "working_directory": "/path/to/your/project/psyflow-mcp"
  }
}
```
Some MCP clients (such as the tool-integrated environments mentioned above) instead expect entries inside an `mcpServers`-style object. Two example fragments are shown below: one launches a local clone with `python`, the other runs the published package via `uvx`. The `//` comments are explanatory only and must be removed in strict JSON:
```json
"psyflow-mcp": {
  "name": "PsyFlow-MCP",
  "type": "stdio",          // communicate over STDIN / STDOUT
  "description": "Local FastMCP server for PsyFlow task operations",
  "isActive": true,         // set false to disable without deleting
  "registryUrl": "",        // leave blank – we're running locally
  "command": "python",      // executable to launch
  "args": [
    "E:\\xhmhc\\TaskBeacon\\psyflow-mcp\\main.py"
  ]
},
"psyflow-mcp-pypi": {
  "name": "PsyFlow-MCP",
  "type": "stdio",          // communicate over STDIN / STDOUT
  "description": "FastMCP server for PsyFlow task operations",
  "isActive": true,         // set false to disable without deleting
  "registryUrl": "",        // leave blank – we're running locally
  "command": "uvx",         // executable to launch
  "args": [
    "psyflow-mcp"
  ]
}
```
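Either entry can also be exercised outside of a CLI by launching the server as a subprocess with the MCP Python SDK's stdio client. The sketch below mirrors the local-clone entry; the path is a placeholder:
```python
# Launch the stdio server as a subprocess and list its tools (path is a placeholder).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(
    command="python",
    args=["/path/to/your/project/psyflow-mcp/main.py"],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```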
### 2.2 · SSE Mode (Remote Tool)
When the server is running persistently (as described in section 1.2), you can connect to it as a remote tool using its HTTP endpoint.
```json
{
  "tool": {
    "name": "psyflow_mcp_sse",
    "description": "A lightweight server to clone, transform, and download PsyFlow task templates.",
    "endpoint": "http://localhost:8000/mcp"
  }
}
```
---
## 3 · Conceptual Workflow
1. **User** describes the task they want (e.g. “Make a Stroop out of Flanker”).
2. **LLM** calls the `build_task` tool:
* If the model already knows the best starting template it passes `source_task`.
* Otherwise it omits `source_task`, receives a menu created by `choose_template_prompt`, picks a repo, then calls `build_task` again with that repo.
3. The server clones the chosen template, returns a Stage 0→5 instruction prompt (`transform_prompt`) plus the local template path.
4. **LLM** edits files locally, optionally invokes `localize` to translate and adapt `config.yaml`, then zips or commits the new task (a client-side sketch of this loop follows below).
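
Concretely, the loop above maps onto a handful of MCP calls from the client side. A hedged sketch, assuming an already-initialized `ClientSession` named `session` (as in the SSE client example in section 1.2); argument names follow the workflow above and the tool table in section 4:
```python
# Hedged sketch of the build_task round trip (repo name and flow are illustrative).
async def build_stroop_from_flanker(session) -> None:
    # 1. Ask for a template menu (no source_task yet).
    menu = await session.call_tool("build_task", {"target_task": "stroop"})

    # 2. In a real run, the LLM reads the returned choose_template_prompt
    #    messages and picks a repo; here the choice is hard-coded.
    chosen_repo = "flanker"  # placeholder repo name

    # 3. Call build_task again with the chosen source template.
    result = await session.call_tool(
        "build_task",
        {"target_task": "stroop", "source_task": chosen_repo},
    )
    print(result)  # contains the Stage 0→5 prompt plus the local template path
```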
---
## 4 · Exposed Tools
| Tool | Arguments | Purpose / Return |
| :--- | :--- | :--- |
| `build_task` | `target_task:str`, `source_task?:str` | **Main entry-point.** • With `source_task` → clones repo and returns: `prompt` (Stage 0→5) **+** `template_path` (local clone). • Without `source_task` → returns `prompt_messages` from `choose_template_prompt` so the LLM can pick the best starting template, then call `build_task` again. |
| `list_tasks` | *none* | Returns an array of objects: `{ repo, readme_snippet, branches }`, where `branches` lists up to 20 branch names for that repo. |
| `download_task` | `repo:str` | Clones any template repo from the registry and returns its local path. |
| `localize` | `task_path:str`, `target_language:str`, `voice?:str` | Reads `config.yaml`, wraps it in `localize_prompt`, and returns `prompt_messages`. If a `voice` is not provided, it first calls `list_voices` to find suitable options. Also deletes old `_voice.mp3` files. |
| `list_voices` | `filter_lang?:str` | Returns a human-readable string of available text-to-speech voices from `psyflow`, optionally filtered by language (e.g., "ja", "en"). |
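
For readers unfamiliar with FastMCP, tools like those in the table above are typically registered with the `@mcp.tool()` decorator. The snippet below is an illustrative signature only, not the actual implementation in `main.py`:
```python
# Illustrative registration of one tool from the table (not the real implementation).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("psyflow-mcp")

@mcp.tool()
def download_task(repo: str) -> str:
    """Clone a template repo from the registry and return its local path."""
    local_path = f"/tmp/{repo}"  # placeholder: real code would git-clone here
    return local_path
```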
---
## 5 · Exposed Prompts
| Prompt | Parameters | Description |
| :--- | :--- | :--- |
| `transform_prompt` | `source_task`, `target_task` | Single **User** message containing the full Stage 0→5 instructions to convert `source_task` into `target_task`. |
| `choose_template_prompt` | `desc`, `candidates:list[{repo,readme_snippet}]` | Three **User** messages: task description, template list, and selection criteria. The LLM must reply with **one repo name** or the literal word `NONE`. |
| `localize_prompt` | `yaml_text`, `target_language`, `voice_options?` | Two-message sequence: strict translation instruction + raw YAML. The LLM must return the fully-translated YAML body, adding the `voice: <short_name>` if suitable options were provided. |
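
Prompts are exposed in a similar way via `@mcp.prompt()`. The sketch below shows the general shape; the wording is a placeholder, not the project's actual prompt text:
```python
# Illustrative prompt registration (placeholder wording, not the project's real prompt).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("psyflow-mcp")

@mcp.prompt()
def transform_prompt(source_task: str, target_task: str) -> str:
    """Return a single user message asking to convert one task into another."""
    return (
        f"Convert the PsyFlow task '{source_task}' into '{target_task}', "
        "following the Stage 0→5 instructions."
    )
```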
---
## 6 · Advanced Setup
### 6.1 · Using a Custom PyPI Repository
If you need to install dependencies from a private or alternative package index, you can configure `uv` using an environment variable or a command-line flag.
**Using an environment variable:**
```bash
# The URL below is a placeholder for your private index; it must serve the PEP 503 "simple" API.
export UV_INDEX_URL="https://my-private-index.example.com/simple/"
uv pip install ...
```
**Using a command-line flag:**
```bash
uv pip install --index-url "https://my-private-index.example.com/simple/" ...
```
### 6.2 · Template Folder Layout
The Stage 0→5 transformation prompt assumes the following repository structure.
```
<repo>/
├─ config/
│  └─ config.yaml
├─ main.py
├─ src/
│  └─ run_trial.py
└─ README.md
```
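Given this layout, the localization step reads `config/config.yaml`. A minimal sketch of loading it with `ruamel.yaml` (already in the dependency list above), assuming the file is a top-level mapping:
```python
# Load the task configuration in round-trip mode, preserving comments and key order.
from pathlib import Path
from ruamel.yaml import YAML

yaml = YAML()  # round-trip mode is the default
config_path = Path("config") / "config.yaml"
with config_path.open("r", encoding="utf-8") as fh:
    config = yaml.load(fh)

print(sorted(config.keys()))  # inspect top-level sections before localizing
```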
---
Adjust `NON_TASK_REPOS`, network timeouts, or `git` clone depth in `main.py` to match your infrastructure.
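The exact names and values live in `main.py`; as a rough, assumed illustration of the kind of knobs this refers to (every value and helper below is hypothetical, not copied from the source):
```python
# Hypothetical configuration knobs of the kind referred to above (not from main.py).
import httpx
from git import Repo

NON_TASK_REPOS = {"psyflow", "psyflow-mcp", ".github"}  # repos to skip when listing tasks
HTTP_TIMEOUT = httpx.Timeout(30.0)                      # network timeout for registry requests
GIT_CLONE_DEPTH = 1                                     # shallow clones keep downloads small

def clone_template(url: str, dest: str) -> Repo:
    """Shallow-clone a template repository to keep transfers fast."""
    return Repo.clone_from(url, dest, depth=GIT_CLONE_DEPTH)
```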
Raw data
{
"_id": null,
"home_page": null,
"name": "psyflow-mcp",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": null,
"keywords": "psyflow, mcp, uv",
"author": null,
"author_email": "Zhipeng Cao <zhipeng30@foxmail.com>",
"download_url": "https://files.pythonhosted.org/packages/44/38/70bef9ec53eaf8c6166c629ad02dd47476648af182ba83b43ed74cb68b4c/psyflow_mcp-0.1.12.tar.gz",
"platform": null,
"description": "# psyflow-mcp\r\n\r\nA minimal complete project (MCP) for psyflow.\r\n\r\n## Installation\r\n\r\nTo install this project using `uv`:\r\n\r\n```bash\r\nuv pip install psyflow-mcp\r\n```\r\n\r\n## Usage with uvx\r\n\r\n`uvx` allows you to run commands within the project's `uv` environment without explicitly activating it. First, ensure you have `uvx` installed:\r\n\r\n```bash\r\npip install uvx\r\n```\r\n\r\nThen, you can run the `psyflow-mcp` command (assuming `main.py` has a `main` function that is exposed as a script):\r\n\r\n```bash\r\nuvx psyflow-mcp\r\n```\r\n\r\n### Example `uvx` Configuration (uvx.json)\r\n\r\nYou can configure `uvx` to automatically use this project's environment. Create a `uvx.json` file in your project root or a parent directory with the following content:\r\n\r\n```json\r\n{\r\n \"project_name\": \"psyflow-mcp\",\r\n \"entry_point\": \"main:main\",\r\n \"commands\": {\r\n \"run\": \"psyflow-mcp\"\r\n }\r\n}\r\n```\r\n\r\nWith this `uvx.json` in place, you can simply run:\r\n\r\n```bash\r\nuvx run\r\n```\r\n\r\nThis will execute the `main` function from `main.py` within the `psyflow-mcp` environment.\r\n\r\n\r\nA lightweight **FastMCP** server that lets a language-model clone, transform, download and localize PsyFlow task templates using a single entry-point tool.\r\n\r\n---\r\n\r\n## 1 \u00b7 Setup & Run\r\n\r\nThis project uses `uv` for fast and reliable dependency management.\r\n\r\n### 1.1 \u00b7 Local Setup (StdIO)\r\n\r\nThis is the standard mode for local development and testing, where the server communicates over `STDIN/STDOUT`.\r\n\r\n```bash\r\n# 1. Clone the repository\r\ngit clone https://github.com/TaskBeacon/psyflow-mcp.git\r\ncd psyflow-mcp\r\n\r\n# 2. Create a virtual environment and install dependencies\r\nuv venv\r\nuv pip install \"mcp[cli]>=1.12.2\" psyflow gitpython httpx ruamel.yaml\r\n\r\n# 3. Launch the std-IO server\r\nuv run python main.py\r\n```\r\n\r\nThe process stays in the foreground and communicates with the LLM via the Model-Context-Protocol (MCP).\r\n\r\n### 1.2 \u00b7 Server Setup (SSE)\r\n\r\nFor a persistent, stateful server, you can use Server-Sent Events (SSE). This is ideal for production or when multiple clients need to interact with the same server instance.\r\n\r\n1. **Modify `main.py`:**\r\n Change the last line from `mcp.run(transport=\"stdio\")` to:\r\n ```python\r\n mcp.run(transport=\"sse\", port=8000)\r\n ```\r\n\r\n2. **Run the server:**\r\n ```bash\r\n uv run python main.py\r\n ```\r\n The server will now be accessible at `http://localhost:8000/mcp`.\r\n\r\n---\r\n\r\n## 2 \u00b7 MCP/LLM Setup\r\n\r\nTo connect this server to a Large Language Model (LLM) via a command-line interface (CLI) like Gemini CLI or a tool-integrated environment like Cursor, you'll need to provide a JSON configuration. Below are templates for both `StdIO` and `SSE` modes.\r\n\r\n### 2.1 \u00b7 StdIO Mode (Local Tool)\r\n\r\nThis configuration tells the CLI how to launch and communicate with the MCP server directly. 
Create a `psyflow-mcp.json` file with the following content, making sure to replace `/path/to/your/project/psyflow-mcp` with the actual absolute path to the cloned repository on your machine.\r\n\r\n```json\r\n{\r\n \"tool\": {\r\n \"name\": \"psyflow_mcp_stdio\",\r\n \"description\": \"A lightweight server to clone, transform, and download PsyFlow task templates.\",\r\n \"command\": [\"uv\", \"run\", \"python\", \"main.py\"],\r\n \"working_directory\": \"/path/to/your/project/psyflow-mcp\"\r\n }\r\n}\r\n```\r\n\r\n```json\r\n \"psyflow-mcp\": {\r\n \"name\": \"PsyFlow-MCP\",\r\n \"type\": \"stdio\", // communicate over STDIN / STDOUT\r\n \"description\": \"Local FastMCP server for PsyFlow task operations\",\r\n \"isActive\": true, // set false to disable without deleting\r\n \"registryUrl\": \"\", // leave blank \u2013 we\u02bcre running locally\r\n \"command\": \"python\", // executable to launch\r\n \"args\": [\r\n \"E:\\\\xhmhc\\\\TaskBeacon\\\\psyflow-mcp\\\\main.py\"\r\n ]\r\n }\r\n\r\n\r\n \"psyflow-mcp-pypi\": {\r\n \"name\": \"PsyFlow-MCP\",\r\n \"type\": \"stdio\", // communicate over STDIN / STDOUT\r\n \"description\": \"FastMCP server for PsyFlow task operations\",\r\n \"isActive\": true, // set false to disable without deleting\r\n \"registryUrl\": \"\", // leave blank \u2013 we\u02bcre running locally\r\n \"command\": \"uvx\", // executable to launch\r\n \"args\": [\r\n \"psyflow-mcp\"\r\n ]\r\n }\r\n```\r\n\r\n### 2.2 \u00b7 SSE Mode (Remote Tool)\r\n\r\nWhen the server is running persistently (as described in section 1.2), you can connect to it as a remote tool using its HTTP endpoint.\r\n\r\n```json\r\n{\r\n \"tool\": {\r\n \"name\": \"psyflow_mcp_sse\",\r\n \"description\": \"A lightweight server to clone, transform, and download PsyFlow task templates.\",\r\n \"endpoint\": \"http://localhost:8000/mcp\"\r\n }\r\n}\r\n```\r\n\r\n---\r\n\r\n## 3 \u00b7 Conceptual Workflow\r\n\r\n1. **User** describes the task they want (e.g. \u201cMake a Stroop out of Flanker\u201d).\r\n2. **LLM** calls the `build_task` tool:\r\n * If the model already knows the best starting template it passes `source_task`.\r\n * Otherwise it omits `source_task`, receives a menu created by `choose_template_prompt`, picks a repo, then calls `build_task` again with that repo.\r\n3. The server clones the chosen template, returns a Stage 0\u21925 instruction prompt (`transform_prompt`) plus the local template path.\r\n4. The LLM edits files locally, optionally invokes `localize` to translate and adapt `config.yaml`, then zips / commits the new task.\r\n\r\n---\r\n\r\n## 4 \u00b7 Exposed Tools\r\n\r\n| Tool | Arguments | Purpose / Return |\r\n| :--- | :--- | :--- |\r\n| `build_task` | `target_task:str`, `source_task?:str` | **Main entry-point.** \u2022 With `source_task` \u2192 clones repo and returns: `prompt` (Stage 0\u21925) **+** `template_path` (local clone). \u2022 Without `source_task` \u2192 returns `prompt_messages` from `choose_template_prompt` so the LLM can pick the best starting template, then call `build_task` again. |\r\n| `list_tasks` | *none* | Returns an array of objects: `{ repo, readme_snippet, branches }`, where `branches` lists up to 20 branch names for that repo. |\r\n| `download_task` | `repo:str` | Clones any template repo from the registry and returns its local path. |\r\n| `localize` | `task_path:str`, `target_language:str`, `voice?:str` | Reads `config.yaml`, wraps it in `localize_prompt`, and returns `prompt_messages`. 
If a `voice` is not provided, it first calls `list_voices` to find suitable options. Also deletes old `_voice.mp3` files. |\r\n| `list_voices` | `filter_lang?:str` | Returns a human-readable string of available text-to-speech voices from `psyflow`, optionally filtered by language (e.g., \"ja\", \"en\"). |\r\n\r\n---\r\n\r\n## 5 \u00b7 Exposed Prompts\r\n\r\n| Prompt | Parameters | Description |\r\n| :--- | :--- | :--- |\r\n| `transform_prompt` | `source_task`, `target_task` | Single **User** message containing the full Stage 0\u21925 instructions to convert `source_task` into `target_task`. |\r\n| `choose_template_prompt` | `desc`, `candidates:list[{repo,readme_snippet}]` | Three **User** messages: task description, template list, and selection criteria. The LLM must reply with **one repo name** or the literal word `NONE`. |\r\n| `localize_prompt` | `yaml_text`, `target_language`, `voice_options?` | Two-message sequence: strict translation instruction + raw YAML. The LLM must return the fully-translated YAML body, adding the `voice: <short_name>` if suitable options were provided. |\r\n\r\n---\r\n\r\n## 6 \u00b7 Advanced Setup\r\n\r\n### 6.1 \u00b7 Using a Custom PyPI Repository\r\n\r\nIf you need to install dependencies from a private or alternative package index, you can configure `uv` using an environment variable or a command-line flag.\r\n\r\n**Using an environment variable:**\r\n```bash\r\nexport UV_INDEX_URL=\"https://pypi.org/manage/project/psyflow-mcp/\"\r\nuv pip install ...\r\n```\r\n\r\n**Using a command-line flag:**\r\n```bash\r\nuv pip install --index-url \"https://pypi.org/manage/project/psyflow-mcp/\" ...\r\n```\r\n\r\n### 6.2 \u00b7 Template Folder Layout\r\n\r\nThe Stage 0\u21925 transformation prompt assumes the following repository structure.\r\n\r\n```\r\n<repo>/\r\n\u251c\u2500 config/\r\n\u2502 \u2514\u2500 config.yaml\r\n\u251c\u2500 main.py\r\n\u251c\u2500 src/\r\n\u2502 \u2514\u2500 run_trial.py\r\n\u2514\u2500 README.md\r\n```\r\n\r\n---\r\n\r\nAdjust `NON_TASK_REPOS`, network timeouts, or `git` clone depth in `main.py` to match your infrastructure.\r\n",
"bugtrack_url": null,
"license": null,
"summary": "A minimal complete project (MCP) for psyflow",
"version": "0.1.12",
"project_urls": {
"Bug Tracker": "https://github.com/TaskBeacon/psyflow-mcp/issues",
"Homepage": "https://github.com/TaskBeacon/psyflow-mcp",
"Repository": "https://github.com/TaskBeacon/psyflow-mcp"
},
"split_keywords": [
"psyflow",
" mcp",
" uv"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "71b9fb49f14eee0955d1966d76a9bbdd926263d6b8ede404b815c1f0ac858b99",
"md5": "0d703a37911701efec7649eae9996edd",
"sha256": "2844575181b1f204c051bf0337df5187f2bf0d53e65034cf06c64fbdd13d32ee"
},
"downloads": -1,
"filename": "psyflow_mcp-0.1.12-py3-none-any.whl",
"has_sig": false,
"md5_digest": "0d703a37911701efec7649eae9996edd",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10",
"size": 11355,
"upload_time": "2025-07-28T08:41:53",
"upload_time_iso_8601": "2025-07-28T08:41:53.255009Z",
"url": "https://files.pythonhosted.org/packages/71/b9/fb49f14eee0955d1966d76a9bbdd926263d6b8ede404b815c1f0ac858b99/psyflow_mcp-0.1.12-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "443870bef9ec53eaf8c6166c629ad02dd47476648af182ba83b43ed74cb68b4c",
"md5": "01c343c41d280353c1841374d1326fe9",
"sha256": "470e014a96b0b9382be945faa721c61f9afcbbfcde7990a531c337c138f0f147"
},
"downloads": -1,
"filename": "psyflow_mcp-0.1.12.tar.gz",
"has_sig": false,
"md5_digest": "01c343c41d280353c1841374d1326fe9",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10",
"size": 14048,
"upload_time": "2025-07-28T08:41:54",
"upload_time_iso_8601": "2025-07-28T08:41:54.323380Z",
"url": "https://files.pythonhosted.org/packages/44/38/70bef9ec53eaf8c6166c629ad02dd47476648af182ba83b43ed74cb68b4c/psyflow_mcp-0.1.12.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-28 08:41:54",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "TaskBeacon",
"github_project": "psyflow-mcp",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "psyflow-mcp"
}