rocm-wsl-ai 0.2.10 (PyPI)

Name: rocm-wsl-ai
Version: 0.2.10
Summary: Web UI to install and manage AMD ROCm + local AI tools on WSL2 (with optional CLI/TUI)
Author: daMustermann
Requires-Python: >=3.10
Keywords: rocm, wsl, amd, fastapi, webui, stable-diffusion, llm, comfyui, ollama, textgen
Homepage: https://github.com/daMustermann/rocm-wsl-ai
Uploaded: 2025-08-08 20:09:53

# 🔥 ROCm-WSL-AI Web UI

![License](https://img.shields.io/badge/License-MIT-blue)
![Platform](https://img.shields.io/badge/WSL2-Ubuntu%2024.04-informational)
![GPU](https://img.shields.io/badge/AMD-ROCm%20latest-EE4C2C)
![PyTorch](https://img.shields.io/badge/PyTorch-Nightly-orange)
![Shell](https://img.shields.io/badge/CLI-python%20(%20rocmwsl%20)-4EAA25)
![UI](https://img.shields.io/badge/TUI-textual-6f42c1)
![Status](https://img.shields.io/badge/Status-Active-brightgreen)

Make your AMD GPU sing inside WSL2. Use one Python CLI (rocmwsl, alias rocm-wsl-ai) and an optional Textual TUI to install, launch, update, and remove local AI tools — always ready for the latest PyTorch Nightly.

## What you get
- Always latest ROCm (from AMD’s “latest” apt repo) + PyTorch Nightly matched to your installed ROCm series
- A modern, keyboard-driven TUI (Textual) with clear categories
- One place to install, start, update, and remove local AI tools (image gen + LLMs)
- Optional no-chmod Python CLI: install and run everything with a single command

## Tools included (by category)
Image generation
- ComfyUI
- SD.Next
- Automatic1111 WebUI
- InvokeAI
- Fooocus
- SD WebUI Forge

LLMs
- Ollama (with a small model manager script)
- Text Generation WebUI

## ROCm‑WSL‑AI Web UI

A modern, lightweight web interface to install, run, and manage local AI tools on WSL2 with AMD ROCm. It wraps the existing project features into a single browser-based control panel with live logs, jobs, and per‑tool settings.

### Highlights

- One web dashboard for popular tools (ComfyUI, A1111/SD.Next/Forge, Fooocus, InvokeAI, SillyTavern, TextGen, llama.cpp, KoboldCpp, FastChat, Ollama)
- Start/stop, status, and interface links per tool
- Live logs via SSE or WebSocket with filter and colorized streams
- Job history with progress for installers and long‑running tasks
- Models: location overview, index, refresh, link, and curated preset downloads
- Wizard to set up base folders/venv and defaults for tool flags
- Per‑tool settings persist (URL, extra args), plus smart Host/Port helpers
- Clean, responsive UI (PicoCSS), with theme toggle and small toasts/dialogs

---

## Quick start

1) Install inside WSL2 (recommended)

PyPI (preferred):

```bash
python3 -m venv ~/.venvs/rocmwsl
source ~/.venvs/rocmwsl/bin/activate
python -m pip install --upgrade pip
pip install rocm-wsl-ai
```

PyPI package: https://pypi.org/project/rocm-wsl-ai/

From source (alternative, if you cloned this repo on Windows):

```bash
cd /mnt/f/Coding/rocm-wsl-ai   # adjust path to your repo inside WSL
python3 -m venv ~/.venvs/rocmwsl
source ~/.venvs/rocmwsl/bin/activate
python -m pip install --upgrade pip
pip install -e .
```

From Windows PowerShell into WSL using your local checkout:

```powershell
wsl -e bash -lc "cd /mnt/f/Coding/rocm-wsl-ai && python3 -m venv ~/.venvs/rocmwsl && source ~/.venvs/rocmwsl/bin/activate && python -m pip install --upgrade pip && pip install -e ."
```

2) Run the Web UI inside WSL:

```bash
export ROCMWSL_WEB_TOKEN="set-a-strong-token"   # optional but recommended on LAN
rocmwsl-web                                     # serves on 0.0.0.0:8000
```
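If you need a token, one quick way to generate a strong random one (assuming `openssl` is available in your WSL distro, which it is on stock Ubuntu):

```shell
# Generate a 64-character random hex token and export it for the server
export ROCMWSL_WEB_TOKEN="$(openssl rand -hex 32)"
echo "$ROCMWSL_WEB_TOKEN"   # note it down for the login page
```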

From Windows, open http://localhost:8000 in your browser (WSL2 forwards localhost).

You can also launch it from PowerShell directly into WSL:

```powershell
wsl -e bash -lc "export ROCMWSL_WEB_TOKEN='set-a-strong-token'; rocmwsl-web"
```

Tip: to change the port, call `run(host='0.0.0.0', port=<PORT>)` yourself. Example from within WSL:

```bash
python -c "from rocm_wsl_ai.web.app import run; run(host='0.0.0.0', port=9000)"
```

---

## Using the Web UI

### Dashboard

- Cards show each tool’s status (running/stopped), PID, and actions.
- Click Install to run the installer as a background job.
- Start launches the tool (background when supported). Stop ends it.
- Logs opens a live log stream (switch between SSE/WS). Use the filter box for regex filtering; stderr/stdout are color‑coded.

### Tools page

- Per‑tool settings:
	- Interface URL (used for the “Open interface” link in cards)
	- Host & Port helpers that auto‑compose common flags (e.g., --listen/--port)
	- Extra Args to pass on start (stored and reused)
- The UI keeps Host/Port and URL in sync for convenience.

### Models page

- See where models are located for different categories.
- Build and refresh a searchable models index.
- Link your models into supported tools folders.
- Download preset model bundles (curated). Tasks run as jobs with progress.

### Wizard

- Configure base directory, venv, and optional defaults for tool flags (host/port/flags).
- Saves defaults into a tools.json so starts can reuse them.

### Help

- Quick tips and troubleshooting pointers integrated into the UI.

---

## Updates

To update the package (inside the WSL venv where you installed it):

```bash
pip install -U rocm-wsl-ai
```

Your settings (tools.json), job history (jobs.json), and config live in the project’s config directory and are preserved across updates.
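If you want an extra safety net before updating, you can archive the config directory first. A sketch, assuming the `~/.config/rocm-wsl-ai` path carried over from the CLI section of this README:

```shell
# Back up settings (tools.json, jobs.json, config.toml) before updating
mkdir -p ~/.config/rocm-wsl-ai    # no-op if it already exists
tar -czf ~/rocm-wsl-ai-config-backup.tgz -C ~/.config rocm-wsl-ai
```

Restore later with `tar -xzf ~/rocm-wsl-ai-config-backup.tgz -C ~/.config`.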

---

## Security

- Optional token-based auth: set an environment variable before you start the server.

```powershell
$env:ROCMWSL_WEB_TOKEN = "your-long-random-token"
rocmwsl-web
```

- With a token set, the UI redirects to a small login where you paste the token. APIs also accept the token via cookie, x-auth header, or token query parameter.
- If you expose the server on your LAN (host 0.0.0.0), use a token. For public networks, prefer a proper reverse proxy and TLS.
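For scripted access, the token can be passed in any of the accepted ways. A sketch with curl (the root path `/` is an assumption; substitute the API route you actually call):

```shell
TOKEN="your-long-random-token"

# via the x-auth header
curl -fsS -H "x-auth: $TOKEN" "http://localhost:8000/" || echo "server not reachable"

# via the token query parameter
curl -fsS "http://localhost:8000/?token=$TOKEN" || echo "server not reachable"
```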

---

## FAQs / Troubleshooting

- The tool doesn’t start or shows “stopped” quickly.
	- Open Logs to see errors in real‑time. Check Extra Args on Tools page. Verify the tool repository and dependencies are installed.
- Interface link opens the wrong port.
	- Edit the URL in the tool’s settings. Host/Port helpers can auto‑compose flags; the UI syncs URL and Host/Port.
- SillyTavern install requires Node.
	- The installer attempts to guide via nvm. If Node isn’t present, the job runs nvm install/use LTS and npm install in the SillyTavern folder.
- Where are my settings stored?
	- tools.json and jobs.json are saved next to your main config.toml (see Models page → Where for base folder hints).

---

## Uninstall

```bash
pip uninstall rocm-wsl-ai
```

---

## License

MIT

---

## Releasing (maintainers)

PyPI:

```bash
python -m pip install --upgrade build twine
python -m build
twine check dist/*
twine upload dist/*
```

GitHub:
- Tag the release (e.g., v0.2.0) and push tags
- Create a GitHub Release with notes and attach wheels/sdist if desired
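The tagging step, demonstrated in a throwaway repo (in the real checkout you would tag the release commit and push the tag to GitHub):

```shell
# Demo in a temporary repo; in the real repo, skip the init/commit lines
cd "$(mktemp -d)"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "release prep"

git tag -a v0.2.0 -m "Release v0.2.0"   # annotated tag
git tag -l                               # verify the tag exists
# git push origin v0.2.0                 # push the tag (real repo only)
```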

After PyPI release, you can simplify README install instructions to a single line:

```bash
pip install rocm-wsl-ai
```


## The TUI (optional)
```bash
rocmwsl menu   # launches the Textual TUI
```
Use arrow keys/Enter. Install “base” first, then pick your tools. Launch and update from the TUI or via CLI.

## Typical first run
1) Wizard: quick initial setup (Base + optional ComfyUI)
```bash
rocmwsl wizard
```
or manually:
1) Installation → Base (ROCm & PyTorch Nightly)
2) Restart WSL if asked
3) Installation → Pick your tools (e.g., ComfyUI, A1111, Ollama)
4) Launch → Start your tools

CLI equivalent
```bash
rocmwsl wizard --base-dir "$HOME/AI" --venv-name genai_env
# or manually
rocmwsl install base
rocmwsl install comfyui
rocmwsl start comfyui
```

## Upgrading
Upgrades:
- Update everything: `rocmwsl update all`
- Update a single tool (e.g., ComfyUI): `rocmwsl update comfyui`
- Update base (PyTorch Nightly): `rocmwsl update base`
- Self-update (CLI/TUI):
	- If installed via pipx: `pipx upgrade rocm-wsl-ai` or `pipx upgrade rocmwsl`
	- From CLI: `rocmwsl update self`

## Useful tips
- Diagnose: `rocmwsl doctor` (checks /dev/kfd, rocm-smi/rocminfo, and Torch/HIP in the venv)
- Configuration: `~/.config/rocm-wsl-ai/config.toml`
	```toml
	[paths]
	base_dir = "/home/<user>/AI"

	[python]
	venv_name = "genai_env"
	```
- If the TUI looks very plain, install whiptail (see Requirements)
- If you changed groups during base install: restart WSL (`wsl --shutdown` from Windows)
- Ollama’s systemd user service may require systemd in WSL; if it doesn’t start, run it manually via the scripts
- For ROCm trouble, use the menu’s Driver Management and follow the prompts
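If you skip the wizard, the config file can also be created by hand. A minimal sketch matching the example above (the `base_dir` path is a placeholder):

```shell
# Write a minimal config.toml by hand (the wizard normally does this)
mkdir -p ~/.config/rocm-wsl-ai
cat > ~/.config/rocm-wsl-ai/config.toml <<'EOF'
[paths]
base_dir = "/home/<user>/AI"

[python]
venv_name = "genai_env"
EOF
cat ~/.config/rocm-wsl-ai/config.toml
```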



            
