vllm-top

Name: vllm-top
Version: 0.1.2
Summary: A monitoring tool for vLLM metrics.
Author: Yeok Tatt Cheah <yeokch@gmail.com>
Homepage / Repository: https://github.com/yeok-c/vllm-top
License: MIT
Requires Python: >=3.6
Keywords: vllm, monitoring, metrics
Requirements: prometheus_client, requests
Upload time: 2025-08-03 16:59:45
# vllm-top

[![PyPI version](https://img.shields.io/pypi/v/vllm-top.svg)](https://pypi.org/project/vllm-top/)

<p align="center">
  <img src="demo/demo.gif" alt="Demo" width="600"/>
</p>

**vllm-top** is a Python package for monitoring and displaying metrics from the [vLLM](https://github.com/vllm-project/vllm) service. It provides a comprehensive dashboard to visualize both current state and historical performance, making it easy to track and analyze service behavior over time.

---

## 🚀 Features

- **Task State Visibility:** Instantly see GPU cache usage and the number of running and waiting requests, helping you debug bottlenecks and improve throughput.
- **Minimalist Monitoring:** Lightweight dashboard that parses metrics directly from vLLM's Prometheus endpoint (see the sketch below).
- **Quick Setup:** No extra configuration — just pip install and run.
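
Since the dashboard's only data source is the Prometheus-format text that a running vLLM server exposes on its `/metrics` endpoint, the whole pipeline fits in a few lines. Below is a minimal sketch (not vllm-top's actual code) using the package's two declared dependencies, `requests` and `prometheus_client`; the URL and metric names are assumptions about a typical local vLLM deployment.

```python
import requests
from prometheus_client.parser import text_string_to_metric_families

# Assumed address of a locally running vLLM OpenAI-compatible server.
METRICS_URL = "http://localhost:8000/metrics"


def scrape(url: str = METRICS_URL) -> dict:
    """Fetch the Prometheus text page and flatten it into {sample_name: value}.

    Label sets are ignored, so samples that differ only by their labels
    overwrite each other -- good enough for single-server monitoring.
    """
    text = requests.get(url, timeout=5).text
    values = {}
    for family in text_string_to_metric_families(text):
        for sample in family.samples:
            values[sample.name] = sample.value
    return values


if __name__ == "__main__":
    metrics = scrape()
    # Gauge names assumed from vLLM's metrics documentation; adjust as needed.
    for name in ("vllm:num_requests_running",
                 "vllm:num_requests_waiting",
                 "vllm:gpu_cache_usage_perc"):
        print(f"{name}: {metrics.get(name, 'n/a')}")
```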

---

## 📦 Installation

Install via pip:

```bash
pip install vllm-top
```

---

## 🛠️ Usage

Start monitoring:

```bash
vllm-top
```

Change update interval (in seconds):

```bash
vllm-top --interval 5
```

Get a one-time snapshot:

```bash
vllm-top --snapshot
```
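
For a sense of what the `--interval` and `--snapshot` modes boil down to, here is a hypothetical, self-contained polling loop (again, not vllm-top's implementation): scrape the assumed local endpoint, print a few key gauges, sleep, repeat.

```python
import time

import requests
from prometheus_client.parser import text_string_to_metric_families

METRICS_URL = "http://localhost:8000/metrics"  # assumed local vLLM server
WATCHED = ("vllm:num_requests_running",        # assumed vLLM gauge names
           "vllm:num_requests_waiting",
           "vllm:gpu_cache_usage_perc")


def snapshot() -> dict:
    """One scrape of the metrics endpoint, reduced to the watched gauges."""
    text = requests.get(METRICS_URL, timeout=5).text
    samples = {s.name: s.value
               for family in text_string_to_metric_families(text)
               for s in family.samples}
    return {name: samples.get(name, 0.0) for name in WATCHED}


def monitor(interval: float = 5.0) -> None:
    """Print the watched gauges every `interval` seconds until Ctrl-C."""
    try:
        while True:
            values = snapshot()
            print("  ".join(f"{name}={values[name]:g}" for name in WATCHED))
            time.sleep(interval)
    except KeyboardInterrupt:
        pass  # exit cleanly on Ctrl-C


if __name__ == "__main__":
    monitor(interval=5.0)
```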

---

## 🤝 Contributing

Contributions are welcome! Please submit a pull request or open an issue for enhancements or bug fixes.

---

## 📄 License

Licensed under the MIT License. See the [LICENSE](LICENSE) file for details.

---

## 📜 Changelog

See [CHANGELOG.md](CHANGELOG.md) for a detailed list of changes.

            
