vllm-spyre


Name: vllm-spyre
Version: 0.5.2
Summary: vLLM plugin for Spyre hardware support
Upload time: 2025-07-16 20:27:31
Requires Python: >=3.9
License: Apache 2
<h1 align="center">
Spyre Plugin for vLLM
</h1>

<p align="center">
| <a href="https://vllm-spyre.readthedocs.io/en/latest/"><b>Documentation</b></a> | <a href="https://discuss.vllm.ai/c/hardware-support/vllm-spyre-support"><b>Users Forum</b></a> | <a href="https://slack.vllm.ai"><b>#sig-spyre</b></a> |
</p>

---
**IBM Spyre** is the first production-grade Artificial Intelligence Unit (AIU) accelerator born out of the IBM Research AIU family, and is part of a long-term strategy of developing novel architectures and full-stack technology solutions for the emerging space of generative AI. Spyre builds on the foundation of IBM’s internal AIU research and delivers a scalable, efficient architecture for accelerating AI in enterprise environments.

The vLLM Spyre plugin (`vllm-spyre`) is a dedicated backend extension that enables seamless integration of the IBM Spyre Accelerator with vLLM. It follows the architecture described in [vLLM's Plugin System](https://docs.vllm.ai/en/latest/design/plugin_system.html), making it easy to integrate IBM's advanced AI acceleration into existing vLLM workflows.
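
To give a rough sense of how this fits together in practice: once `vllm-spyre` is installed alongside vLLM, the plugin should be discovered automatically through vLLM's entry-point based plugin mechanism, so the usual offline-inference API can be used unchanged. The sketch below assumes exactly that; the model id is only a placeholder, so check the supported models list for what actually runs on Spyre.

```python
# Minimal sketch of offline inference through the standard vLLM API.
# Assumes `vllm` and `vllm-spyre` are both installed and that the plugin is
# discovered automatically via vLLM's plugin system. The model id below is a
# placeholder -- check the supported models list in the documentation.
from vllm import LLM, SamplingParams

prompts = [
    "The capital of France is",
    "IBM Spyre accelerators are designed for",
]
sampling_params = SamplingParams(temperature=0.0, max_tokens=32)

llm = LLM(model="ibm-granite/granite-3.3-8b-instruct")  # placeholder model id

for output in llm.generate(prompts, sampling_params):
    print(output.prompt, "->", output.outputs[0].text)
```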

For more information, check out the following:

- 📚 [Meet the IBM Artificial Intelligence Unit](https://research.ibm.com/blog/ibm-artificial-intelligence-unit-aiu)
- 📽️ [AI Accelerators: Transforming Scalability & Model Efficiency](https://www.youtube.com/watch?v=KX0qBM-ByAg)
- 🚀 [Spyre Accelerator for IBM Z](https://research.ibm.com/blog/spyre-for-z)

## Getting Started

Visit our [documentation](https://vllm-spyre.readthedocs.io/en/latest/):
- [Installation](https://vllm-spyre.readthedocs.io/en/latest/getting_started/installation.html)
- [List of Supported Models](https://vllm-spyre.readthedocs.io/en/latest/user_guide/supported_models.html)
- [List of Supported Features](https://vllm-spyre.readthedocs.io/en/latest/user_guide/supported_features.html)
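
As a quick sanity check after following the installation guide, a server started per those docs can be queried with the official `openai` client. This is only a sketch under assumptions: the base URL, port, and model id below are placeholders rather than values prescribed by this project.

```python
# Sketch: query a running vLLM OpenAI-compatible server with the `openai` client.
# Assumes a server was already started per the installation guide and listens on
# localhost:8000; the base URL and model id are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.completions.create(
    model="ibm-granite/granite-3.3-8b-instruct",  # placeholder model id
    prompt="Hello from Spyre:",
    max_tokens=16,
)
print(response.choices[0].text)
```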

## Contributing

We welcome and value any contributions and collaborations. Please check out [Contributing to vLLM Spyre](https://vllm-spyre.readthedocs.io/en/latest/contributing/index.html) for how to get involved.

## Contact

You can reach out for discussion or support in the `#sig-spyre` channel in the [vLLM Slack](https://inviter.co/vllm-slack) workspace or by [opening an issue](https://vllm-spyre.readthedocs.io/en/latest/contributing/index.html#issues).

Raw data

{
    "_id": null,
    "home_page": null,
    "name": "vllm-spyre",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.9",
    "maintainer_email": null,
    "keywords": null,
    "author": null,
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/38/33/08e9bf65edc02d033fbcdbb78603013008e56d032936fe10b13c40da0413/vllm_spyre-0.5.2.tar.gz",
    "platform": null,
    "description": "<h1 align=\"center\">\nSpyre Plugin for vLLM\n</h1>\n\n<p align=\"center\">\n| <a href=\"https://vllm-spyre.readthedocs.io/en/latest/\"><b>Documentation</b></a> | <a href=\"https://discuss.vllm.ai/c/hardware-support/vllm-spyre-support\"><b>Users Forum</b></a> | <a href=\"https://slack.vllm.ai\"><b>#sig-spyre</b></a> |\n</p>\n\n---\n**IBM Spyre** is the first production-grade Artificial Intelligence Unit (AIU) accelerator born out of the IBM Research AIU family, and is part of a long-term strategy of developing novel architectures and full-stack technology solutions for the emerging space of generative AI. Spyre builds on the foundation of IBM\u2019s internal AIU research and delivers a scalable, efficient architecture for accelerating AI in enterprise environments.\n\nThe vLLM Spyre plugin (`vllm-spyre`) is a dedicated backend extension that enables seamless integration of IBM Spyre Accelerator with vLLM. It follows the architecture described in [vLLM's Plugin System](https://docs.vllm.ai/en/latest/design/plugin_system.html), making it easy to integrate IBM's advanced AI acceleration into existing vLLM workflows.\n\nFor more information, check out the following:\n\n- \ud83d\udcda [Meet the IBM Artificial Intelligence Unit](https://research.ibm.com/blog/ibm-artificial-intelligence-unit-aiu)\n- \ud83d\udcfd\ufe0f [AI Accelerators: Transforming Scalability & Model Efficiency](https://www.youtube.com/watch?v=KX0qBM-ByAg)\n- \ud83d\ude80 [Spyre Accelerator for IBM Z](https://research.ibm.com/blog/spyre-for-z)\n\n## Getting Started\n\nVisit our [documentation](https://vllm-spyre.readthedocs.io/en/latest/):\n- [Installation](https://vllm-spyre.readthedocs.io/en/latest/getting_started/installation.html)\n- [List of Supported Models](https://vllm-spyre.readthedocs.io/en/latest/user_guide/supported_models.html)\n- [List of Supported Features](https://vllm-spyre.readthedocs.io/en/latest/user_guide/supported_features.html)\n\n## Contributing\n\nWe welcome and value any contributions and collaborations. Please check out [Contributing to vLLM Spyre](https://vllm-spyre.readthedocs.io/en/latest/contributing/index.html) for how to get involved.\n\n## Contact\n\nYou can reach out for discussion or support in the `#sig-spyre` channel in the [vLLM Slack](https://inviter.co/vllm-slack) workspace or by [opening an issue](https://vllm-spyre.readthedocs.io/en/latest/contributing/index.html#issues).\n",
    "bugtrack_url": null,
    "license": "Apache 2",
    "summary": "vLLM plugin for Spyre hardware support",
    "version": "0.5.2",
    "project_urls": null,
    "split_keywords": [],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "064191e2e4127cf85e5d9146a4837f584cc5e311448dbdcd8f0dab0e0efd44b4",
                "md5": "f9c018e203470af3855bf2882c7f85f9",
                "sha256": "b92dbb9bf0d1a8d340fb541cbb5a0924840313323f34b2095dfc7e6aa9c8a659"
            },
            "downloads": -1,
            "filename": "vllm_spyre-0.5.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "f9c018e203470af3855bf2882c7f85f9",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.9",
            "size": 73714,
            "upload_time": "2025-07-16T20:27:30",
            "upload_time_iso_8601": "2025-07-16T20:27:30.337923Z",
            "url": "https://files.pythonhosted.org/packages/06/41/91e2e4127cf85e5d9146a4837f584cc5e311448dbdcd8f0dab0e0efd44b4/vllm_spyre-0.5.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "383308e9bf65edc02d033fbcdbb78603013008e56d032936fe10b13c40da0413",
                "md5": "afe6c049cee8e779b637f461893afee2",
                "sha256": "52883cacffa639a3eb000bc7704c31210b2934f7653cf33e72d978dba2759970"
            },
            "downloads": -1,
            "filename": "vllm_spyre-0.5.2.tar.gz",
            "has_sig": false,
            "md5_digest": "afe6c049cee8e779b637f461893afee2",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.9",
            "size": 427864,
            "upload_time": "2025-07-16T20:27:31",
            "upload_time_iso_8601": "2025-07-16T20:27:31.984960Z",
            "url": "https://files.pythonhosted.org/packages/38/33/08e9bf65edc02d033fbcdbb78603013008e56d032936fe10b13c40da0413/vllm_spyre-0.5.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-07-16 20:27:31",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "vllm-spyre"
}
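
The published digests above can be checked against a freshly downloaded artifact. Below is a small standard-library-only sketch: the URL and expected sha256 are copied from the raw data for the sdist, and network access is assumed.

```python
# Sketch: verify the published sha256 of the sdist against a fresh download.
# The URL and expected digest are taken from the raw data above; network access
# is required.
import hashlib
import urllib.request

URL = (
    "https://files.pythonhosted.org/packages/38/33/"
    "08e9bf65edc02d033fbcdbb78603013008e56d032936fe10b13c40da0413/vllm_spyre-0.5.2.tar.gz"
)
EXPECTED_SHA256 = "52883cacffa639a3eb000bc7704c31210b2934f7653cf33e72d978dba2759970"

with urllib.request.urlopen(URL) as resp:
    digest = hashlib.sha256(resp.read()).hexdigest()

print("sha256 matches:", digest == EXPECTED_SHA256)
```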
        