backend.ai-accelerator-cuda-open


Name: backend.ai-accelerator-cuda-open
Version: 24.3.3
Home page: https://github.com/lablup/backend.ai
Summary: Backend.AI Accelerator Plugin for CUDA
Upload time: 2024-04-30 16:58:16
Maintainer: None
Docs URL: None
Author: Lablup Inc. and contributors
Requires Python: >=3.12,<3.13
License: LGPLv3
Requirements: No requirements were recorded.
Travis CI: not used
Coveralls test coverage: not used
Backend.AI Accelerator Plugin for CUDA
======================================

Install this package alongside the Backend.AI agents, in the same virtual environment.
This allows the agents to detect the CUDA devices on their hosts and make them
available to Backend.AI kernel sessions.

```console
$ pip install backend.ai-accelerator-cuda-open
```
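
The plugin can only report the devices that the NVIDIA driver actually exposes on the host. As a quick host-side sanity check before starting (or restarting) the agent, you can list the visible GPUs with `nvidia-smi`; this is an ordinary driver tool, not part of this plugin:

```console
$ nvidia-smi -L
```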

This open-source edition of the CUDA plugin supports allocating one or more CUDA
devices to a container, slot by slot.
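
For illustration only, a kernel session requesting a whole GPU might look like the sketch below. The `-r/--resources` option of the client's `run` command, the `cuda.device` slot name, and the `python` image alias are assumptions based on the slot-by-slot allocation described above, not something this README specifies; adjust them to your deployment.

```console
$ backend.ai run -r cuda.device=1 python -c 'print("hello from a GPU-backed session")'
```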

Compatibility Matrix
--------------------

|       Backend.AI Agent       |    CUDA Plugin   |
|:----------------------------:|:----------------:|
|  20.09.x                     |  2.0.0           |
|  20.03.9 ~                   |  2.0.0           |
|  20.03.0 ~ 20.03.8           |  0.14.x          |
|  19.09.17 ~                  |  0.13.x          |
|  19.06.x, 19.09.0 ~ 19.09.16 |  0.11.x, 0.12.x  |
|  19.03.x                     |  0.10.x          |

For agent versions newer than those listed in the matrix above, the agent declares the
compatible version range of this plugin through an extra requirements set named "cuda".
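
Concretely, for those newer releases you would normally let that extra pull in a compatible plugin version instead of pinning it yourself. The sketch below assumes the agent is distributed on PyPI as `backend.ai-agent`; check your deployment's install method before relying on it:

```console
$ pip install "backend.ai-agent[cuda]"
```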

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/lablup/backend.ai",
    "name": "backend.ai-accelerator-cuda-open",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<3.13,>=3.12",
    "maintainer_email": null,
    "keywords": null,
    "author": "Lablup Inc. and contributors",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/f1/54/cc7c3aa1f3b66aa1a141e2d926799806903c46aaa04813674dc51f5b5f00/backend.ai-accelerator-cuda-open-24.3.3.tar.gz",
    "platform": null,
    "description": "Backend.AI Accelerator Plugin for CUDA\n======================================\n\nJust install this along with Backend.AI agents, using the same virtual environment.\nThis will allow the agents to detect CUDA devices on their hosts and make them\navailable to Backend.AI kernel sessions.\n\n```console\n$ pip install backend.ai-accelerator-cuda\n```\n\nThis open-source edition of CUDA plugins support allocation of one or more CUDA\ndevices to a container, slot-by-slot.\n\nCompatibility Matrix\n--------------------\n\n|       Backend.AI Agent       |    CUDA Plugin   |\n|:----------------------------:|:----------------:|\n|  20.09.x                     |  2.0.0           |\n|  20.03.9 ~                   |  2.0.0           |\n|  20.03.0 ~ 20.03.8           |  0.14.x          |\n|  19.09.17 ~                  |  0.13.x          |\n|  19.06.x, 19.09.0 ~ 19.09.16 |  0.11.x, 0.12.x  |\n|  19.03.x                     |  0.10.x          |\n\nIn the versions released after the above matrix, the agent will set the required\nversion range of this plugin as an extra requirements set \"cuda\".\n",
    "bugtrack_url": null,
    "license": "LGPLv3",
    "summary": "Backend.AI Accelerator Plugin for CUDA",
    "version": "24.3.3",
    "project_urls": {
        "Documentation": "https://docs.backend.ai/",
        "Homepage": "https://github.com/lablup/backend.ai",
        "Source": "https://github.com/lablup/backend.ai"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "fec2fd8712239b641992f71546585a049d558ad1f29a91b32c610adb398b5501",
                "md5": "03a51010d4ccf217f9365d046dc954c0",
                "sha256": "57f37b1ec7e19732bee63ccaab615504461a2fd46def7fe8b1f591971af52406"
            },
            "downloads": -1,
            "filename": "backend.ai_accelerator_cuda_open-24.3.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "03a51010d4ccf217f9365d046dc954c0",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<3.13,>=3.12",
            "size": 10925,
            "upload_time": "2024-04-30T16:57:31",
            "upload_time_iso_8601": "2024-04-30T16:57:31.189460Z",
            "url": "https://files.pythonhosted.org/packages/fe/c2/fd8712239b641992f71546585a049d558ad1f29a91b32c610adb398b5501/backend.ai_accelerator_cuda_open-24.3.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "f154cc7c3aa1f3b66aa1a141e2d926799806903c46aaa04813674dc51f5b5f00",
                "md5": "c51b6603d7515c2b05b8b12da628702a",
                "sha256": "16bc3433accfd0f2191fc83c5e80eff9693a9246b4b5c9e7fbab81a417b95e44"
            },
            "downloads": -1,
            "filename": "backend.ai-accelerator-cuda-open-24.3.3.tar.gz",
            "has_sig": false,
            "md5_digest": "c51b6603d7515c2b05b8b12da628702a",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<3.13,>=3.12",
            "size": 10792,
            "upload_time": "2024-04-30T16:58:16",
            "upload_time_iso_8601": "2024-04-30T16:58:16.900464Z",
            "url": "https://files.pythonhosted.org/packages/f1/54/cc7c3aa1f3b66aa1a141e2d926799806903c46aaa04813674dc51f5b5f00/backend.ai-accelerator-cuda-open-24.3.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-04-30 16:58:16",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "lablup",
    "github_project": "backend.ai",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [],
    "lcname": "backend.ai-accelerator-cuda-open"
}
        