# Intel® Extension for OpenXLA*
[![Python](https://img.shields.io/pypi/pyversions/intel_extension_for_openxla)](https://badge.fury.io/py/intel-extension-for-openxla)
[![PyPI version](https://badge.fury.io/py/intel-extension-for-openxla.svg)](https://badge.fury.io/py/intel-extension-for-openxla)
[![version](https://img.shields.io/github/v/release/intel/intel-extension-for-openxla?color=brightgreen)](https://github.com/intel/intel-extension-for-openxla/releases)
The [OpenXLA](https://github.com/openxla/xla) Project brings together a community of developers and leading AI/ML teams to accelerate ML and address infrastructure fragmentation across ML frameworks and hardware.
Intel® Extension for OpenXLA* includes a PJRT plugin implementation that seamlessly runs JAX models on Intel GPUs. The PJRT API simplifies integration, allowing the Intel GPU plugin to be developed separately and quickly integrated into JAX.
## Installation
The following table tracks intel-extension-for-openxla versions and the compatible versions of `jax` and `jaxlib`.
| **intel-extension-for-openxla** | **jaxlib** | **jax** |
|:-:|:-:|:-:|
| 0.4.0 | 0.4.26 | >= 0.4.26, <= 0.4.27|
| 0.3.0 | 0.4.24 | >= 0.4.24, <= 0.4.27|
| 0.2.1 | 0.4.20 | >= 0.4.20, <= 0.4.26|
| 0.2.0 | 0.4.20 | >= 0.4.20, <= 0.4.26|
| 0.1.0 | 0.4.13 | >= 0.4.13, <= 0.4.14|
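The version constraints above can be checked programmatically before installing. The following sketch encodes the table in a plain Python dict; `is_compatible` and `COMPAT` are illustrative helpers, not part of the package's API.

```python
# Compatibility matrix from the table above:
# intel-extension-for-openxla -> (jaxlib pin, (min jax, max jax))
COMPAT = {
    "0.4.0": ("0.4.26", ("0.4.26", "0.4.27")),
    "0.3.0": ("0.4.24", ("0.4.24", "0.4.27")),
    "0.2.1": ("0.4.20", ("0.4.20", "0.4.26")),
    "0.2.0": ("0.4.20", ("0.4.20", "0.4.26")),
    "0.1.0": ("0.4.13", ("0.4.13", "0.4.14")),
}

def _ver(v):
    """Parse a dotted version string into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split("."))

def is_compatible(plugin, jaxlib_version, jax_version):
    """True if the jaxlib/jax pair matches the table row for `plugin`."""
    pin, (lo, hi) = COMPAT[plugin]
    return jaxlib_version == pin and _ver(lo) <= _ver(jax_version) <= _ver(hi)
```

For example, `is_compatible("0.4.0", "0.4.26", "0.4.27")` is `True`, while a `jax` of 0.4.28 falls outside the supported range for 0.4.0.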
```shell
pip install --upgrade intel-extension-for-openxla
```
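After installation, you can check whether the plugin registered with JAX. The sketch below assumes the extension exposes Intel GPUs under the `xpu` platform name (verify against the project's documentation for your version); `detect_xpu` is an illustrative helper that degrades gracefully when JAX or the plugin is absent.

```python
def detect_xpu():
    """Return the list of xpu devices, or a message if none are available."""
    try:
        import jax
        # Query only the Intel GPU backend; raises if it is not registered.
        return [str(d) for d in jax.devices("xpu")]
    except Exception as exc:  # JAX missing, or no xpu backend registered
        return f"no xpu backend available: {exc}"

print(detect_xpu())
```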
## Security
See Intel's [Security Center](https://www.intel.com/content/www/us/en/security-center/default.html) for information on how to report a potential security issue or vulnerability.
See also: [Security Policy](https://github.com/intel/intel-extension-for-openxla/blob/main/security.md)