onnxruntime-openvino
====================

:Name: onnxruntime-openvino
:Version: 1.20.0
:Home page: https://onnxruntime.ai
:Summary: ONNX Runtime is a runtime accelerator for Machine Learning models
:Upload time: 2024-11-25 12:09:33
:Author: Microsoft Corporation
:License: MIT License
:Keywords: onnx, machine learning
:Requirements: coloredlogs, flatbuffers, numpy, packaging, protobuf, sympy
OpenVINO™ Execution Provider for ONNX Runtime
=============================================

`OpenVINO™ Execution Provider for ONNX Runtime <https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html>`_ is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. It delivers `OpenVINO™ <https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit.html>`_ inline optimizations that enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many `AI models <https://github.com/onnx/models>`_ on a variety of Intel® hardware, such as:
 - Intel® CPUs
 - Intel® integrated GPUs
 - Intel® discrete GPUs
 - Intel® integrated NPUs (Windows only)

Installation
------------

Requirements
^^^^^^^^^^^^

- Ubuntu 18.04, 20.04, RHEL (CPU only), or Windows 10, 64-bit
- Python 3.9, 3.10, or 3.11 on Linux; Python 3.10 or 3.11 on Windows

This package supports:
 - Intel® CPUs
 - Intel® integrated GPUs
 - Intel® discrete GPUs
 - Intel® integrated NPUs (Windows only)

``pip3 install onnxruntime-openvino``

On Windows, please install the OpenVINO™ PyPI package separately.
For Windows installation instructions, refer to `OpenVINO™ Execution Provider for ONNX Runtime for Windows <https://github.com/intel/onnxruntime/releases/>`_.

The **OpenVINO™ Execution Provider for ONNX Runtime** Linux wheels come with pre-built OpenVINO™ 2024.1.0 libraries, eliminating the need to install OpenVINO™ separately.

For more details on building and installation, refer to `Build <https://onnxruntime.ai/docs/build/eps.html#openvino>`_.
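After installation, you can confirm that ONNX Runtime was built with the OpenVINO™ provider by checking the list of available execution providers. A minimal sketch; the helper name below is ours, not part of the package:

```python
def openvino_available():
    """Return True if this ONNX Runtime build registers the OpenVINO provider."""
    try:
        import onnxruntime as ort
    except ImportError:
        return False  # onnxruntime-openvino is not installed
    return "OpenVINOExecutionProvider" in ort.get_available_providers()


if __name__ == "__main__":
    print("OpenVINO EP available:", openvino_available())
```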

Usage
^^^^^

By default, inference runs on the Intel® CPU. However, you can change this to an Intel® integrated GPU, discrete GPU, or integrated NPU (Windows only).
Invoke `the provider config device type argument <https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#summary-of-options>`_ to change the hardware on which inferencing is done.
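In Python, the device type is passed through the session's provider options. A minimal sketch, assuming a model file of your own; the helper name is illustrative:

```python
def make_session(model_path, device_type="CPU"):
    """Create an ONNX Runtime session targeting the given OpenVINO device.

    device_type is one of "CPU", "GPU", or "NPU" (NPU on Windows only).
    """
    import onnxruntime as ort  # provided by the onnxruntime-openvino wheel

    return ort.InferenceSession(
        model_path,
        providers=["OpenVINOExecutionProvider"],
        provider_options=[{"device_type": device_type}],
    )
```

Usage then follows the standard ONNX Runtime flow, e.g. ``session = make_session("model.onnx", device_type="GPU")`` followed by ``session.run(...)``.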

For more API calls and environment variables, see `Usage <https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options>`_.

Samples
^^^^^^^^

To see what you can do with **OpenVINO™ Execution Provider for ONNX Runtime**, explore the demos in the `Examples <https://github.com/microsoft/onnxruntime-inference-examples/tree/main/python/OpenVINO_EP>`_ repository.

License
^^^^^^^^

**OpenVINO™ Execution Provider for ONNX Runtime** is licensed under `MIT <https://github.com/microsoft/onnxruntime/blob/main/LICENSE>`_.
By contributing to the project, you agree to the license and copyright terms therein
and release your contribution under these terms.

Support
^^^^^^^^

Please submit your questions, feature requests, and bug reports via `GitHub Issues <https://github.com/microsoft/onnxruntime/issues>`_.

How to Contribute
^^^^^^^^^^^^^^^^^^

We welcome community contributions to **OpenVINO™ Execution Provider for ONNX Runtime**. If you have an idea for improvement:

* Share your proposal via `GitHub Issues <https://github.com/microsoft/onnxruntime/issues>`_.
* Submit a `Pull Request <https://github.com/microsoft/onnxruntime/pulls>`_.

            
