onnxruntime-openvino

- Name: onnxruntime-openvino
- Version: 1.17.1
- Summary: ONNX Runtime is a runtime accelerator for Machine Learning models
- Home page: https://onnxruntime.ai
- Author: Microsoft Corporation
- License: MIT License
- Keywords: onnx, machine learning
- Upload time: 2024-03-08 14:52:14

OpenVINO™ Execution Provider for ONNX Runtime
=============================================

`OpenVINO™ Execution Provider for ONNX Runtime <https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html>`_ is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. It delivers `OpenVINO™ <https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit.html>`_ inline optimizations that enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many `AI models <https://github.com/onnx/models>`_ on a variety of Intel® hardware, such as:
 - Intel® CPUs
 - Intel® integrated GPUs
 - Intel® discrete GPUs

Installation
------------

Requirements
^^^^^^^^^^^^

- Ubuntu 20.04, or Windows 10 (64-bit)
- Python 3.9, 3.10, or 3.11 on Linux; Python 3.10 or 3.11 on Windows
- OpenVINO™ 2023.3.0 only

This package supports:
 - Intel® CPUs
 - Intel® integrated GPUs
 - Intel® discrete GPUs

``pip3 install onnxruntime-openvino``

On Windows, please install the OpenVINO™ PyPI package separately.
For installation instructions on Windows, please refer to `OpenVINO™ Execution Provider for ONNX Runtime for Windows <https://github.com/intel/onnxruntime/releases/>`_.

**OpenVINO™ Execution Provider for ONNX Runtime** Linux wheels come with pre-built OpenVINO™ 2023.3.0 libraries, eliminating the need to install OpenVINO™ separately. The OpenVINO™ libraries are built with the CXX11_ABI flag set to 0.
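
Either way, a quick sanity check (a minimal sketch, assuming a standard Python environment) confirms that the provider was registered after installation:

.. code-block:: python

    import onnxruntime as ort

    # The OpenVINO EP is documented as "OpenVINOExecutionProvider";
    # it should appear among the registered providers.
    print(ort.get_available_providers())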

For more details on building and installation, please refer to `Build <https://onnxruntime.ai/docs/build/eps.html#openvino>`_.

Usage
^^^^^

By default, Intel® CPU is used to run inference. However, you can change the default to an Intel® integrated or discrete GPU.
Set `the provider config device type argument <https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#summary-of-options>`_ to change the hardware on which inferencing is done, as sketched below.
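
A minimal sketch of selecting the device through provider options follows. The model path, input shape, and the exact ``device_type`` value are illustrative assumptions; see the linked options summary for the values your build accepts.

.. code-block:: python

    import numpy as np
    import onnxruntime as ort

    # Create a session on the OpenVINO Execution Provider and pick the
    # target hardware via the "device_type" provider option.
    session = ort.InferenceSession(
        "model.onnx",  # hypothetical model file
        providers=["OpenVINOExecutionProvider"],
        provider_options=[{"device_type": "GPU_FP32"}],  # assumed value, e.g. an Intel GPU
    )

    # Run inference as usual; input name and shape depend on the model.
    input_name = session.get_inputs()[0].name
    dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
    outputs = session.run(None, {input_name: dummy})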

For more API calls and environment variables, see `Usage <https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options>`_.

Samples
^^^^^^^^

To see what you can do with **OpenVINO™ Execution Provider for ONNX Runtime**, explore the demos in the `Examples <https://github.com/microsoft/onnxruntime-inference-examples/tree/main/python/OpenVINO_EP>`_ repository.

License
^^^^^^^^

**OpenVINO™ Execution Provider for ONNX Runtime** is licensed under `MIT <https://github.com/microsoft/onnxruntime/blob/main/LICENSE>`_.
By contributing to the project, you agree to the license and copyright terms therein
and release your contribution under these terms.

Support
^^^^^^^^

Please submit your questions, feature requests, and bug reports via `GitHub Issues <https://github.com/microsoft/onnxruntime/issues>`_.

How to Contribute
^^^^^^^^^^^^^^^^^^

We welcome community contributions to **OpenVINO™ Execution Provider for ONNX Runtime**. If you have an idea for improvement:

* Share your proposal via `GitHub Issues <https://github.com/microsoft/onnxruntime/issues>`_.
* Submit a `Pull Request <https://github.com/microsoft/onnxruntime/pulls>`_.

            
