optimum-deepsparse


Name: optimum-deepsparse
Version: 0.1.0.dev1
Home page: https://github.com/neuralmagic/deepsparse
Summary: Optimum DeepSparse is an extension of the Hugging Face Transformers library that integrates the DeepSparse inference runtime. DeepSparse offers GPU-class performance on CPUs, making it possible to run Transformers and other deep learning models on commodity hardware with sparsity. Optimum DeepSparse provides a framework for developers to easily integrate DeepSparse into their applications, regardless of the hardware platform.
Upload time: 2023-10-26 02:02:45
Author: Neuralmagic, Inc.
Requires Python: >=3.8, <3.11
License: Neural Magic DeepSparse Community License, Apache
Keywords: inference, cpu, x86, arm, transformers, quantization, pruning, sparsity
# optimum-deepsparse

Accelerated inference of 🤗 models on CPUs using the [DeepSparse Inference Runtime](https://github.com/neuralmagic/deepsparse).

[![DeepSparse Modeling / Python - Test](https://github.com/neuralmagic/optimum-deepsparse/actions/workflows/test_check.yaml/badge.svg)](https://github.com/neuralmagic/optimum-deepsparse/actions/workflows/test_check.yaml)
[![DeepSparse Modeling Nightly](https://github.com/neuralmagic/optimum-deepsparse/actions/workflows/test_nightly.yaml/badge.svg)](https://github.com/neuralmagic/optimum-deepsparse/actions/workflows/test_nightly.yaml)

## Install
Optimum DeepSparse is a fast-moving project, and you may want to install from source.

`pip install git+https://github.com/neuralmagic/optimum-deepsparse.git`

### Installing in developer mode

If you are working on the `optimum-deepsparse` code, use an editable install by cloning and installing both `optimum` and `optimum-deepsparse`:

```bash
git clone https://github.com/huggingface/optimum
git clone https://github.com/neuralmagic/optimum-deepsparse
pip install -e optimum -e optimum-deepsparse
```

Now whenever you change the code, you'll be able to run with those changes instantly.


## How to use it
To load a model and run inference with DeepSparse, simply replace your `AutoModelForXxx` class with the corresponding `DeepSparseModelForXxx` class:

```diff
import requests
from PIL import Image

- from transformers import AutoModelForImageClassification
+ from optimum.deepsparse import DeepSparseModelForImageClassification
from transformers import AutoFeatureExtractor, pipeline

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

model_id = "microsoft/resnet-50"
- model = AutoModelForImageClassification.from_pretrained(model_id)
+ model = DeepSparseModelForImageClassification.from_pretrained(model_id, export=True, input_shapes="[1,3,224,224]")
feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
cls_pipe = pipeline("image-classification", model=model, feature_extractor=feature_extractor)
outputs = cls_pipe(image)
```
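The `input_shapes` string above pins the compiled engine to a fixed input of batch 1, 3 channels, and 224×224 pixels. If you want to build or validate such a string from image dimensions in your own code, a small helper can do it (hypothetical helpers, not part of `optimum-deepsparse`):

```python
import ast

def make_input_shapes(batch: int, channels: int, height: int, width: int) -> str:
    """Build an input_shapes string like "[1,3,224,224]" (hypothetical helper)."""
    return "[" + ",".join(str(d) for d in (batch, channels, height, width)) + "]"

def parse_input_shapes(s: str) -> tuple:
    """Parse a bracketed shape string back into a tuple of ints."""
    return tuple(int(d) for d in ast.literal_eval(s))

shapes = make_input_shapes(1, 3, 224, 224)
print(shapes)                       # [1,3,224,224]
print(parse_input_shapes(shapes))   # (1, 3, 224, 224)
```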

| Supported Task                              | Model Class |
| ------------------------------------------- | ------------- |
| "image-classification"                      | DeepSparseModelForImageClassification  |
| "text-classification"/"sentiment-analysis"  | DeepSparseModelForSequenceClassification  |
| "audio-classification"                      | DeepSparseModelForAudioClassification  |
| "question-answering"                        | DeepSparseModelForQuestionAnswering  |
| "image-segmentation"                        | DeepSparseModelForSemanticSegmentation  |

If you encounter any issues while using these classes, please open an issue or submit a pull request.


