Mobilint Model Zoo
========================
<div align="center">
<p>
<a href="https://www.mobilint.com/" target="_blank">
<img src="https://raw.githubusercontent.com/mobilint/mblt-model-zoo/master/assets/Mobilint_Logo_Primary.png" alt="Mobilint Logo" width="60%">
</a>
</p>
</div>
**mblt-model-zoo** is a curated collection of AI models optimized for [Mobilint](https://www.mobilint.com/)’s Neural Processing Units (NPUs).
Designed to help developers accelerate deployment, Mobilint's Model Zoo offers access to public, pre-trained, and pre-quantized models for vision, language, and multimodal tasks. Along with performance results, we provide pre- and post-processing tools to help developers evaluate, fine-tune, and integrate the models with ease.
## Installation
- Install Mobilint ACCELerator (MACCEL) in your environment. If you are not a Mobilint customer, please contact [us](mailto:tech-support@mobilint.com).
- Install **mblt-model-zoo** using pip:
```bash
pip install mblt-model-zoo
```
- To install the latest version from source, clone the repository and install it in editable mode:
```bash
git clone https://github.com/mobilint/mblt-model-zoo.git
cd mblt-model-zoo
pip install -e .
```
## Quick Start Guide
### Initializing Quantized Model Class
**mblt-model-zoo** provides quantized models with associated pre- and post-processing tools. The following code snippet shows how to load a pre-trained model for inference.
```python
from mblt_model_zoo.vision import ResNet50
# Load the pre-trained model.
# Automatically download the model if not found in the local cache.
resnet50 = ResNet50()
# Load a model trained with a different recipe.
# Currently, the default ("DEFAULT") is "IMAGENET1K_V1".
resnet50 = ResNet50(model_type = "IMAGENET1K_V2")
# Download the model to a local directory and load it.
resnet50 = ResNet50(local_path = "path/to/local/") # the file will be downloaded to "path/to/local/model.mxq"
# Load the model from a local path, or download it to the exact filename and path you specify.
resnet50 = ResNet50(local_path = "path/to/local/model.mxq")
# Set the inference mode for better performance.
# Aries supports the "single", "multi", and "global" inference modes. The default is "global".
resnet50 = ResNet50(infer_mode = "global")
# (Beta) If you have a model compiled for Regulus, enable inference on the Regulus device.
resnet50 = ResNet50(product = "regulus")
# In summary, the model can be loaded with the following arguments.
# You may customize these arguments to work with Mobilint's NPU.
resnet50 = ResNet50(
    local_path = None,
    model_type = "DEFAULT",
    infer_mode = "global",
    product = "aries",
)
```
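The `local_path` behavior above (a directory gets the default filename appended, while a full file path is used as-is) can be sketched as a small path-resolution helper. Note that `resolve_model_path` and `default_name` below are hypothetical illustrations of the documented behavior, not part of the mblt-model-zoo API:

```python
from pathlib import Path

def resolve_model_path(local_path: str, default_name: str = "model.mxq") -> Path:
    # If local_path looks like a directory (trailing slash or no file
    # extension), append the default filename; otherwise treat it as
    # the exact file path to use.
    p = Path(local_path)
    if local_path.endswith("/") or p.suffix == "":
        return p / default_name
    return p

print(resolve_model_path("path/to/local/"))           # .../local/model.mxq
print(resolve_model_path("path/to/local/custom.mxq")) # .../local/custom.mxq
```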
### Working with Quantized Model
With an image given as a file path, PIL image, NumPy array, or torch tensor, you can perform inference with the quantized model. The following code snippet shows the full pipeline:
```python
image_path = "path/to/image.jpg"
input_img = resnet50.preprocess(image_path) # Preprocess the input image
output = resnet50(input_img) # Perform inference with the quantized model
result = resnet50.postprocess(output) # Postprocess the output
result.plot(
source_path=image_path,
save_path="path/to/save/result.jpg",
)
```
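For classification models such as ResNet50, the `postprocess` step typically converts raw logits into class probabilities and top-k predictions. As a rough illustration of that idea only (a NumPy sketch, not the library's actual implementation, which `postprocess` handles internally):

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    # Subtract the max for numerical stability before exponentiating.
    shifted = logits - logits.max()
    exp = np.exp(shifted)
    return exp / exp.sum()

def top_k(logits: np.ndarray, k: int = 5):
    # Return the k highest-probability class indices with their scores.
    probs = softmax(logits)
    idx = np.argsort(probs)[::-1][:k]
    return [(int(i), float(probs[i])) for i in idx]

# Toy logits for a 4-class problem.
logits = np.array([2.0, 1.0, 0.1, -1.0])
print(top_k(logits, k=2))
```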
### Listing Available Models
**mblt-model-zoo** offers a function to list all available models. You can use the following code snippet to list the models for a specific task (e.g., image classification, object detection):
```python
from mblt_model_zoo.vision import list_models
from pprint import pprint
available_models = list_models()
pprint(available_models)
```
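The exact structure returned by `list_models` is not specified here. Assuming it can be viewed as a mapping from task name to model names (an assumption; check the actual return type), filtering by task might look like the following sketch, shown with a mock catalog rather than a live query (the task keys and model names other than ResNet50 are hypothetical examples):

```python
def models_for_task(catalog: dict, task: str) -> list:
    # Return the models registered under a task, sorted for stable output;
    # an unknown task yields an empty list.
    return sorted(catalog.get(task, []))

# Mock catalog standing in for the result of list_models().
mock_catalog = {
    "image_classification": ["ResNet50", "MobileNetV2"],
    "object_detection": ["YOLOv5s"],
}
print(models_for_task(mock_catalog, "image_classification"))
```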
## Model List
We provide models quantized with our advanced quantization techniques. The list of available vision models is [here](mblt_model_zoo/vision/README.md).
## Optional Extras
When working with tasks other than vision, extra dependencies may be required. These extras can be installed via `pip install mblt-model-zoo[NAME]` or `pip install -e .[NAME]`.
Currently, these optional features are only available on environments equipped with Mobilint's [Aries](https://www.mobilint.com/aries).
|Name|Use|Details|
|-------|------|------|
|transformers|For using Hugging Face transformers-related models|[README.md](mblt_model_zoo/transformers/README.md)|
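For example, the `transformers` extra above would be installed as follows (quoting the extra so that shells like zsh do not expand the brackets):

```shell
# From PyPI:
pip install "mblt-model-zoo[transformers]"

# Or from a source checkout:
pip install -e ".[transformers]"
```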
## License
The Mobilint Model Zoo is released under the BSD 3-Clause License. Please see the [LICENSE](https://github.com/mobilint/mblt-model-zoo/blob/master/LICENSE) file for more details.
Additionally, the license for each model provided in this package follows the terms specified in the source link provided with it.
## Support & Issues
If you encounter any problems with this package, please feel free to contact [us](mailto:tech-support@mobilint.com).