# Infer
`infer` is a toolchain for running optimized neural network inference across a number of different backends.
## Installation
Install from `pip`:
```bash
pip install ml-infer
# With development dependencies
pip install 'ml-infer[dev]'
# With TensorRT support
pip install 'ml-infer[tensorrt]'
# With CoreML support
pip install 'ml-infer[coreml]'
# With developer dependencies and TensorRT support
pip install 'ml-infer[dev,tensorrt]'
# With developer dependencies and CoreML support
pip install 'ml-infer[dev,coreml]'
```
## Supported Backends
- TensorRT
- CoreML
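For context, each backend has its own export path from PyTorch. The sketch below is a minimal CoreML conversion using `coremltools` (one of the references listed below); the model, input shape, and file name are illustrative assumptions, and this is not the `ml-infer` API itself.

```python
# Hypothetical sketch: converting a traced PyTorch model to CoreML
# with coremltools. Model, shapes, and paths are assumptions.
import coremltools as ct
import torch
import torchvision

# Trace a model so coremltools can convert it.
model = torchvision.models.resnet18(weights=None).eval()
example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)

# Convert the traced model and save it as a CoreML package.
mlmodel = ct.convert(traced, inputs=[ct.TensorType(shape=example_input.shape)])
mlmodel.save("resnet18.mlpackage")
```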
## References
- [Official TensorRT Export Pipeline](https://github.com/pytorch/TensorRT)
- [Official CoreML Repository](https://github.com/apple/coremltools)
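Similarly, the first reference above is the Torch-TensorRT export pipeline. A minimal compilation sketch under assumed model choice and input shape (again, not the `ml-infer` API itself) might look like this:

```python
# Hypothetical sketch: compiling a PyTorch model with Torch-TensorRT.
# Requires a CUDA-capable GPU; model and shapes are assumptions.
import torch
import torch_tensorrt
import torchvision

model = torchvision.models.resnet18(weights=None).eval().cuda()

# Compile the model into a TensorRT-optimized module.
trt_model = torch_tensorrt.compile(
    model,
    inputs=[torch_tensorrt.Input((1, 3, 224, 224))],
    enabled_precisions={torch.float},
)

# Run inference with the optimized module.
out = trt_model(torch.rand(1, 3, 224, 224).cuda())
```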