infero

Name: infero
Version: 0.0.16
Summary: Easily download, convert, and host your models using the ONNX runtime
Author: ameen-91
Maintainer: none
Home page: none
Docs URL: none
License: MIT
Requires Python: <4.0,>=3.12
Keywords: none
Upload time: 2024-12-15 18:05:07
Requirements: none recorded
# Infero

![PyPI - Python Version](https://img.shields.io/pypi/pyversions/infero)
![PyPI - Version](https://img.shields.io/pypi/v/infero)
![PyPI - Downloads](https://img.shields.io/pypi/dw/infero)
[![CI](https://github.com/norsulabs/infero/actions/workflows/ci.yaml/badge.svg)](https://github.com/norsulabs/infero/actions/workflows/ci.yaml)


## Overview

https://github.com/user-attachments/assets/4062501c-8420-4750-94bc-6a8f82a69989

Infero lets you download models, convert them to ONNX, and host them with ONNX Runtime. It provides a simple CLI for running and managing those models.

## Features

- Automatic downloads.
- Automatic ONNX conversions.
- Automatic server setup.
- 8-bit quantization support.

## Installation

To install Infero, run the following command:

```bash
pip install infero
```

## Usage

To download (pull) a model by its Hugging Face model name:

```bash
infero pull [hf_model_name]
```
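
For example, you might pull a model by its Hugging Face identifier (the model name below is only illustrative):

```bash
# Pull a model from the Hugging Face Hub (illustrative model name)
infero pull distilbert-base-uncased
```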

To run a model:

```bash
infero run [hf_model_name]
```

To run a model with 8-bit quantization (reduces model size and memory use, usually with a small accuracy trade-off):

```bash
infero run [hf_model_name] --quantize
```

To list all available models:

```bash
infero list
```

To remove a model:

```bash
infero remove [hf_model_name]
```
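
Putting these commands together, a typical session might look like the sketch below. The model name is illustrative, and only the commands documented above are used:

```bash
# Download and convert the model (illustrative model name)
infero pull distilbert-base-uncased

# Serve it with 8-bit quantization enabled
infero run distilbert-base-uncased --quantize

# List models, then remove the one we pulled
infero list
infero remove distilbert-base-uncased
```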

## License

Infero is licensed under the MIT License. See the [LICENSE](LICENSE) file for more details.

## Contact

For any questions or feedback, please contact us at support@norsulabs.com.


            
