| Field | Value |
| :-- | :-- |
| Name | `inference-cli` |
| Version | 0.31.1 |
| Home page | https://github.com/roboflow/inference |
| Summary | With no prior knowledge of machine learning or device-specific deployment, you can deploy a computer vision model to a range of devices and environments using Roboflow Inference CLI. |
| Upload time | 2024-12-13 18:57:17 |
| Author | Roboflow |
| Requires Python | <3.13, >=3.8 |
| License | None |
<div align="center">
<img
width="100%"
src="https://github.com/roboflow/inference/assets/6319317/9230d986-183d-4ab0-922b-4b497f16d937"
/>
<br/>
### [inference package](https://pypi.org/project/inference/) | [inference repo](https://github.com/roboflow/inference)
<br/>
[![version](https://badge.fury.io/py/inference-cli.svg)](https://badge.fury.io/py/inference-cli)
[![downloads](https://img.shields.io/pypi/dm/inference-cli)](https://pypistats.org/packages/inference-cli)
[![license](https://img.shields.io/pypi/l/inference-cli)](https://github.com/roboflow/inference/blob/main/LICENSE)
[![python-version](https://img.shields.io/pypi/pyversions/inference-cli)](https://badge.fury.io/py/inference-cli)
</div>
# Roboflow Inference CLI
Roboflow Inference CLI offers a lightweight interface for running models against a local Roboflow inference server or the Roboflow Hosted API.
To create custom inference server Docker images, go to the parent package, [Roboflow Inference](https://pypi.org/project/inference/).
[Roboflow](https://roboflow.com) has everything you need to deploy a computer vision model to a range of devices and environments. Inference supports object detection, classification, and instance segmentation models, and running foundation models (CLIP and SAM).
## 👩🏫 Examples
### inference server start
Starts a local inference server. It optionally takes a port number (default: 9001) and will only start the Docker container if one is not already running on that port.
Before you begin, ensure that you have Docker installed on your machine. Docker provides a containerized environment,
allowing the Roboflow Inference Server to run in a consistent and isolated manner, regardless of the host system. If
you haven't installed Docker yet, you can get it from [Docker's official website](https://www.docker.com/get-started).
The CLI will automatically detect the device you are running on and pull the appropriate Docker image.
```bash
inference server start --port 9001 [-e {optional_path_to_file_with_env_variables}]
```
The `--env-file` (or `-e`) parameter optionally points to a `.env` file that will be loaded into the inference server when the values of internal parameters need to be adjusted. Any value passed explicitly as a command parameter takes precedence and will shadow the value defined in the `.env` file under the same variable name.
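As an illustration, the snippet below creates a minimal `.env` file and shows how it would be passed at startup. The variable name `LOG_LEVEL` is purely illustrative, not a confirmed inference server setting — consult the server documentation for the parameters it actually reads:

```bash
# Create an example .env file (LOG_LEVEL is an illustrative variable
# name, not a confirmed inference server setting).
cat > inference-server.env <<'EOF'
LOG_LEVEL=DEBUG
EOF

# Pass it at startup; explicit flags such as --port still take
# precedence over values defined in the file:
#   inference server start --port 9001 -e inference-server.env
```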
### inference server status
Checks the status of the local inference server.
```bash
inference server status
```
### inference server stop
Stops the inference server.
```bash
inference server stop
```
### inference infer
Runs inference on a single image. It takes a path to an image, a Roboflow project name, model version, and API key, and returns a JSON object with the model's predictions. You can also specify a host to run inference against the Roboflow hosted inference server.
#### Local image
```bash
inference infer ./image.jpg --project-id my-project --model-version 1 --api-key my-api-key
```
#### Hosted image
```bash
inference infer https://[YOUR_HOSTED_IMAGE_URL] --project-id my-project --model-version 1 --api-key my-api-key
```
#### Hosted API inference
```bash
inference infer ./image.jpg --project-id my-project --model-version 1 --api-key my-api-key --host https://detect.roboflow.com
```
## Supported Devices
Roboflow Inference CLI currently supports the following device targets:
- x86 CPU
- ARM64 CPU
- NVIDIA GPU
For Jetson-specific inference server images, check out the [Roboflow Inference](https://pypi.org/project/inference/) package, or pull the images directly following the instructions in the official [Roboflow Inference documentation](https://inference.roboflow.com/quickstart/docker/#pull-from-docker-hub).
## 📝 license
The Roboflow Inference code is distributed under an [Apache 2.0 license](https://github.com/roboflow/inference/blob/master/LICENSE.core). The models supported by Roboflow Inference have their own licenses. View the licenses for supported models below.
| model | license |
| :------------------------ | :-----------------------------------------------------------------------------------------------------------------------------------: |
| `inference/models/clip` | [MIT](https://github.com/openai/CLIP/blob/main/LICENSE) |
| `inference/models/gaze` | [MIT](https://github.com/Ahmednull/L2CS-Net/blob/main/LICENSE), [Apache 2.0](https://github.com/google/mediapipe/blob/master/LICENSE) |
| `inference/models/sam` | [Apache 2.0](https://github.com/facebookresearch/segment-anything/blob/main/LICENSE) |
| `inference/models/vit` | [Apache 2.0](https://github.com/roboflow/inference/blob/main/inference/models/vit/LICENSE) |
| `inference/models/yolact` | [MIT](https://github.com/dbolya/yolact/blob/master/README.md) |
| `inference/models/yolov5` | [AGPL-3.0](https://github.com/ultralytics/yolov5/blob/master/LICENSE) |
| `inference/models/yolov7` | [GPL-3.0](https://github.com/WongKinYiu/yolov7/blob/main/README.md) |
| `inference/models/yolov8` | [AGPL-3.0](https://github.com/ultralytics/ultralytics/blob/master/LICENSE) |
## 🚀 enterprise
With a Roboflow Inference Enterprise License, you can access additional Inference features, including:
- Server cluster deployment
- Active learning
- YOLOv5 and YOLOv8 model sub-license
To learn more, [contact the Roboflow team](https://roboflow.com/sales).
## 📚 documentation
Visit our [documentation](https://roboflow.github.io/inference) for usage examples and reference for Roboflow Inference.
## 💻 explore more Roboflow open source projects
| Project | Description |
| :---------------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------- |
| [supervision](https://roboflow.com/supervision) | General-purpose utilities for use in computer vision projects, from predictions filtering and display to object tracking to model evaluation. |
| [Autodistill](https://github.com/autodistill/autodistill) | Automatically label images for use in training computer vision models. |
| [Inference](https://github.com/roboflow/inference) (this project) | An easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models. |
| [Notebooks](https://roboflow.com/notebooks) | Tutorials for computer vision tasks, from training state-of-the-art models to tracking objects to counting objects in a zone. |
| [Collect](https://github.com/roboflow/roboflow-collect) | Automated, intelligent data collection powered by CLIP. |
<br>
<div align="center">
<div align="center">
<a href="https://youtube.com/roboflow">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/youtube.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949634652"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://roboflow.com">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/roboflow-app.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949746649"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://www.linkedin.com/company/roboflow-ai/">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/linkedin.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633691"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://docs.roboflow.com">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/knowledge.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949634511"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://discuss.roboflow.com">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/forum.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633584"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://blog.roboflow.com">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/blog.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633605"
width="3%"
/>
</a>
</div>
</div>
</div>