| Field | Value |
|---|---|
| Name | inference |
| Version | 0.42.1 |
| Home page | https://github.com/roboflow/inference |
| Summary | With no prior knowledge of machine learning or device-specific deployment, you can deploy a computer vision model to a range of devices and environments using Roboflow Inference. |
| Upload time | 2025-03-17 22:29:40 |
| Author | Roboflow |
| Maintainer | None |
| Requires Python | <3.13,>=3.9 |
| License | None |
| Keywords | None |
| Docs URL | None |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | No coveralls. |

<div align="center">
<p>
<a align="center" href="" target="https://inference.roboflow.com/">
<img
width="100%"
src="https://github.com/roboflow/inference/blob/main/banner.png?raw=true"
>
</a>
</p>
<br>
[notebooks](https://github.com/roboflow/notebooks) | [supervision](https://github.com/roboflow/supervision) | [autodistill](https://github.com/autodistill/autodistill) | [maestro](https://github.com/roboflow/multimodal-maestro)
<br>
[PyPI version](https://badge.fury.io/py/inference)
[downloads](https://pypistats.org/packages/inference)
[Docker pulls](https://hub.docker.com/u/roboflow)
[license](https://github.com/roboflow/inference/blob/main/LICENSE.core)
<!-- [](https://huggingface.co/spaces/Roboflow/workflows) -->
</div>
## Make Any Camera an AI Camera
Inference turns any computer or edge device into a command center for your computer vision projects.
* 🛠️ Self-host [your own fine-tuned models](https://inference.roboflow.com/quickstart/explore_models/)
* 🧠 Access the latest and greatest foundation models (like [Florence-2](https://blog.roboflow.com/florence-2/), [CLIP](https://blog.roboflow.com/openai-clip/), and [SAM2](https://blog.roboflow.com/what-is-segment-anything-2/))
* 🤝 Use [Workflows](https://inference.roboflow.com/workflows/about/) to track, count, time, measure, and visualize
* 👁️ Combine ML with traditional CV methods (like OCR, Barcode Reading, QR, and template matching)
* 📈 Monitor, record, and analyze predictions
* 🎥 [Manage](https://inference.roboflow.com/workflows/video_processing/overview/) cameras and video streams
* 📬 Send notifications when events happen
* 🛜 Connect with external systems and APIs
* 🔗 [Extend](https://inference.roboflow.com/workflows/create_workflow_block/) with your own code and models
* 🚀 Deploy production systems at scale
See [Example Workflows](https://inference.roboflow.com/workflows/gallery/) for common use-cases like detecting small objects with SAHI, multi-model consensus, active learning, reading license plates, blurring faces, background removal, and more.
[Time In Zone Workflow Example](https://github.com/user-attachments/assets/743233d9-3460-442d-83f8-20e29e76b346)
## 🔥 quickstart
[Install Docker](https://docs.docker.com/engine/install/) (and
[NVIDIA Container Toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html)
for GPU acceleration if you have a CUDA-enabled GPU). Then run
```
pip install inference-cli && inference server start --dev
```
This will pull the proper image for your machine and start it in development mode.
In development mode, a Jupyter notebook server with a quickstart guide runs on
[`localhost:9002`](http://localhost:9002). Dive in there for a whirlwind tour
of your new Inference Server's functionality!
Now you're ready to connect your camera streams and
[start building & deploying Workflows in the UI](https://app.roboflow.com/workflows)
or [interacting with your new server](https://inference.roboflow.com/workflows/create_and_run/)
via its API.
## 🛠️ build with Workflows
A key component of Inference is [Workflows](https://roboflow.com/workflows), composable blocks of common functionality that give models a common interface to make chaining and experimentation easy.

With Workflows, you can:
* Detect, classify, and segment objects in images using state-of-the-art models.
* Use Large Multimodal Models (LMMs) to make determinations at any stage in a workflow.
* Seamlessly swap out models for a given task.
* Chain models together.
* Track, count, time, measure, and visualize objects.
* Add business logic and extend functionality to work with your external systems.
Workflows allow you to extend simple model predictions to build computer vision micro-services that fit into a larger application or fully self-contained visual agents that run on a video stream.
[Learn more](https://roboflow.com/workflows), read [the Workflows docs](https://inference.roboflow.com/workflows/about/), or [start building](https://app.roboflow.com/workflows).
<table border="0" cellspacing="0" cellpadding="0" role="presentation">
<tr>
<!-- Left cell (thumbnail) -->
<td width="300" valign="top">
<a href="https://youtu.be/aPxlImNxj5A">
<img src="https://img.youtube.com/vi/aPxlImNxj5A/0.jpg"
alt="Self Checkout with Workflows" width="300" />
</a>
</td>
<!-- Right cell (title, date, description) -->
<td valign="middle">
<strong>
<a href="https://youtu.be/aPxlImNxj5A">Tutorial: Build an AI-Powered Self-Serve Checkout</a>
</strong><br />
<strong>Created: 2 Feb 2025</strong><br /><br />
Make a computer vision app that identifies different pieces of hardware, calculates
the total cost, and records the results to a database.
</td>
</tr>
<tr>
<td width="300" valign="top">
<a href="https://youtu.be/r3Ke7ZEh2Qo">
<img src="https://img.youtube.com/vi/r3Ke7ZEh2Qo/0.jpg"
alt="Workflows Tutorial" width="300" />
</a>
</td>
<td valign="middle">
<strong>
<a href="https://youtu.be/r3Ke7ZEh2Qo">
Tutorial: Intro to Workflows
</a>
</strong><br />
<strong>Created: 6 Jan 2025</strong><br /><br />
Learn how to build and deploy Workflows for common use-cases like detecting
vehicles, filtering detections, visualizing results, and calculating dwell
time on a live video stream.
</td>
</tr>
<tr>
<!-- Left cell (thumbnail) -->
<td width="300" valign="top">
<a href="https://youtu.be/tZa-QgFn7jg">
<img src="https://img.youtube.com/vi/tZa-QgFn7jg/0.jpg"
alt="Smart Parking with AI" width="300" />
</a>
</td>
<!-- Right cell (title, date, description) -->
<td valign="middle">
<strong>
<a href="https://youtu.be/tZa-QgFn7jg">Tutorial: Build a Smart Parking System</a>
</strong><br />
<strong>Created: 27 Nov 2024</strong><br /><br />
Build a smart parking lot management system using Roboflow Workflows!
This tutorial covers license plate detection with YOLOv8, object tracking
with ByteTrack, and real-time notifications with a Telegram bot.
</td>
</tr>
</table>
## 📟 connecting via api
Once you've installed Inference, your machine is a fully-featured CV center.
You can use its API to run models and workflows on images and video streams.
By default, the server is running locally on
[`localhost:9001`](http://localhost:9001).
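For a quick sanity check that the server is reachable before wiring anything else up, you can hit its built-in docs endpoint. A minimal sketch, assuming the `requests` package is installed:

```python
import requests

# The local Inference Server exposes interactive API docs at /docs (see below).
resp = requests.get("http://localhost:9001/docs", timeout=5)
resp.raise_for_status()
print(f"Inference Server is up (HTTP {resp.status_code})")
```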
To interface with your server via Python, use our SDK.
`pip install inference-sdk` then run
[an example model comparison Workflow](https://app.roboflow.com/workflows/embed/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ3b3JrZmxvd0lkIjoiSHhIODdZR0FGUWhaVmtOVWNEeVUiLCJ3b3Jrc3BhY2VJZCI6IlhySm9BRVFCQkFPc2ozMmpYZ0lPIiwidXNlcklkIjoiNXcyMFZ6UU9iVFhqSmhUanE2a2FkOXVicm0zMyIsImlhdCI6MTczNTIzNDA4Mn0.AA78pZnlivFs5pBPVX9cMigFAOIIMZk0dA4gxEF5tj4)
like this:
```python
from inference_sdk import InferenceHTTPClient
client = InferenceHTTPClient(
api_url="http://localhost:9001", # use local inference server
# api_key="<YOUR API KEY>" # optional to access your private data and models
)
result = client.run_workflow(
workspace_name="roboflow-docs",
workflow_id="model-comparison",
images={
"image": "https://media.roboflow.com/workflows/examples/bleachers.jpg"
},
parameters={
"model1": "yolov8n-640",
"model2": "yolov11n-640"
}
)
print(result)
```
In other languages, use the server's REST API;
you can access the API docs for your server at
[`/docs` (OpenAPI format)](http://localhost:9001/docs) or
[`/redoc` (Redoc Format)](http://localhost:9001/redoc).
Check out [the inference_sdk docs](https://inference.roboflow.com/inference_helpers/inference_sdk/)
to see what else you can do with your new server.
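Beyond Workflows, the SDK can also run a single model directly against the server. A minimal sketch, reusing the `yolov8n-640` alias and sample image from the Workflow example above (check the inference_sdk docs for the exact `infer` signature and supported input types):

```python
from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(
    api_url="http://localhost:9001",  # use local inference server
    # api_key="<YOUR API KEY>"  # optional to access your private data and models
)

# Run one pre-trained detection model on an image URL and inspect the raw predictions
result = client.infer(
    "https://media.roboflow.com/workflows/examples/bleachers.jpg",
    model_id="yolov8n-640",
)
print(result)
```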
## 🎥 connect to video streams
The inference server is a video processing beast. You can set it up to run
Workflows on RTSP streams, webcam devices, and more. It will handle hardware
acceleration, multiprocessing, video decoding and GPU batching to get the
most out of your hardware.
[This example workflow](https://app.roboflow.com/workflows/embed/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ3b3JrZmxvd0lkIjoiNHMzSDAzcmtyU0JiSDhFMjEzZUUiLCJ3b3Jrc3BhY2VJZCI6IlhySm9BRVFCQkFPc2ozMmpYZ0lPIiwidXNlcklkIjoiNXcyMFZ6UU9iVFhqSmhUanE2a2FkOXVicm0zMyIsImlhdCI6MTczNTIzOTk3NX0.TYdmD5AS8tbpz8AxEr5xW-05LlegK61kq-5_OReIrwc?showGraph=true&hideToolbar=false)
will watch a stream for frames that
[CLIP thinks](https://blog.roboflow.com/openai-clip/) match an
inputted text prompt.
```python
from inference_sdk import InferenceHTTPClient
import atexit
import time
max_fps = 4
client = InferenceHTTPClient(
api_url="http://localhost:9001", # use local inference server
# api_key="<YOUR API KEY>" # optional to access your private data and models
)
# Start a stream on an rtsp stream
result = client.start_inference_pipeline_with_workflow(
video_reference=["rtsp://user:password@192.168.0.100:554/"],
workspace_name="roboflow-docs",
workflow_id="clip-frames",
max_fps=max_fps,
workflows_parameters={
"prompt": "blurry", # change to look for something else
"threshold": 0.16
}
)
pipeline_id = result["context"]["pipeline_id"]
# Terminate the pipeline when the script exits
atexit.register(lambda: client.terminate_inference_pipeline(pipeline_id))
while True:
result = client.consume_inference_pipeline_result(pipeline_id=pipeline_id)
if not result["outputs"] or not result["outputs"][0]:
# still initializing
continue
output = result["outputs"][0]
is_match = output.get("is_match")
similarity = round(output.get("similarity")*100, 1)
print(f"Matches prompt? {is_match} (similarity: {similarity}%)")
time.sleep(1/max_fps)
```
Pipeline outputs can be consumed via API for downstream processing or the
Workflow can be configured to call external services with Notification blocks
(like [Email](https://inference.roboflow.com/workflows/blocks/email_notification/)
or [Twilio](https://inference.roboflow.com/workflows/blocks/twilio_sms_notification/))
or the [Webhook block](https://inference.roboflow.com/workflows/blocks/webhook_sink/).
For more info on video pipeline management, see the
[Video Processing overview](https://inference.roboflow.com/workflows/video_processing/overview/).
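The same client can also inspect and control a running pipeline from code. A hedged sketch, assuming the SDK exposes status/pause/resume helpers alongside the `terminate_inference_pipeline` and `consume_inference_pipeline_result` calls used above (verify the exact method names in the inference_sdk docs):

```python
from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(api_url="http://localhost:9001")

# pipeline_id comes from start_inference_pipeline_with_workflow(), as in the snippet above
pipeline_id = "<YOUR PIPELINE ID>"

# Query the pipeline's current state (assumed helper name)
print(client.get_inference_pipeline_status(pipeline_id=pipeline_id))

# Pause and later resume processing without tearing the pipeline down (assumed helper names)
client.pause_inference_pipeline(pipeline_id=pipeline_id)
client.resume_inference_pipeline(pipeline_id=pipeline_id)
```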
If you have a Roboflow account & have linked an API key, you can also remotely
[monitor and manage your running streams](https://app.roboflow.com/devices)
via the Roboflow UI.
## 🔑 connect to the cloud
Without an API Key, you can access a wide range of pre-trained and foundational models and run public Workflows.
Pass an optional [Roboflow API Key](https://app.roboflow.com/settings/api) to the `inference_sdk` or API to access additional features enhanced by Roboflow's Cloud
platform. When running with an API Key, usage is metered according to
Roboflow's [pricing tiers](https://roboflow.com/pricing).
| | Open Access | With API Key (Metered) |
|-------------------------|-------------|--------------|
| [Pre-Trained Models](https://inference.roboflow.com/quickstart/aliases/#supported-pre-trained-models) | ✅ | ✅ |
| [Foundation Models](https://inference.roboflow.com/foundation/about/) | ✅ | ✅ |
| [Video Stream Management](https://inference.roboflow.com/workflows/video_processing/overview/) | ✅ | ✅ |
| [Dynamic Python Blocks](https://inference.roboflow.com/workflows/custom_python_code_blocks/) | ✅ | ✅ |
| [Public Workflows](https://inference.roboflow.com/workflows/about/) | ✅ | ✅ |
| [Private Workflows](https://docs.roboflow.com/workflows/create-a-workflow) | | ✅ |
| [Fine-Tuned Models](https://roboflow.com/train) | | ✅ |
| [Universe Models](https://roboflow.com/universe) | | ✅ |
| [Active Learning](https://inference.roboflow.com/workflows/blocks/roboflow_dataset_upload/) | | ✅ |
| [Serverless Hosted API](https://docs.roboflow.com/deploy/hosted-api) | | ✅ |
| [Dedicated Deployments](https://docs.roboflow.com/deploy/dedicated-deployments) | | ✅ |
| [Commercial Model Licensing](https://roboflow.com/licensing) | | Paid |
| [Device Management](https://docs.roboflow.com/roboflow-enterprise) | | Enterprise |
| [Model Monitoring](https://docs.roboflow.com/deploy/model-monitoring) | | Enterprise |
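Unlocking the API-key column above is just a constructor argument. A minimal sketch, assuming your key is exported in a `ROBOFLOW_API_KEY` environment variable (the variable name here is only a convention for the example):

```python
import os

from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(
    api_url="http://localhost:9001",         # still pointing at your local server
    api_key=os.environ["ROBOFLOW_API_KEY"],  # enables private models, Workflows, monitoring, etc.
)
```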
## 🌩️ hosted compute
If you don't want to manage your own infrastructure for self-hosting, Roboflow offers a hosted Inference Server via [one-click Dedicated Deployments](https://docs.roboflow.com/deploy/dedicated-deployments) (CPU and GPU machines) billed hourly, or simple models and Workflows via our [serverless Hosted API](https://docs.roboflow.com/deploy/hosted-api) billed per API-call.
We offer a [generous free-tier](https://roboflow.com/pricing) to get started.
## 🖥️ run on-prem or self-hosted
Inference is designed to run on a wide range of hardware from beefy cloud servers to tiny edge devices. This lets you easily develop against your local machine or our cloud infrastructure and then seamlessly switch to another device for production deployment.
`inference server start` attempts to automatically choose the container best suited to your machine (including GPU acceleration via NVIDIA CUDA when available). Special installation notes and performance tips by device are listed below:
* [Linux](https://inference.roboflow.com/install/linux/)
* [Windows](https://inference.roboflow.com/install/windows/)
* [Mac](https://inference.roboflow.com/install/mac/)
* [NVIDIA Jetson](https://inference.roboflow.com/install/jetson/)
* [Raspberry Pi](https://inference.roboflow.com/install/raspberry-pi/)
* [Your Own Cloud](https://inference.roboflow.com/install/cloud/)
* [Other Devices](https://inference.roboflow.com/install/other/)
### ⭐️ New: Enterprise Hardware
For manufacturing and logistics use-cases, Roboflow now offers [the NVIDIA Jetson-based Flowbox](https://roboflow.com/industries/manufacturing/box), a ruggedized CV center pre-configured with Inference and optimized for running in secure networks. It has integrated support for machine vision cameras like Basler and Lucid over GigE, supports interfacing with PLCs and HMIs via OPC or MQTT, and enables enterprise device management through a DMZ. It also comes with the support of our team of computer vision experts to ensure your project is a success.
## 📚 documentation
Visit our [documentation](https://inference.roboflow.com) to explore comprehensive guides, detailed API references, and a wide array of tutorials designed to help you harness the full potential of the Inference package.
## © license
The core of Inference is licensed under Apache 2.0.
Models are subject to licensing which respects the underlying architecture. These licenses are listed in [`inference/models`](/inference/models). Paid Roboflow accounts include a commercial license for some models (see [roboflow.com/licensing](https://roboflow.com/licensing) for details).
Cloud connected functionality (like our model and Workflows registries, dataset management, model monitoring, device management, and managed infrastructure) requires a Roboflow account and API key & is metered based on usage.
Enterprise functionality is source-available in [`inference/enterprise`](/inference/enterprise/) under an [enterprise license](/inference/enterprise/LICENSE.txt) and usage in production requires an active Enterprise contract in good standing.
See the "Self Hosting and Edge Deployment" section of the [Roboflow Licensing](https://roboflow.com/licensing) documentation for more information on how Roboflow Inference is licensed.
## 🏆 contribution
We would love your input to improve Roboflow Inference! Please see our [contributing guide](https://github.com/roboflow/inference/blob/master/CONTRIBUTING.md) to get started. Thank you to all of our contributors! 🙏
<br>
<div align="center">
<div align="center">
<a href="https://youtube.com/roboflow">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/youtube.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949634652"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://roboflow.com">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/roboflow-app.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949746649"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://www.linkedin.com/company/roboflow-ai/">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/linkedin.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633691"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://docs.roboflow.com">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/knowledge.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949634511"
width="3%"
/>
</a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://disuss.roboflow.com">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/forum.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633584"
width="3%"
/>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://blog.roboflow.com">
<img
src="https://media.roboflow.com/notebooks/template/icons/purple/blog.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633605"
width="3%"
/>
</a>
</a>
</div>
</div>