vidgear

**Package metadata:**

- **Name:** vidgear
- **Version:** 0.3.3
- **Home page:** https://abhitronix.github.io/vidgear
- **Summary:** High-performance cross-platform Video Processing Python framework powerpacked with unique trailblazing features.
- **Upload time:** 2024-06-22 19:12:05
- **Author:** Abhishek Thakur
- **Requires Python:** >=3.8
- **License:** Apache License 2.0
- **Keywords:** opencv, multithreading, ffmpeg, picamera2, starlette, mss, pyzmq, dxcam, aiortc, uvicorn, uvloop, yt-dlp, asyncio, dash, hls, video processing, video stabilization, computer vision, video streaming, raspberrypi, youtube, twitch, webrtc
            <!--
===============================================
vidgear library source-code is deployed under the Apache 2.0 License:

Copyright (c) 2019 Abhishek Thakur(@abhiTronix) <abhi.una12@gmail.com>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
===============================================
-->

<h1 align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/vidgear.png" alt="VidGear" title="Logo designed by Abhishek Thakur(@abhiTronix), under CC-BY-NC-SA 4.0 License" width="80%"/>
</h1>
<h2 align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/tagline.svg" alt="VidGear tagline" width="40%"/>
</h2>

<div align="center">

[Releases][release]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[Gears][gears]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[Documentation][docs]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[Installation][installation]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[License](https://github.com/abhiTronix/vidgear#copyright)

[![Build Status][github-cli]][github-flow] [![Codecov branch][codecov]][code] [![Azure DevOps builds (branch)][azure-badge]][azure-pipeline]

[![Gitter chat][gitter-bagde]][gitter] [![Build Status][appveyor]][app] [![PyPi version][pypi-badge]][pypi]

[![Code Style][black-badge]][black]

</div>

&nbsp;

VidGear is a **High-Performance Video Processing Python Library** that provides an easy-to-use, highly extensible, thoroughly optimised **Multi-Threaded + Asyncio API Framework** on top of many state-of-the-art specialized libraries like _[OpenCV][opencv], [FFmpeg][ffmpeg], [ZeroMQ][zmq], [picamera2][picamera2], [starlette][starlette], [yt_dlp][yt_dlp], [pyscreenshot][pyscreenshot], [dxcam][dxcam], [aiortc][aiortc] and [python-mss][mss]_ serving at its backend, enabling developers to flexibly exploit their internal parameters and methods, while silently delivering **robust error-handling and real-time performance 🔥**

VidGear primarily focuses on simplicity, thereby letting programmers and software developers easily integrate and perform complex video processing tasks in just a few lines of code.

&nbsp;

The following **functional block diagram** clearly depicts the generalized functioning of VidGear APIs:

<p align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/gears_fbd.png" alt="@Vidgear Functional Block Diagram" />
</p>

&nbsp;

# Table of Contents

- [**TL;DR**](https://github.com/abhiTronix/vidgear#tldr)
- [**Getting Started**](https://github.com/abhiTronix/vidgear#getting-started)
- [**Gears: What are these?**](https://github.com/abhiTronix/vidgear#gears-what-are-these)
  - [**CamGear**](https://github.com/abhiTronix/vidgear#camgear)
  - [**PiGear**](https://github.com/abhiTronix/vidgear#pigear)
  - [**VideoGear**](https://github.com/abhiTronix/vidgear#videogear)
  - [**ScreenGear**](https://github.com/abhiTronix/vidgear#screengear)
  - [**WriteGear**](https://github.com/abhiTronix/vidgear#writegear)
  - [**StreamGear**](https://github.com/abhiTronix/vidgear#streamgear)
  - [**NetGear**](https://github.com/abhiTronix/vidgear#netgear)
  - [**WebGear**](https://github.com/abhiTronix/vidgear#webgear)
  - [**WebGear_RTC**](https://github.com/abhiTronix/vidgear#webgear_rtc)
  - [**NetGear_Async**](https://github.com/abhiTronix/vidgear#netgear_async)
- [**Contributions**](https://github.com/abhiTronix/vidgear#contributions)
- [**Donations**](https://github.com/abhiTronix/vidgear#donations)
- [**Citation**](https://github.com/abhiTronix/vidgear#citation)
- [**Copyright**](https://github.com/abhiTronix/vidgear#copyright)

&nbsp;

&nbsp;

## TL;DR

#### What is vidgear?

> _"VidGear is a cross-platform High-Performance Framework that provides a one-stop **Video-Processing** solution for building complex real-time media applications in Python."_

#### What does it do?

> _"VidGear can read, write, process, send & receive video files/frames/streams from/to various devices in real-time, and [**faster**][tqm-doc] than the underlying libraries."_

#### What is its purpose?

> _"Write Less and Accomplish More"_ — **VidGear's Motto**

> _"Built with simplicity in mind, VidGear lets programmers and software developers easily integrate and perform **Complex Video-Processing Tasks** in their existing or new applications without wading through hefty documentation, in just a [**few lines of code**][switch_from_cv]. Beneficial whether you're new to programming with Python or already a pro at it."_

&nbsp;

&nbsp;

## Getting Started

If this is your first time using VidGear, head straight to the [Installation >>][installation] to install VidGear.

Once you have VidGear installed, **check out its well-documented [Function-Specific Gears >>][gears]**

Also, if you're already familiar with the [OpenCV][opencv] library, see [Switching from OpenCV Library >>][switch_from_cv]

Or, if you're just getting started with OpenCV-Python programming, refer to this [FAQ >>](https://abhitronix.github.io/vidgear/latest/help/general_faqs/#im-new-to-python-programming-or-its-usage-in-opencv-library-how-to-use-vidgear-in-my-projects)

&nbsp;

&nbsp;

## Gears: What are these?

> **VidGear is built with multiple APIs a.k.a [Gears][gears], each with some unique functionality.**

Each API is designed exclusively to handle/control/process different data-specific & device-specific video streams, network streams, and media encoders/decoders. These APIs provide the user an easy-to-use, dynamic, extensible Multi-Threaded + Asyncio-optimized internal layer above state-of-the-art libraries to work with, while silently delivering robust error-handling.

**These Gears can be classified as follows:**

**A. Video-Capture Gears:**

- [**CamGear:**](https://github.com/abhiTronix/vidgear#camgear) Multi-Threaded API targeting various IP-USB-Cameras/Network-Streams/Streaming-Sites-URLs.
- [**PiGear:**](https://github.com/abhiTronix/vidgear#pigear) Multi-Threaded API targeting various Camera Modules and _(limited)_ USB cameras on Raspberry Pis 🍇.
- [**ScreenGear:**](https://github.com/abhiTronix/vidgear#screengear) High-performance API targeting rapid Screencasting Capabilities.
- [**VideoGear:**](https://github.com/abhiTronix/vidgear#videogear) Common Video-Capture API with internal [Video Stabilizer](https://abhitronix.github.io/vidgear/latest/gears/stabilizer/overview/) wrapper.

**B. Video-Writer Gears:**

- [**WriteGear:**](https://github.com/abhiTronix/vidgear#writegear) Handles Lossless Video-Writer for file/stream/frames Encoding and Compression.

**C. Streaming Gears:**

- [**StreamGear**](https://github.com/abhiTronix/vidgear#streamgear): Handles Transcoding of High-Quality, Dynamic & Adaptive Streaming Formats.

- **Asynchronous I/O Streaming Gear:**

  - [**WebGear:**](https://github.com/abhiTronix/vidgear#webgear) ASGI Video-Server that broadcasts Live MJPEG-Frames to any web-browser on the network.
  - [**WebGear_RTC:**](https://github.com/abhiTronix/vidgear#webgear_rtc) Real-time Asyncio WebRTC media server for streaming directly to peer clients over the network.

**D. Network Gears:**

- [**NetGear:**](https://github.com/abhiTronix/vidgear#netgear) Handles High-Performance Video-Frames & Data Transfer between interconnecting systems over the network.

- **Asynchronous I/O Network Gear:**

  - [**NetGear_Async:**](https://github.com/abhiTronix/vidgear#netgear_async) Immensely Memory-Efficient Asyncio Video-Frames Network Messaging Framework.

&nbsp;

&nbsp;

## CamGear

<p align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/camgear.png" alt="CamGear Functional Block Diagram" width="45%"/>
</p>

> _CamGear can grab ultra-fast frames from a diverse range of file-formats/devices/streams, which includes almost any IP-USB Cameras, multimedia video file-formats ([*up to 4k tested*][test-4k]), various network stream protocols such as `http(s), rtp, rtsp, rtmp, mms, etc.`, and GStreamer's pipelines, plus direct support for live video streaming sites like YouTube, Twitch, LiveStream, Dailymotion etc._

CamGear provides a flexible, high-level, multi-threaded framework around OpenCV's [VideoCapture class][opencv-vc] with access to almost all of its available parameters. CamGear internally implements the [`yt_dlp`][yt_dlp] backend class for seamlessly pipelining live video-frames and metadata from various streaming services like [YouTube][youtube-doc], [Twitch][piping-live-videos], and [many more >>](https://github.com/yt-dlp/yt-dlp/blob/master/supportedsites.md#supported-sites). Furthermore, its framework relies exclusively on [**Threaded Queue mode**][tqm-doc] for ultra-fast, error-free, and synchronized video-frame handling.

### CamGear API Guide:

[**>>> Usage Guide**][camgear-doc]

&nbsp;

&nbsp;

## VideoGear

> _VideoGear API provides a special internal wrapper around VidGear's exclusive [**Video Stabilizer**][stabilizer-doc] class._

VideoGear also acts as a Common Video-Capture API that provides internal access for both [CamGear](https://github.com/abhiTronix/vidgear#camgear) and [PiGear](https://github.com/abhiTronix/vidgear#pigear) APIs and their parameters with an exclusive `enablePiCamera` boolean flag.

VideoGear is ideal when you need to switch to different video sources without changing your code much. Also, it enables easy stabilization for various video-streams _(real-time or not)_ with minimum effort and writing way fewer lines of code.

**Below is a snapshot of a VideoGear Stabilizer in action (_See its detailed usage [here][stabilizer-doc-ex]_):**

<p align="center">
  <img src="https://user-images.githubusercontent.com/34266896/211500670-b3aaf4db-a52a-4836-a03c-c2c17b971feb.gif" alt="VideoGear Stabilizer in action!"/>
  <br>
  <sub><i>Original Video Courtesy <a href="http://liushuaicheng.org/SIGGRAPH2013/database.html" title="opensourced video samples database">@SIGGRAPH2013</a></i></sub>
</p>

**Code to generate above result:**

```python
# import required libraries
from vidgear.gears import VideoGear
import numpy as np
import cv2

# open any valid video stream with stabilization enabled(`stabilize = True`)
stream_stab = VideoGear(source="test.mp4", stabilize=True).start()

# open same stream without stabilization for comparison
stream_org = VideoGear(source="test.mp4").start()

# loop over
while True:

    # read stabilized frames
    frame_stab = stream_stab.read()

    # check for stabilized frame if Nonetype
    if frame_stab is None:
        break

    # read un-stabilized frame
    frame_org = stream_org.read()

    # concatenate both frames
    output_frame = np.concatenate((frame_org, frame_stab), axis=1)

    # put text over concatenated frame
    cv2.putText(
        output_frame,
        "Before",
        (10, output_frame.shape[0] - 10),
        cv2.FONT_HERSHEY_SIMPLEX,
        0.6,
        (0, 255, 0),
        2,
    )
    cv2.putText(
        output_frame,
        "After",
        (output_frame.shape[1] // 2 + 10, output_frame.shape[0] - 10),
        cv2.FONT_HERSHEY_SIMPLEX,
        0.6,
        (0, 255, 0),
        2,
    )

    # Show output window
    cv2.imshow("Stabilized Frame", output_frame)

    # check for 'q' key if pressed
    key = cv2.waitKey(1) & 0xFF
    if key == ord("q"):
        break

# close output window
cv2.destroyAllWindows()

# safely close both video streams
stream_org.stop()
stream_stab.stop()
```

### VideoGear API Guide:

[**>>> Usage Guide**][videogear-doc]

&nbsp;

&nbsp;

## PiGear

<p align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/picam2.webp" alt="PiGear" width="50%" />
</p>

> _PiGear is a specialized API similar to the [CamGear API](https://github.com/abhiTronix/vidgear#camgear) but optimized for **Raspberry Pi :grapes: Boards**, offering comprehensive **support for camera modules** (e.g., [OmniVision OV5647 Camera Module][ov5647-picam], [Sony IMX219 Camera Module][imx219-picam]), along with **limited compatibility for USB cameras**._

PiGear implements a seamless and robust wrapper around the [picamera2][picamera2] Python library, simplifying integration with minimal code changes and ensuring a smooth transition for developers already familiar with the Picamera2 API. Under the hood, PiGear leverages the `libcamera` API with multi-threading, providing high performance :fire:, enhanced control, and extended functionality for Raspberry Pi camera modules.

PiGear handles common configuration parameters and non-standard settings for various camera types, simplifying the integration process. PiGear currently supports Picamera2 API parameters such as `sensor`, `controls`, `transform`, and `format`, with internal type and sanity checks for robust performance.

While primarily focused on Raspberry Pi camera modules, PiGear also provides **basic functionality for USB webcams** _(only with the Picamera2 API)_, along with the ability to accurately differentiate between USB and Raspberry Pi cameras using metadata.

PiGear seamlessly falls back to the legacy [picamera][picamera] library if the `picamera2` library is unavailable, ensuring backward compatibility. For this, PiGear also provides a flexible multi-threaded framework around the complete `picamera` API, allowing developers to effortlessly exploit a wide range of parameters, such as `brightness`, `saturation`, `sensor_mode`, `iso`, `exposure`, and more.

Furthermore, PiGear supports the use of multiple camera modules, including those found on Raspberry Pi Compute Module IO boards and USB cameras _(only with Picamera2 API)_.

Best of all, PiGear contains a **Threaded Internal Timer** that silently keeps active track of any frozen threads or hardware failures and exits safely if one occurs. That means if you're running the PiGear API in your script and someone accidentally pulls the Camera-Module cable out, instead of going into a possible kernel panic, the API will exit safely to save resources.

**Code to open picamera2 stream with variable parameters in PiGear API:**

```python
# import required libraries
from vidgear.gears import PiGear
from libcamera import Transform
import cv2

# formulate various Picamera2 API 
# configurational parameters
options = {
    "controls": {"Brightness": 0.5, "ExposureValue": 2.0},
    "transform": Transform(hflip=1),
    "sensor": {"output_size": (480, 320)},  # will override `resolution`
    "format": "RGB888", # 8-bit BGR
}

# open pi video stream with defined parameters
stream = PiGear(resolution=(640, 480), framerate=60, logging=True, **options).start()

# loop over
while True:

    # read frames from stream
    frame = stream.read()

    # check for frame if Nonetype
    if frame is None:
        break

    # {do something with the frame here}

    # Show output window
    cv2.imshow("Output Frame", frame)

    # check for 'q' key if pressed
    key = cv2.waitKey(1) & 0xFF
    if key == ord("q"):
        break

# close output window
cv2.destroyAllWindows()

# safely close video stream
stream.stop()
```

### PiGear API Guide:

[**>>> Usage Guide**][pigear-doc]

&nbsp;

&nbsp;

## ScreenGear

> _ScreenGear is designed exclusively for rapid Screencasting, which means it can grab frames from your monitor in real-time, either from a defined area on the computer screen or full-screen, with negligible latency. ScreenGear also seamlessly supports frame capture from multiple monitors as well as multiple backends._

ScreenGear implements a Lightning-Fast API wrapper around the [**dxcam**][dxcam], [**pyscreenshot**][pyscreenshot] & [**python-mss**][mss] python libraries and also supports easy and flexible direct manipulation of their internal parameters.

**Below is a snapshot of a ScreenGear API in action:**

<p align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/gifs/screengear.gif" alt="ScreenGear in action!"/>
</p>

**Code to generate the above results:**

```python
# import required libraries
from vidgear.gears import ScreenGear
import cv2

# open video stream with default parameters
stream = ScreenGear().start()

# loop over
while True:

    # read frames from stream
    frame = stream.read()

    # check for frame if Nonetype
    if frame is None:
        break

    # {do something with the frame here}

    # Show output window
    cv2.imshow("Output Frame", frame)

    # check for 'q' key if pressed
    key = cv2.waitKey(1) & 0xFF
    if key == ord("q"):
        break

# close output window
cv2.destroyAllWindows()

# safely close video stream
stream.stop()
```

### ScreenGear API Guide:

[**>>> Usage Guide**][screengear-doc]

&nbsp;

&nbsp;

## WriteGear

<p align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/writegear.png" alt="WriteGear Functional Block Diagram" width="70%" />
</p>

> _WriteGear handles various powerful Video-Writer Tools that provide us the freedom to do almost anything imaginable with multimedia data._

WriteGear API provides a complete, flexible, and robust wrapper around [**FFmpeg**][ffmpeg], a leading multimedia framework. WriteGear can process real-time frames into a lossless compressed video-file with any suitable specifications _(such as `bitrate, codec, framerate, resolution, subtitles, etc.`)_.

WriteGear also supports streaming with traditional protocols such as [RTSP/RTP][rtsp-ex] and RTMP. It is powerful enough to perform complex tasks such as [Live-Streaming][live-stream] _(e.g., to Twitch or YouTube)_ and [Multiplexing Video-Audio][live-audio-doc] with real-time frames in just a few lines of code.

Best of all, WriteGear grants users the complete freedom to play with any FFmpeg parameter with its exclusive **Custom Commands function** _(see this [doc][custom-command-doc])_ without relying on any third-party API.

In addition to this, WriteGear also provides flexible access to [**OpenCV's VideoWriter API**][opencv-writer] tools for video-frames encoding without compression.

**WriteGear primarily operates in the following two modes:**

- **Compression Mode:** In this mode, WriteGear utilizes powerful [**FFmpeg**][ffmpeg] inbuilt encoders to encode lossless multimedia files. This mode provides us the ability to exploit almost any parameter available within FFmpeg, effortlessly and flexibly, and while doing that it robustly handles all errors/warnings quietly. **You can find more about this mode [here >>][cm-writegear-doc]**

- **Non-Compression Mode:** In this mode, WriteGear utilizes basic [**OpenCV's inbuilt VideoWriter API**][opencv-vw] tools. This mode also supports all parameter transformations available within OpenCV's VideoWriter API, but it lacks the ability to manipulate encoding parameters and other important features like video compression, audio encoding, etc. **You can learn about this mode [here >>][ncm-writegear-doc]**

### WriteGear API Guide:

[**>>> Usage Guide**][writegear-doc]

&nbsp;

&nbsp;

## StreamGear

<p align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/streamgear_flow.webp" alt="StreamGear Flow Diagram" width=80%/>
</p>

> _StreamGear streamlines and simplifies the transcoding workflow to generate Ultra-Low Latency, High-Quality, Dynamic & Adaptive Streaming Formats like MPEG-DASH and Apple HLS with just a few lines of Python code, allowing developers to focus on their application logic rather than dealing with the complexities of transcoding and chunking media files._

StreamGear API provides a standalone, highly extensible, and flexible wrapper around the [**FFmpeg**](https://ffmpeg.org/) multimedia framework for generating chunk-encoded media segments from your multimedia content effortlessly.

With StreamGear, you can transcode source video/audio files and real-time video frames into a sequence of multiple smaller chunks/segments of suitable lengths. These segments facilitate streaming at different quality levels _(bitrates or spatial resolutions)_ and allow for seamless switching between quality levels during playback based on available bandwidth. You can serve these segments on a web server, making them easily accessible via standard **HTTP GET** requests.

StreamGear currently supports both [**MPEG-DASH**](https://www.encoding.com/mpeg-dash/) _(Dynamic Adaptive Streaming over HTTP, ISO/IEC 23009-1)_ and [**Apple HLS**](https://developer.apple.com/documentation/http_live_streaming) _(HTTP Live Streaming)_.

Additionally, StreamGear generates a manifest file _(such as MPD for DASH)_ or a master playlist _(such as M3U8 for Apple HLS)_ alongside the segments. These files contain essential segment information, _including timing, URLs, and media characteristics like video resolution and adaptive bitrates_. They are provided to the client before the streaming session begins.

**StreamGear primarily works in two Independent Modes for transcoding which serves different purposes:**

- **Single-Source Mode 💿 :** In this mode, StreamGear **transcodes an entire video file** _(as opposed to frame-by-frame)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well when you're transcoding long-duration lossless videos _(with audio)_ for streaming that require no interruptions. But on the downside, the provided source cannot be flexibly manipulated or transformed before it is sent to the FFmpeg pipeline for processing. **_Learn more about this mode [here >>][ss-mode-doc]_**

- **Real-time Frames Mode 🎞️ :** In this mode, StreamGear directly **transcodes frame-by-frame** _(as opposed to an entire video file)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well when you want to flexibly manipulate or transform [`numpy.ndarray`](https://numpy.org/doc/1.18/reference/generated/numpy.ndarray.html#numpy-ndarray) frames in real-time before sending them to the FFmpeg pipeline for processing. But on the downside, audio has to be added manually _(as a separate source)_ for streams. **_Learn more about this mode [here >>][rtf-mode-doc]_**

### StreamGear API Guide:

[**>>> Usage Guide**][streamgear-doc]

&nbsp;

&nbsp;

## NetGear

<p align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/netgear.png" alt="NetGear API" width=65%/>
</p>

> _NetGear is exclusively designed to transfer video-frames & data synchronously between interconnecting systems over the network in real-time._

NetGear implements a high-level wrapper around the [**PyZMQ**][pyzmq] python library that contains python bindings for [**ZeroMQ**][zmq] - a high-performance asynchronous distributed messaging library.

NetGear seamlessly supports additional [**bidirectional data transmission**][netgear_bidata_doc] between receiver(client) and sender(server) while transferring video-frames all in real-time.

NetGear can also robustly handle [**Multiple Server-Systems**][netgear_multi_server_doc] and [**Multiple Client-Systems**][netgear_multi_client_doc] at once, thereby providing seamless exchange of video-frames & data between multiple devices across the network at the same time.

NetGear allows remote connections over an [**SSH Tunnel**][netgear_sshtunnel_doc], letting you connect the NetGear client and server via a secure SSH connection over an untrusted network and access intranet services across firewalls.

NetGear also enables real-time [**JPEG Frame Compression**][netgear_compression_doc] capabilities for boosting performance significantly while sending video-frames over the network in real-time.

For security, NetGear implements easy access to ZeroMQ's powerful, smart & secure Security Layers that enable [**Strong encryption on data**][netgear_security_doc] and unbreakable authentication between the Server and the Client with the help of custom certificates.

**NetGear as of now seamlessly supports three ZeroMQ messaging patterns:**

- [**`zmq.PAIR`**][zmq-pair] _(ZMQ Pair Pattern)_
- [**`zmq.REQ/zmq.REP`**][zmq-req-rep] _(ZMQ Request/Reply Pattern)_
- [**`zmq.PUB/zmq.SUB`**][zmq-pub-sub] _(ZMQ Publish/Subscribe Pattern)_

Supported protocols are: `tcp` and `ipc`.

### NetGear API Guide:

[**>>> Usage Guide**][netgear-doc]

&nbsp;

&nbsp;

## WebGear

> _WebGear is a powerful [ASGI](https://asgi.readthedocs.io/en/latest/) Video-Broadcaster API ideal for transmitting [Motion-JPEG](https://en.wikipedia.org/wiki/Motion_JPEG)-frames from a single source to multiple recipients via the browser._

WebGear API works on [**Starlette**](https://www.starlette.io/)'s ASGI application and provides a highly extensible and flexible async wrapper around its complete framework. WebGear can flexibly interact with Starlette's ecosystem of shared middleware, mountable applications, [Response classes](https://www.starlette.io/responses/), [Routing tables](https://www.starlette.io/routing/), [Static Files](https://www.starlette.io/staticfiles/), [Templating engine(with Jinja2)](https://www.starlette.io/templates/), etc.

WebGear API uses an intraframe-only compression scheme under the hood where the sequence of video-frames are first encoded as JPEG-DIB (JPEG with Device-Independent Bit compression) and then streamed over HTTP using Starlette's Multipart [Streaming Response](https://www.starlette.io/responses/#streamingresponse) and a [Uvicorn](https://www.uvicorn.org/#quickstart) ASGI Server. This method imposes lower processing and memory requirements, but the quality is not the best, since JPEG compression is not very efficient for motion video.

In layman's terms, WebGear acts as a powerful **Video Broadcaster** that transmits live video-frames to any web-browser in the network. Additionally, WebGear API also provides a special internal wrapper around [VideoGear](https://github.com/abhiTronix/vidgear#videogear), which itself provides internal access to both [CamGear](https://github.com/abhiTronix/vidgear#camgear) and [PiGear](https://github.com/abhiTronix/vidgear#pigear) APIs, thereby granting it exclusive power of broadcasting frames from any incoming stream. It also allows us to define our custom Server as the source to transform frames easily before sending them across the network (see this [doc][webgear-cs] example).

**Below is a snapshot of a WebGear Video Server in action on Chrome browser:**

<p align="center">
  <img src="https://user-images.githubusercontent.com/34266896/211500287-0c12bfdf-2cbb-417a-9f3c-7a8b03ca5b6a.gif" alt="WebGear in action!" width="80%" />
  <br>
  <sub><i>WebGear Video Server at <a href="http://localhost:8000/" title="default address">http://localhost:8000/</a> address.</i></sub>
</p>

**Code to generate the above result:**

```python
# import required libraries
import uvicorn
from vidgear.gears.asyncio import WebGear

# various performance tweaks
options = {
    "frame_size_reduction": 40,
    "jpeg_compression_quality": 80,
    "jpeg_compression_fastdct": True,
    "jpeg_compression_fastupsample": False,
}

# initialize WebGear app
web = WebGear(source="foo.mp4", logging=True, **options)

# run this app on Uvicorn server at address http://localhost:8000/
uvicorn.run(web(), host="localhost", port=8000)

# close app safely
web.shutdown()
```

### WebGear API Guide:

[**>>> Usage Guide**][webgear-doc]

&nbsp;

&nbsp;

## WebGear_RTC

> _WebGear_RTC is similar to the [WebGear API](https://github.com/abhiTronix/vidgear#webgear) in many aspects but utilizes [WebRTC][webrtc] technology under the hood instead of Motion JPEG, which makes it suitable for building powerful video-streaming solutions for all modern browsers as well as native clients available on all major platforms._

WebGear_RTC is implemented with the help of [**aiortc**][aiortc] library which is built on top of asynchronous I/O framework for Web Real-Time Communication (WebRTC) and Object Real-Time Communication (ORTC) and supports many features like SDP generation/parsing, Interactive Connectivity Establishment with half-trickle and mDNS support, DTLS key and certificate generation, DTLS handshake, etc.

WebGear_RTC can handle [multiple consumers][webgear_rtc-mc] seamlessly and provides native support for ICE _(Interactive Connectivity Establishment)_ protocol, STUN _(Session Traversal Utilities for NAT)_, and TURN _(Traversal Using Relays around NAT)_ servers that help us to seamlessly establish direct media connection with the remote peers for uninterrupted data flow. It also allows us to define our custom streaming class with a suitable source to transform frames easily before sending them across the network (see this [doc][webgear_rtc-cs] example).

WebGear_RTC API works in conjunction with [**Starlette**][starlette]'s ASGI application and provides easy access to its complete framework. WebGear_RTC can also flexibly interact with Starlette's ecosystem of shared middleware, mountable applications, [Response classes](https://www.starlette.io/responses/), [Routing tables](https://www.starlette.io/routing/), [Static Files](https://www.starlette.io/staticfiles/), [Templating engine(with Jinja2)](https://www.starlette.io/templates/), etc.

Additionally, WebGear_RTC API also provides a special internal wrapper around [VideoGear](https://github.com/abhiTronix/vidgear#videogear), which itself provides internal access to both [CamGear](https://github.com/abhiTronix/vidgear#camgear) and [PiGear](https://github.com/abhiTronix/vidgear#pigear) APIs.

**Below is a snapshot of a WebGear_RTC Media Server in action on Chrome browser:**

<p align="center">
  <img src="https://user-images.githubusercontent.com/34266896/211502451-6dc1fb24-2472-4e95-b38e-cab252071cc7.gif" alt="WebGear_RTC in action!" width="80%" />
  <br>
  <sub><i>WebGear_RTC Video Server at <a href="http://localhost:8000/" title="default address">http://localhost:8000/</a> address.</i></sub>
</p>

**Code to generate the above result:**

```python
# import required libraries
import uvicorn
from vidgear.gears.asyncio import WebGear_RTC

# various performance tweaks
options = {
    "frame_size_reduction": 30,
}

# initialize WebGear_RTC app
web = WebGear_RTC(source="foo.mp4", logging=True, **options)

# run this app on Uvicorn server at address http://localhost:8000/
uvicorn.run(web(), host="localhost", port=8000)

# close app safely
web.shutdown()
```

### WebGear_RTC API Guide:

[**>>> Usage Guide**][webgear_rtc-doc]

&nbsp;

&nbsp;

## NetGear_Async

<p align="center">
  <img src="https://abhitronix.github.io/vidgear/latest/assets/images/zmq_asyncio.png" alt="NetGear_Async" width="70%"/>
</p>

> _NetGear_Async can deliver the same performance as the [NetGear API](https://github.com/abhiTronix/vidgear#netgear) at about one-third the memory consumption, and also provides complete server-client handling with various options to use variable protocols/patterns similar to NetGear, but it lacks in terms of flexibility as it supports only a few of [NetGear's Exclusive Modes][netgear-exm]._

NetGear_Async is built on [`zmq.asyncio`][asyncio-zmq] and powered by [**`uvloop`**][uvloop], a high-performance asyncio event loop, to achieve unmatched, lag-free video streaming over the network with minimal resource constraints. NetGear_Async can transfer thousands of frames in just a few seconds without putting any significant load on your system.

NetGear_Async provides complete server-client handling and options to use variable protocols/patterns similar to the [NetGear API](https://github.com/abhiTronix/vidgear#netgear). Furthermore, NetGear_Async lets you define a custom Server as the source, making it easy to transform frames before sending them across the network _(see this [doc][netgear_async-cs] example)_.

NetGear_Async also supports additional [**bidirectional data transmission**][btm_netgear_async] between the receiver (client) and sender (server) while transferring video frames. This makes it easy to build complex applications such as [Real-Time Video Chat][rtvc] in just a few lines of code.
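For intuition only, here is a minimal pure-`asyncio` sketch of that bidirectional pattern (no vidgear or ZeroMQ involved; `asyncio.Queue` objects stand in for the network sockets, and byte strings stand in for video frames): frames flow from server to client while small data replies travel back the other way.

```python
import asyncio


async def server(frames, to_client, to_server):
    # push each "frame" with metadata, then await the client's data reply
    for i, frame in enumerate(frames):
        await to_client.put({"id": i, "frame": frame})
        reply = await to_server.get()  # data flowing back from the client
        assert reply == f"ack {i}"
    await to_client.put(None)  # sentinel: stream finished


async def client(to_client, to_server, received):
    # consume frames and answer each one over the back-channel
    while True:
        msg = await to_client.get()
        if msg is None:
            break
        received.append(msg["frame"])
        await to_server.put(f"ack {msg['id']}")


async def main():
    to_client, to_server = asyncio.Queue(), asyncio.Queue()
    received = []
    frames = [b"frame-0", b"frame-1", b"frame-2"]  # stand-ins for video frames
    await asyncio.gather(
        server(frames, to_client, to_server),
        client(to_client, to_server, received),
    )
    return received


received = asyncio.run(main())
print(received)  # [b'frame-0', b'frame-1', b'frame-2']
```

NetGear_Async automates exactly this two-way exchange over real ZeroMQ sockets, so you never manage the queues or event loop coordination yourself.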

NetGear_Async currently supports all four ZeroMQ messaging patterns:

- [**`zmq.PAIR`**][zmq-pair] _(ZMQ Pair Pattern)_
- [**`zmq.REQ/zmq.REP`**][zmq-req-rep] _(ZMQ Request/Reply Pattern)_
- [**`zmq.PUB/zmq.SUB`**][zmq-pub-sub] _(ZMQ Publish/Subscribe Pattern)_
- [**`zmq.PUSH/zmq.PULL`**][zmq-pull-push] _(ZMQ Push/Pull Pattern)_

The supported protocols are `tcp` and `ipc`.
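For illustration, initializing a NetGear_Async server with an explicit pattern and protocol might look like the following configuration sketch (the source file, address, and port are placeholder values):

```python
# import required libraries
from vidgear.gears.asyncio import NetGear_Async

# pattern values follow NetGear's convention:
# 0 = zmq.PAIR, 1 = zmq.REQ/zmq.REP, 2 = zmq.PUB/zmq.SUB, 3 = zmq.PUSH/zmq.PULL
server = NetGear_Async(
    source="test.mp4",       # placeholder video source
    address="192.168.1.10",  # placeholder receiver address
    port="5454",             # placeholder port
    protocol="tcp",          # or "ipc"
    pattern=2,               # zmq.PUB/zmq.SUB
    logging=True,
).launch()
```

See the [NetGear_Async usage guide][netgear_async-doc] for complete, runnable server and client examples.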

### NetGear_Async API Guide:

[**>>> Usage Guide**][netgear_async-doc]

&nbsp;

&nbsp;

# Contributions

<div align="center">
   <h3>👑 Contributor Hall of Fame 👑</h3><br>
   <a href="https://github.com/abhiTronix/vidgear/graphs/contributors">
    <img src="https://contributors-img.web.app/image?repo=abhiTronix/vidgear"/><br><br>
  </a>
  <p><i>We're happy to meet new contributors💗</i></p><br>
</div>

We welcome your contributions to help us improve and extend this project. If you want to get involved with VidGear development, check out the **[Contribution Guidelines ▶️][contribute]**.

We're offering support for VidGear on [**Gitter Community Channel**](https://gitter.im/vidgear/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge). Come and join the conversation over there!

&nbsp;

&nbsp;

# Donations

<div align="center">
   <img src="https://abhitronix.github.io/vidgear/latest/assets/images/help_us.png" alt="Help VidGear" width="50%" />
   <p><i>VidGear is free and open source and will always remain so. ❤️</i></p>
</div>

I develop and maintain VidGear in my own free time. But so much more needs to be done, and I need your help to do it. For just the price of a cup of coffee, you can make a difference 🙂

<a href='https://ko-fi.com/W7W8WTYO' target='_blank'><img height='36' style='border:0px;height:36px;' src='https://cdn.ko-fi.com/cdn/kofi1.png?v=3' border='0' alt='Buy Me a Coffee at ko-fi.com' /></a>

&nbsp;

&nbsp;

# Citation

Here is a BibTeX entry you can use to cite this project in a publication:

[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.8332548.svg)](https://doi.org/10.5281/zenodo.8332548)

```BibTeX
@software{vidgear,
  author       = {Abhishek Thakur and
                  Zoe Papakipos and
                  Christian Clauss and
                  Christian Hollinger and
                  Ian Max Andolina and
                  Vincent Boivin and
                  Kyle Ahn and
                  freol35241 and
                  Benjamin Lowe and
                  Mickaël Schoentgen and
                  Renaud Bouckenooghe and
                  Ibtsam Ahmad},
  title        = {abhiTronix/vidgear: VidGear Stable v0.3.2},
  month        = sep,
  year         = 2023,
  publisher    = {Zenodo},
  version      = {vidgear-0.3.2},
  doi          = {10.5281/zenodo.8332548},
  url          = {https://doi.org/10.5281/zenodo.8332548}
}
```

&nbsp;

&nbsp;

# Copyright

**Copyright (c) abhiTronix 2019**

This library is released under the **[Apache 2.0 License][license]**.

<!--
Badges
-->

[appveyor]: https://img.shields.io/appveyor/ci/abhitronix/vidgear.svg?style=for-the-badge&logo=appveyor
[codecov]: https://img.shields.io/codecov/c/github/abhiTronix/vidgear/testing?logo=codecov&style=for-the-badge
[github-cli]: https://img.shields.io/github/actions/workflow/status/abhiTronix/vidgear/.github/workflows/ci_linux.yml?style=for-the-badge&logo=data:image/svg%2bxml;base64,PHN2ZyB3aWR0aD0iNDgiIGhlaWdodD0iNDgiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PHBhdGggY2xpcC1ydWxlPSJldmVub2RkIiBkPSJNMTAgMWE5IDkgMCAwMTkgOSA5IDkgMCAwMS05IDkgOSA5IDAgMDEtOS05IDkgOSAwIDAxOS05ek0yMyAxOWE2IDYgMCAxMTAgMTIgNiA2IDAgMDEwLTEyek0yMyAzNWE2IDYgMCAxMTAgMTIgNiA2IDAgMDEwLTEyeiIgc3Ryb2tlPSJ2YXIoLS1jb2xvci1tYXJrZXRpbmctaWNvbi1wcmltYXJ5LCAjMjA4OEZGKSIgc3Ryb2tlLXdpZHRoPSIyIiBzdHJva2UtbGluZWNhcD0icm91bmQiIHN0cm9rZS1saW5lam9pbj0icm91bmQiLz48cGF0aCBjbGlwLXJ1bGU9ImV2ZW5vZGQiIGQ9Ik00MSAzNWE2IDYgMCAxMTAgMTIgNiA2IDAgMDEwLTEyeiIgc3Ryb2tlPSJ2YXIoLS1jb2xvci1tYXJrZXRpbmctaWNvbi1zZWNvbmRhcnksICM3OUI4RkYpIiBzdHJva2Utd2lkdGg9IjIiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIvPjxwYXRoIGQ9Ik0yNS4wMzcgMjMuNjA3bC0zLjA3IDMuMDY1LTEuNDktMS40ODUiIHN0cm9rZT0idmFyKC0tY29sb3ItbWFya2V0aW5nLWljb24tcHJpbWFyeSwgIzIwODhGRikiIHN0cm9rZS13aWR0aD0iMiIgc3Ryb2tlLWxpbmVjYXA9InJvdW5kIiBzdHJva2UtbGluZWpvaW49InJvdW5kIi8+PHBhdGggY2xpcC1ydWxlPSJldmVub2RkIiBkPSJNNDEgMTlhNiA2IDAgMTEwIDEyIDYgNiAwIDAxMC0xMnoiIHN0cm9rZT0idmFyKC0tY29sb3ItbWFya2V0aW5nLWljb24tcHJpbWFyeSwgIzIwODhGRikiIHN0cm9rZS13aWR0aD0iMiIgc3Ryb2tlLWxpbmVjYXA9InJvdW5kIiBzdHJva2UtbGluZWpvaW49InJvdW5kIi8+PHBhdGggZD0iTTQzLjAzNiAyMy42MDdsLTMuMDY5IDMuMDY1LTEuNDktMS40ODVNNyA2LjgxMmExIDEgMCAwMTEuNTMzLS44NDZsNS4xMTMgMy4yMmExIDEgMCAwMS0uMDA2IDEuNjk3bC01LjExMyAzLjE3QTEgMSAwIDAxNyAxMy4yMDNWNi44MTN6TTkgMTl2MTVjMCAzLjg2NiAzLjE3NyA3IDcgN2gxIiBzdHJva2U9InZhcigtLWNvbG9yLW1hcmtldGluZy1pY29uLXByaW1hcnksICMyMDg4RkYpIiBzdHJva2Utd2lkdGg9IjIiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIvPjxwYXRoIGQ9Ik0xNi45NDkgMjZhMSAxIDAgMTAwLTJ2MnpNOCAxOS4wMzVBNi45NjUgNi45NjUgMCAwMDE0Ljk2NSAyNnYtMkE0Ljk2NSA0Ljk2NSAwIDAxMTAgMTkuMDM1SDh6TTE0Ljk2NSAyNmgxLjk4NHYtMmgtMS45ODR2MnoiIGZpbGw9InZhcigtLWNvbG9yLW1hcmtldGluZy1pY29uLXByaW1hcnksICMyMDg4RkYpIi8+PHBhdGggZD0iTTI5L
jA1NSAyNWg1Ljk0NCIgc3Ryb2tlPSJ2YXIoLS1jb2xvci1tYXJrZXRpbmctaWNvbi1wcmltYXJ5LCAjMjA4OEZGKSIgc3Ryb2tlLXdpZHRoPSIyIiBzdHJva2UtbGluZWNhcD0icm91bmQiIHN0cm9rZS1saW5lam9pbj0icm91bmQiLz48cGF0aCBmaWxsLXJ1bGU9ImV2ZW5vZGQiIGNsaXAtcnVsZT0iZXZlbm9kZCIgZD0iTTIxIDQwYTEgMSAwIDExLS4wMDEgMi4wMDFBMSAxIDAgMDEyMSA0MHpNMjUgNDBhMSAxIDAgMTEtLjAwMSAyLjAwMUExIDEgMCAwMTI1IDQweiIgZmlsbD0idmFyKC0tY29sb3ItbWFya2V0aW5nLWljb24tc2Vjb25kYXJ5LCAjNzlCOEZGKSIvPjxwYXRoIGQ9Ik0zNC4wMDUgNDEuMDA3bC0xLjAxMy4wMzMiIHN0cm9rZT0idmFyKC0tY29sb3ItbWFya2V0aW5nLWljb24tc2Vjb25kYXJ5LCAjNzlCOEZGKSIgc3Ryb2tlLXdpZHRoPSIyIiBzdHJva2UtbGluZWNhcD0icm91bmQiLz48L3N2Zz4=
[prs-badge]: https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=for-the-badge&logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAgCAYAAABzenr0AAABC0lEQVRYhdWVPQoCMRCFX6HY2ghaiZUXsLW0EDyBrbWtN/EUHsHTWFnYyCL4gxibVZZlZzKTnWz0QZpk5r0vIdkF/kBPAMOKeddE+CQPKoc5Yt5cTjBMdQSwDQToWgBJAn3jmhqgltapAV6E6b5U17MGGAUaUj07TficMfIBZDV6vxowBm1BP9WbSQE4o5h9IjPJmy73TEPDDxVmoZdQrQ5jRhly9Q8tgMUXkIIWn0oG4GYQfAXQzz1PGoCiQndM7b4RgJay/h7zBLT3hASgoKjamQJMreKf0gfuAGyYtXEIAKcL/Dss15iq6ohXghozLYiAMxPuACwtIT4yeQUxAaLrZwAoqGRKGk7qDSYTfYQ8LuYnAAAAAElFTkSuQmCC
[twitter-badge]: https://img.shields.io/badge/Tweet-Now-blue.svg?style=for-the-badge&logo=twitter
[azure-badge]: https://img.shields.io/azure-devops/build/abhiuna12/942b3b13-d745-49e9-8d7d-b3918ff43ac2/2/testing?logo=azure-pipelines&style=for-the-badge
[pypi-badge]: https://img.shields.io/pypi/v/vidgear.svg?style=for-the-badge&logo=pypi
[gitter-bagde]: https://img.shields.io/badge/Chat-Gitter-blueviolet.svg?style=for-the-badge&logo=gitter
[coffee-badge]: https://abhitronix.github.io/img/vidgear/orange_img.png
[kofi-badge]: https://www.ko-fi.com/img/githubbutton_sm.svg
[black-badge]: https://img.shields.io/badge/code%20style-black-000000.svg?style=for-the-badge&logo=github

<!--
Internal URLs
-->

[release]: https://github.com/abhiTronix/vidgear/releases/latest
[pypi]: https://pypi.org/project/vidgear/
[gitter]: https://gitter.im/vidgear/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge
[twitter-intent]: https://twitter.com/intent/tweet?url=https%3A%2F%2Fabhitronix.github.io%2Fvidgear&via%20%40abhi_una12&text=Checkout%20VidGear%20-%20A%20High-Performance%20Video-Processing%20Python%20Framework.&hashtags=vidgear%20%23videoprocessing%20%23python%20%23threaded%20%23asyncio
[coffee]: https://www.buymeacoffee.com/2twOXFvlA
[kofi]: https://ko-fi.com/W7W8WTYO
[license]: https://github.com/abhiTronix/vidgear/blob/master/LICENSE
[github-flow]: https://github.com/abhiTronix/vidgear/actions?query=workflow%3A%22Run+Linux+CI-Tests+for+vidgear%22
[azure-pipeline]: https://dev.azure.com/abhiuna12/public/_build?definitionId=2
[app]: https://ci.appveyor.com/project/abhiTronix/vidgear
[code]: https://codecov.io/gh/abhiTronix/vidgear
[btm_netgear_async]: https://abhitronix.github.io/vidgear/latest/gears/netgear_async/advanced/bidirectional_mode/
[rtvc]: https://abhitronix.github.io/vidgear/latest/gears/netgear_async/advanced/bidirectional_mode/#using-bidirectional-mode-for-video-frames-transfer
[test-4k]: https://github.com/abhiTronix/vidgear/blob/e0843720202b0921d1c26e2ce5b11fadefbec892/vidgear/tests/benchmark_tests/test_benchmark_playback.py#L65
[bs_script_dataset]: https://github.com/abhiTronix/vidgear/blob/testing/scripts/bash/prepare_dataset.sh
[faq]: https://abhitronix.github.io/vidgear/latest/help/get_help/#frequently-asked-questions
[contribute]: https://abhitronix.github.io/vidgear/latest/contribution
[rtsp-ex]: https://abhitronix.github.io/vidgear/latest/help/writegear_ex/#using-writegears-compression-mode-for-rtsprtp-live-streaming
[doc-vidgear-purpose]: https://abhitronix.github.io/vidgear/latest/help/motivation/#why-is-vidgear-a-thing
[live-stream]: https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/usage/#using-compression-mode-for-live-streaming
[live-audio-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/usage/#using-compression-mode-with-live-audio-input
[piping-live-videos]: https://abhitronix.github.io/vidgear/latest/gears/camgear/usage/#using-camgear-with-streaming-websites
[ffmpeg-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/advanced/ffmpeg_install/
[youtube-doc]: https://abhitronix.github.io/vidgear/latest/gears/camgear/usage/#using-camgear-with-youtube-videos
[tqm-doc]: https://abhitronix.github.io/vidgear/latest/bonus/TQM/#threaded-queue-mode
[camgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/camgear/overview/
[stabilizer-doc]: https://abhitronix.github.io/vidgear/latest/gears/stabilizer/overview/
[stabilizer-doc-ex]: https://abhitronix.github.io/vidgear/latest/gears/videogear/usage/#using-videogear-with-video-stabilizer-backend
[videogear-doc]: https://abhitronix.github.io/vidgear/latest/gears/videogear/overview/
[pigear-doc]: https://abhitronix.github.io/vidgear/latest/gears/pigear/overview/
[cm-writegear-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/overview/
[ncm-writegear-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/non_compression/overview/
[screengear-doc]: https://abhitronix.github.io/vidgear/latest/gears/screengear/overview/
[streamgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/streamgear/introduction/
[writegear-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/introduction/
[netgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/overview/
[webgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/webgear/overview/
[webgear_rtc-doc]: https://abhitronix.github.io/vidgear/latest/gears/webgear_rtc/overview/
[netgear_async-doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear_async/overview/
[drop35]: https://github.com/abhiTronix/vidgear/issues/99
[custom-command-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/advanced/cciw/
[advanced-webgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/webgear/advanced/
[netgear_bidata_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/bidirectional_mode/
[netgear_compression_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/compression/
[netgear_security_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/secure_mode/
[netgear_multi_server_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/multi_server/
[netgear_multi_client_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/multi_client/
[netgear_sshtunnel_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/ssh_tunnel/
[netgear-exm]: https://abhitronix.github.io/vidgear/latest/gears/netgear/overview/#modes-of-operation
[stabilize_webgear_doc]: https://abhitronix.github.io/vidgear/latest/gears/webgear/advanced/#using-webgear-with-real-time-video-stabilization-enabled
[netgear_async-cs]: https://abhitronix.github.io/vidgear/latest/gears/netgear_async/usage/#using-netgear_async-with-a-custom-sourceopencv
[installation]: https://abhitronix.github.io/vidgear/latest/installation/
[gears]: https://abhitronix.github.io/vidgear/latest/gears
[switch_from_cv]: https://abhitronix.github.io/vidgear/latest/switch_from_cv/
[ss-mode-doc]: https://abhitronix.github.io/vidgear/latest/gears/streamgear/ssm/#overview
[rtf-mode-doc]: https://abhitronix.github.io/vidgear/latest/gears/streamgear/rtfm/#overview
[webgear-cs]: https://abhitronix.github.io/vidgear/latest/gears/webgear/advanced/#using-webgear-with-a-custom-sourceopencv
[webgear_rtc-cs]: https://abhitronix.github.io/vidgear/latest/gears/webgear_rtc/advanced/#using-webgear_rtc-with-a-custom-sourceopencv
[webgear_rtc-mc]: https://abhitronix.github.io/vidgear/latest/gears/webgear_rtc/advanced/#using-webgear_rtc-as-real-time-broadcaster
[docs]: https://abhitronix.github.io/vidgear

<!--
External URLs
-->

[asyncio-zmq]: https://pyzmq.readthedocs.io/en/latest/api/zmq.asyncio.html
[uvloop]: https://github.com/MagicStack/uvloop
[streamlink]: https://streamlink.github.io/
[aiortc]: https://aiortc.readthedocs.io/en/latest/
[pyscreenshot]: https://github.com/ponty/pyscreenshot
[uvloop-ns]: https://github.com/MagicStack/uvloop/issues/14
[ffmpeg]: https://www.ffmpeg.org/
[flake8]: https://flake8.pycqa.org/en/latest/
[dxcam]: https://github.com/ra1nty/DXcam
[black]: https://github.com/psf/black
[pytest]: https://docs.pytest.org/en/latest/
[opencv-writer]: https://docs.opencv.org/master/dd/d9e/classcv_1_1VideoWriter.html#ad59c61d8881ba2b2da22cff5487465b5
[opencv-windows]: https://www.learnopencv.com/install-opencv3-on-windows/
[opencv-linux]: https://www.pyimagesearch.com/2018/05/28/ubuntu-18-04-how-to-install-opencv/
[opencv-pi]: https://www.pyimagesearch.com/2018/09/26/install-opencv-4-on-your-raspberry-pi/
[starlette]: https://www.starlette.io/
[uvicorn]: http://www.uvicorn.org/
[daphne]: https://github.com/django/daphne/
[hypercorn]: https://pgjones.gitlab.io/hypercorn/
[prs]: http://makeapullrequest.com
[opencv]: https://github.com/opencv/opencv
[picamera]: https://github.com/waveform80/picamera
[pafy]: https://github.com/mps-youtube/pafy
[pyzmq]: https://github.com/zeromq/pyzmq
[zmq]: https://zeromq.org/
[mss]: https://github.com/BoboTiG/python-mss
[pip]: https://pip.pypa.io/en/stable/installing/
[opencv-vc]: https://docs.opencv.org/master/d8/dfe/classcv_1_1VideoCapture.html#a57c0e81e83e60f36c83027dc2a188e80
[ov5647-picam]: https://github.com/techyian/MMALSharp/doc/OmniVision-OV5647-Camera-Module
[imx219-picam]: https://github.com/techyian/MMALSharp/doc/Sony-IMX219-Camera-Module
[opencv-vw]: https://docs.opencv.org/3.4/d8/dfe/classcv_1_1VideoCapture.html
[yt_dlp]: https://github.com/yt-dlp/yt-dlp
[numpy]: https://github.com/numpy/numpy
[zmq-pair]: https://learning-0mq-with-pyzmq.readthedocs.io/en/latest/pyzmq/patterns/pair.html
[zmq-req-rep]: https://learning-0mq-with-pyzmq.readthedocs.io/en/latest/pyzmq/patterns/client_server.html
[zmq-pub-sub]: https://learning-0mq-with-pyzmq.readthedocs.io/en/latest/pyzmq/patterns/pubsub.html
[zmq-pull-push]: https://learning-0mq-with-pyzmq.readthedocs.io/en/latest/pyzmq/patterns/pushpull.html#push-pull
[picamera2]: https://github.com/raspberrypi/picamera2
[picamera-setting]: https://picamera.readthedocs.io/en/release-1.13/quickstart.html
[webrtc]: https://webrtc.org/



            

Raw data

            {
    "_id": null,
    "home_page": "https://abhitronix.github.io/vidgear",
    "name": "vidgear",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "OpenCV, multithreading, FFmpeg, picamera2, starlette, mss, pyzmq, dxcam, aiortc, uvicorn, uvloop, yt-dlp, asyncio, dash, hls, Video Processing, Video Stabilization, Computer Vision, Video Streaming, raspberrypi, YouTube, Twitch, WebRTC",
    "author": "Abhishek Thakur",
    "author_email": "abhi.una12@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/bc/ce/43c5eb89f9ec6d10ffe6c938c40bbec6ca18fd45954b83682269cabea689/vidgear-0.3.3.tar.gz",
    "platform": null,
    "description": "<!--\n===============================================\nvidgear library source-code is deployed under the Apache 2.0 License:\n\nCopyright (c) 2019 Abhishek Thakur(@abhiTronix) <abhi.una12@gmail.com>\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n===============================================\n-->\n\n<h1 align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/vidgear.png\" alt=\"VidGear\" title=\"Logo designed by Abhishek Thakur(@abhiTronix), under CC-BY-NC-SA 4.0 License\" width=\"80%\"/>\n</h1>\n<h2 align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/tagline.svg\" alt=\"VidGear tagline\" width=\"40%\"/>\n</h2>\n\n<div align=\"center\">\n\n[Releases][release]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[Gears][gears]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[Documentation][docs]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[Installation][installation]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[License](https://github.com/abhiTronix/vidgear#copyright)\n\n[![Build Status][github-cli]][github-flow] [![Codecov branch][codecov]][code] [![Azure DevOps builds (branch)][azure-badge]][azure-pipeline]\n\n[![Glitter chat][gitter-bagde]][gitter] [![Build Status][appveyor]][app] [![PyPi version][pypi-badge]][pypi]\n\n[![Code Style][black-badge]][black]\n\n</div>\n\n&nbsp;\n\nVidGear is a **High-Performance Video Processing Python Library** that provides an easy-to-use, highly extensible, thoroughly optimised **Multi-Threaded + 
Asyncio API Framework** on top of many state-of-the-art specialized libraries like _[OpenCV][opencv], [FFmpeg][ffmpeg], [ZeroMQ][zmq], [picamera2][picamera2], [starlette][starlette], [yt_dlp][yt_dlp], [pyscreenshot][pyscreenshot], [dxcam][dxcam], [aiortc][aiortc] and [python-mss][mss]_ serving at its backend, and enable us to flexibly exploit their internal parameters and methods, while silently delivering **robust error-handling and real-time performance \ud83d\udd25**\n\nVidGear primarily focuses on simplicity, and thereby lets programmers and software developers to easily integrate and perform Complex Video Processing Tasks, in just a few lines of code.\n\n&nbsp;\n\nThe following **functional block diagram** clearly depicts the generalized functioning of VidGear APIs:\n\n<p align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/gears_fbd.png\" alt=\"@Vidgear Functional Block Diagram\" />\n</p>\n\n&nbsp;\n\n# Table of Contents\n\n- [**TL;DR**](https://github.com/abhiTronix/vidgear#tldr)\n- [**Getting Started**](https://github.com/abhiTronix/vidgear#getting-started)\n- [**Gears: What are these?**](https://github.com/abhiTronix/vidgear#gears-what-are-these)\n  - [**CamGear**](https://github.com/abhiTronix/vidgear#camgear)\n  - [**PiGear**](https://github.com/abhiTronix/vidgear#pigear)\n  - [**VideoGear**](https://github.com/abhiTronix/vidgear#videogear)\n  - [**ScreenGear**](https://github.com/abhiTronix/vidgear#screengear)\n  - [**WriteGear**](https://github.com/abhiTronix/vidgear#writegear)\n  - [**StreamGear**](https://github.com/abhiTronix/vidgear#streamgear)\n  - [**NetGear**](https://github.com/abhiTronix/vidgear#netgear)\n  - [**WebGear**](https://github.com/abhiTronix/vidgear#webgear)\n  - [**WebGear_RTC**](https://github.com/abhiTronix/vidgear#webgear_rtc)\n  - [**NetGear_Async**](https://github.com/abhiTronix/vidgear#netgear_async)\n- [**Contributions**](https://github.com/abhiTronix/vidgear#contributions)\n- 
[**Donations**](https://github.com/abhiTronix/vidgear#donations)\n- [**Citation**](https://github.com/abhiTronix/vidgear#citation)\n- [**Copyright**](https://github.com/abhiTronix/vidgear#copyright)\n\n&nbsp;\n\n&nbsp;\n\n## TL;DR\n\n#### What is vidgear?\n\n> _\"VidGear is a cross-platform High-Performance Framework that provides an one-stop **Video-Processing** solution for building complex real-time media applications in python.\"_\n\n#### What does it do?\n\n> _\"VidGear can read, write, process, send & receive video files/frames/streams from/to various devices in real-time, and [**faster**][tqm-doc] than underline libraries.\"_\n\n#### What is its purpose?\n\n> _\"Write Less and Accomplish More\"_ \u2014 **VidGear's Motto**\n\n> _\"Built with simplicity in mind, VidGear lets programmers and software developers to easily integrate and perform **Complex Video-Processing Tasks** in their existing or newer applications without going through hefty documentation and in just a [**few lines of code**][switch_from_cv]. 
Beneficial for both, if you're new to programming with Python language or already a pro at it.\"_\n\n&nbsp;\n\n&nbsp;\n\n## Getting Started\n\nIf this is your first time using VidGear, head straight to the [Installation >>][installation] to install VidGear.\n\nOnce you have VidGear installed, **Checkout its Well-Documented [Function-Specific Gears >>][gears]**\n\nAlso, if you're already familiar with [OpenCV][opencv] library, then see [Switching from OpenCV Library >>][switch_from_cv]\n\nOr, if you're just getting started with OpenCV-Python programming, then refer this [FAQ >>](https://abhitronix.github.io/vidgear/latest/help/general_faqs/#im-new-to-python-programming-or-its-usage-in-opencv-library-how-to-use-vidgear-in-my-projects)\n\n&nbsp;\n\n&nbsp;\n\n## Gears: What are these?\n\n> **VidGear is built with multiple APIs a.k.a [Gears][gears], each with some unique functionality.**\n\nEach API is designed exclusively to handle/control/process different data-specific & device-specific video streams, network streams, and media encoders/decoders. These APIs provides the user an easy-to-use, dynamic, extensible, and exposed Multi-Threaded + Asyncio optimized internal layer above state-of-the-art libraries to work with, while silently delivering robust error-handling.\n\n**These Gears can be classified as follows:**\n\n**A. 
Video-Capture Gears:**\n\n- [**CamGear:**](https://github.com/abhiTronix/vidgear#camgear) Multi-Threaded API targeting various IP-USB-Cameras/Network-Streams/Streaming-Sites-URLs.\n- [**PiGear:**](https://github.com/abhiTronix/vidgear#pigear) Multi-Threaded API targeting various Camera Modules and _(limited)_ USB cameras on Raspberry Pis \ud83c\udf47.\n- [**ScreenGear:**](https://github.com/abhiTronix/vidgear#screengear) High-performance API targeting rapid Screencasting Capabilities.\n- [**VideoGear:**](https://github.com/abhiTronix/vidgear#videogear) Common Video-Capture API with internal [Video Stabilizer](https://abhitronix.github.io/vidgear/latest/gears/stabilizer/overview/) wrapper.\n\n**B. Video-Writer Gears:**\n\n- [**WriteGear:**](https://github.com/abhiTronix/vidgear#writegear) Handles Lossless Video-Writer for file/stream/frames Encoding and Compression.\n\n**C. Streaming Gears:**\n\n- [**StreamGear**](https://github.com/abhiTronix/vidgear#streamgear): Handles Transcoding of High-Quality, Dynamic & Adaptive Streaming Formats.\n\n- **Asynchronous I/O Streaming Gear:**\n\n  - [**WebGear:**](https://github.com/abhiTronix/vidgear#webgear) ASGI Video-Server that broadcasts Live MJPEG-Frames to any web-browser on the network.\n  - [**WebGear_RTC:**](https://github.com/abhiTronix/vidgear#webgear_rtc) Real-time Asyncio WebRTC media server for streaming directly to peer clients over the network.\n\n**D. 
Network Gears:**\n\n- [**NetGear:**](https://github.com/abhiTronix/vidgear#netgear) Handles High-Performance Video-Frames & Data Transfer between interconnecting systems over the network.\n\n- **Asynchronous I/O Network Gear:**\n\n  - [**NetGear_Async:**](https://github.com/abhiTronix/vidgear#netgear_async) Immensely Memory-Efficient Asyncio Video-Frames Network Messaging Framework.\n\n&nbsp;\n\n&nbsp;\n\n## CamGear\n\n<p align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/camgear.png\" alt=\"CamGear Functional Block Diagram\" width=\"45%\"/>\n</p>\n\n> _CamGear can grab ultra-fast frames from a diverse range of file-formats/devices/streams, which includes almost any IP-USB Cameras, multimedia video file-formats ([*upto 4k tested*][test-4k]), various network stream protocols such as `http(s), rtp, rtsp, rtmp, mms, etc.`, and GStreamer's pipelines, plus direct support for live video streaming sites like YouTube, Twitch, LiveStream, Dailymotion etc._\n\nCamGear provides a flexible, high-level, multi-threaded framework around OpenCV's [VideoCapture class][opencv-vc] with access almost all of its available parameters. CamGear internally implements [`yt_dlp`][yt_dlp] backend class for seamlessly pipelining live video-frames and metadata from various streaming services like [YouTube][youtube-doc], [Twitch][piping-live-videos], and [many more >>](https://github.com/yt-dlp/yt-dlp/blob/master/supportedsites.md#supported-sites). 
Furthermore, its framework relies exclusively on [**Threaded Queue mode**][tqm-doc] for ultra-fast, error-free, and synchronized video-frame handling.\n\n### CamGear API Guide:\n\n[**>>> Usage Guide**][camgear-doc]\n\n&nbsp;\n\n&nbsp;\n\n## VideoGear\n\n> _VideoGear API provides a special internal wrapper around VidGear's exclusive [**Video Stabilizer**][stabilizer-doc] class._\n\nVideoGear also acts as a Common Video-Capture API that provides internal access for both [CamGear](https://github.com/abhiTronix/vidgear#camgear) and [PiGear](https://github.com/abhiTronix/vidgear#pigear) APIs and their parameters with an exclusive `enablePiCamera` boolean flag.\n\nVideoGear is ideal when you need to switch to different video sources without changing your code much. Also, it enables easy stabilization for various video-streams _(real-time or not)_ with minimum effort and writing way fewer lines of code.\n\n**Below is a snapshot of a VideoGear Stabilizer in action (_See its detailed usage [here][stabilizer-doc-ex]_):**\n\n<p align=\"center\">\n  <img src=\"https://user-images.githubusercontent.com/34266896/211500670-b3aaf4db-a52a-4836-a03c-c2c17b971feb.gif\" alt=\"VideoGear Stabilizer in action!\"/>\n  <br>\n  <sub><i>Original Video Courtesy <a href=\"http://liushuaicheng.org/SIGGRAPH2013/database.html\" title=\"opensourced video samples database\">@SIGGRAPH2013</a></i></sub>\n</p>\n\n**Code to generate above result:**\n\n```python\n# import required libraries\nfrom vidgear.gears import VideoGear\nimport numpy as np\nimport cv2\n\n# open any valid video stream with stabilization enabled(`stabilize = True`)\nstream_stab = VideoGear(source=\"test.mp4\", stabilize=True).start()\n\n# open same stream without stabilization for comparison\nstream_org = VideoGear(source=\"test.mp4\").start()\n\n# loop over\nwhile True:\n\n    # read stabilized frames\n    frame_stab = stream_stab.read()\n\n    # check for stabilized frame if Nonetype\n    if frame_stab is None:\n        break\n\n 
   # read un-stabilized frame\n    frame_org = stream_org.read()\n\n    # concatenate both frames\n    output_frame = np.concatenate((frame_org, frame_stab), axis=1)\n\n    # put text over concatenated frame\n    cv2.putText(\n        output_frame,\n        \"Before\",\n        (10, output_frame.shape[0] - 10),\n        cv2.FONT_HERSHEY_SIMPLEX,\n        0.6,\n        (0, 255, 0),\n        2,\n    )\n    cv2.putText(\n        output_frame,\n        \"After\",\n        (output_frame.shape[1] // 2 + 10, output_frame.shape[0] - 10),\n        cv2.FONT_HERSHEY_SIMPLEX,\n        0.6,\n        (0, 255, 0),\n        2,\n    )\n\n    # Show output window\n    cv2.imshow(\"Stabilized Frame\", output_frame)\n\n    # check for 'q' key if pressed\n    key = cv2.waitKey(1) & 0xFF\n    if key == ord(\"q\"):\n        break\n\n# close output window\ncv2.destroyAllWindows()\n\n# safely close both video streams\nstream_org.stop()\nstream_stab.stop()\n```\n\n### VideoGear API Guide:\n\n[**>>> Usage Guide**][videogear-doc]\n\n&nbsp;\n\n&nbsp;\n\n## PiGear\n\n<p align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/picam2.webp\" alt=\"PiGear\" width=\"50%\" />\n</p>\n\n> _PiGear is a specialized API similar to the [CamGear API](https://github.com/abhiTronix/vidgear#camgear) but optimized for **Raspberry Pi :grapes: Boards**, offering comprehensive **support for camera modules** _(e.g., [OmniVision OV5647 Camera Module][ov5647-picam], [Sony IMX219 Camera Module][imx219-picam])_, along with **limited compatibility for USB cameras**._\n\nPiGear implements a seamless and robust wrapper around the [picamera2][picamera2] python library, simplifying integration with minimal code changes and ensuring a smooth transition for developers already familiar with the Picamera2 API. PiGear leverages the `libcamera` API under the hood with multi-threading, providing high-performance :fire:, enhanced control and functionality for Raspberry Pi camera modules. 
\n\nPiGear handles common configuration parameters and non-standard settings for various camera types, simplifying the integration process. PiGear currently supports PiCamera2 API parameters such as `sensor`, `controls`, `transform`, and `format` etc., with internal type and sanity checks for robust performance.\n\nWhile primarily focused on Raspberry Pi camera modules, PiGear also provides **basic functionality for USB webcams** only with Picamera2 API, along with the ability to accurately differentiate between USB and Raspberry Pi cameras using metadata. \n\nPiGear seamlessly switches to the legacy [picamera][picamera] library if the `picamera2` library is unavailable, ensuring seamless backward compatibility. For this, PiGear also provides a flexible multi-threaded framework around complete `picamera` API, allowing developers to effortlessly exploit a wide range of parameters, such as `brightness`, `saturation`, `sensor_mode`, `iso`, `exposure`, and more. \n\nFurthermore, PiGear supports the use of multiple camera modules, including those found on Raspberry Pi Compute Module IO boards and USB cameras _(only with Picamera2 API)_.\n\nBest of all, PiGear contains **Threaded Internal Timer** - that silently keeps active track of any frozen-threads/hardware-failures and exit safely, if any does occur. 
That means that if you're running PiGear API in your script and someone accidentally pulls the Camera-Module cable out, instead of going into possible kernel panic, API will exit safely to save resources.\n\n**Code to open picamera2 stream with variable parameters in PiGear API:**\n\n```python\n# import required libraries\nfrom vidgear.gears import PiGear\nfrom libcamera import Transform\nimport cv2\n\n# formulate various Picamera2 API \n# configurational parameters\noptions = {\n    \"controls\": {\"Brightness\": 0.5, \"ExposureValue\": 2.0},\n    \"transform\": Transform(hflip=1),\n    \"sensor\": {\"output_size\": (480, 320)},  # will override `resolution`\n    \"format\": \"RGB888\", # 8-bit BGR\n}\n\n# open pi video stream with defined parameters\nstream = PiGear(resolution=(640, 480), framerate=60, logging=True, **options).start()\n\n# loop over\nwhile True:\n\n    # read frames from stream\n    frame = stream.read()\n\n    # check for frame if Nonetype\n    if frame is None:\n        break\n\n    # {do something with the frame here}\n\n    # Show output window\n    cv2.imshow(\"Output Frame\", frame)\n\n    # check for 'q' key if pressed\n    key = cv2.waitKey(1) & 0xFF\n    if key == ord(\"q\"):\n        break\n\n# close output window\ncv2.destroyAllWindows()\n\n# safely close video stream\nstream.stop()\n```\n\n### PiGear API Guide:\n\n[**>>> Usage Guide**][pigear-doc]\n\n&nbsp;\n\n&nbsp;\n\n## ScreenGear\n\n> _ScreenGear is designed exclusively for targeting rapid Screencasting Capabilities, which means it can grab frames from your monitor in real-time, either by defining an area on the computer screen or full-screen, at the expense of inconsiderable latency. 
ScreenGear also seamlessly supports frame capturing from multiple monitors as well as multiple backends._\n\nScreenGear implements a Lightning-Fast API wrapper around the [**dxcam**][dxcam], [**pyscreenshot**][pyscreenshot] & [**python-mss**][mss] Python libraries, and also supports easy and flexible direct manipulation of its internal parameters.\n\n**Below is a snapshot of the ScreenGear API in action:**\n\n<p align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/gifs/screengear.gif\" alt=\"ScreenGear in action!\"/>\n</p>\n\n**Code to generate the above results:**\n\n```python\n# import required libraries\nfrom vidgear.gears import ScreenGear\nimport cv2\n\n# open video stream with default parameters\nstream = ScreenGear().start()\n\n# loop over\nwhile True:\n\n    # read frames from stream\n    frame = stream.read()\n\n    # check for frame if Nonetype\n    if frame is None:\n        break\n\n    # {do something with the frame here}\n\n    # Show output window\n    cv2.imshow(\"Output Frame\", frame)\n\n    # check for 'q' key if pressed\n    key = cv2.waitKey(1) & 0xFF\n    if key == ord(\"q\"):\n        break\n\n# close output window\ncv2.destroyAllWindows()\n\n# safely close video stream\nstream.stop()\n```\n\n### ScreenGear API Guide:\n\n[**>>> Usage Guide**][screengear-doc]\n\n&nbsp;\n\n&nbsp;\n\n## WriteGear\n\n<p align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/writegear.png\" alt=\"WriteGear Functional Block Diagram\" width=\"70%\" />\n</p>\n\n> _WriteGear handles various powerful Video-Writer Tools that provide us the freedom to do almost anything imaginable with multimedia data._\n\nWriteGear API provides a complete, flexible, and robust wrapper around [**FFmpeg**][ffmpeg], a leading multimedia framework. 
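\n\nFor a quick taste, below is a minimal sketch of writing frames in WriteGear's Compression Mode _(a hedged example: it assumes a working FFmpeg install, and the synthetic NumPy frames merely stand in for real video frames)_:\n\n```python\n# import required libraries\nfrom vidgear.gears import WriteGear\nimport numpy as np\n\n# define suitable FFmpeg parameters for the x264 encoder\noutput_params = {\"-vcodec\": \"libx264\", \"-crf\": 22, \"-input_framerate\": 30}\n\n# open writer in Compression Mode with defined parameters\nwriter = WriteGear(output=\"output.mp4\", logging=True, **output_params)\n\n# write a few synthetic grayscale-ramp frames\nfor i in range(60):\n    frame = np.full((480, 640, 3), (i * 4) % 255, dtype=np.uint8)\n    writer.write(frame)\n\n# safely close writer\nwriter.close()\n```\n\n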
WriteGear can process real-time frames into a lossless compressed video-file with any suitable specifications _(such as `bitrate, codec, framerate, resolution, subtitles, etc.`)_.\n\nWriteGear also supports streaming with traditional protocols such as [RTSP/RTP][rtsp-ex] and RTMP. It is powerful enough to perform complex tasks such as [Live-Streaming][live-stream] _(such as to Twitch, YouTube, etc.)_ and [Multiplexing Video-Audio][live-audio-doc] with real-time frames in just a few lines of code.\n\nBest of all, WriteGear grants users the complete freedom to play with any FFmpeg parameter with its exclusive **Custom Commands function** _(see this [doc][custom-command-doc])_ without relying on any third-party API.\n\nIn addition to this, WriteGear also provides flexible access to [**OpenCV's VideoWriter API**][opencv-writer] tools for encoding video-frames without compression.\n\n**WriteGear primarily operates in the following two modes:**\n\n- **Compression Mode:** In this mode, WriteGear utilizes powerful [**FFmpeg**][ffmpeg] inbuilt encoders to encode lossless multimedia files. This mode provides us the ability to exploit almost any parameter available within FFmpeg, effortlessly and flexibly, and while doing that it robustly handles all errors/warnings quietly. **You can find more about this mode [here >>][cm-writegear-doc]**\n\n- **Non-Compression Mode:** In this mode, WriteGear utilizes basic [**OpenCV's inbuilt VideoWriter API**][opencv-vw] tools. This mode also supports all parameter transformations available within OpenCV's VideoWriter API, but it lacks the ability to manipulate encoding parameters and other important features like video compression, audio encoding, etc. 
**You can learn about this mode [here >>][ncm-writegear-doc]**\n\n### WriteGear API Guide:\n\n[**>>> Usage Guide**][writegear-doc]\n\n&nbsp;\n\n&nbsp;\n\n## StreamGear\n\n<p align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/streamgear_flow.webp\" alt=\"StreamGear API\" width=80%/>\n</p>\n\n> _StreamGear streamlines and simplifies the transcoding workflow to generate Ultra-Low Latency, High-Quality, Dynamic & Adaptive Streaming Formats like MPEG-DASH and Apple HLS with just a few lines of Python code, allowing developers to focus on their application logic rather than dealing with the complexities of transcoding and chunking media files._\n\nStreamGear API provides a standalone, highly extensible, and flexible wrapper around the [**FFmpeg**](https://ffmpeg.org/) multimedia framework for generating chunk-encoded media segments from your multimedia content effortlessly.\n\nWith StreamGear, you can transcode source video/audio files and real-time video frames into a sequence of multiple smaller chunks/segments of suitable lengths. These segments facilitate streaming at different quality levels _(bitrates or spatial resolutions)_ and allow for seamless switching between quality levels during playback based on available bandwidth. You can serve these segments on a web server, making them easily accessible via standard **HTTP GET** requests.\n\nStreamGear currently supports both [**MPEG-DASH**](https://www.encoding.com/mpeg-dash/) _(Dynamic Adaptive Streaming over HTTP, ISO/IEC 23009-1)_ and [**Apple HLS**](https://developer.apple.com/documentation/http_live_streaming) _(HTTP Live Streaming)_.\n\nAdditionally, StreamGear generates a manifest file _(such as MPD for DASH)_ or a master playlist _(such as M3U8 for Apple HLS)_ alongside the segments. These files contain essential segment information, _including timing, URLs, and media characteristics like video resolution and adaptive bitrates_. 
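\n\nAs a quick taste, below is a minimal Single-Source Mode sketch _(a hedged example: it assumes FFmpeg is installed, and `foo.mp4` is just a placeholder source)_ that transcodes a video file into chunked segments with a DASH manifest:\n\n```python\n# import required libraries\nfrom vidgear.gears import StreamGear\n\n# activate Single-Source Mode by assigning a video source\nstream_params = {\"-video_source\": \"foo.mp4\"}\n\n# describe a suitable manifest file location/name\nstreamer = StreamGear(output=\"dash_out.mpd\", **stream_params)\n\n# transcode the entire source into chunked segments plus a manifest\nstreamer.transcode_source()\n\n# safely close streamer\nstreamer.terminate()\n```\n\n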
These files are provided to the client before the streaming session begins.\n\n**StreamGear primarily works in two independent modes for transcoding, which serve different purposes:**\n\n- **Single-Source Mode \ud83d\udcbf :** In this mode, StreamGear **transcodes an entire video file** _(as opposed to frame-by-frame)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well when you're transcoding long-duration lossless videos _(with audio)_ for streaming that require no interruptions. But on the downside, the provided source cannot be flexibly manipulated or transformed before it is sent to the FFmpeg pipeline for processing. **_Learn more about this mode [here >>][ss-mode-doc]_**\n\n- **Real-time Frames Mode \ud83c\udf9e\ufe0f :** In this mode, StreamGear directly **transcodes frame-by-frame** _(as opposed to an entire video file)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well when you want to flexibly manipulate or transform [`numpy.ndarray`](https://numpy.org/doc/1.18/reference/generated/numpy.ndarray.html#numpy-ndarray) frames in real-time before sending them to the FFmpeg pipeline for processing. But on the downside, audio has to be added manually _(as a separate source)_ for streams. 
**_Learn more about this mode [here >>][rtf-mode-doc]_**\n\n### StreamGear API Guide:\n\n[**>>> Usage Guide**][streamgear-doc]\n\n&nbsp;\n\n&nbsp;\n\n## NetGear\n\n<p align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/netgear.png\" alt=\"NetGear API\" width=65%/>\n</p>\n\n> _NetGear is exclusively designed to transfer video-frames & data synchronously between interconnecting systems over the network in real-time._\n\nNetGear implements a high-level wrapper around the [**PyZMQ**][pyzmq] Python library, which contains Python bindings for [**ZeroMQ**][zmq] - a high-performance asynchronous distributed messaging library.\n\nNetGear seamlessly supports additional [**bidirectional data transmission**][netgear_bidata_doc] between receiver (client) and sender (server) while transferring video-frames, all in real-time.\n\nNetGear can also robustly handle [**Multiple Server-Systems**][netgear_multi_server_doc] and [**Multiple Client-Systems**][netgear_multi_client_doc] at once, thereby providing seamless exchange of video-frames & data between multiple devices across the network at the same time.\n\nNetGear allows remote connection over an [**SSH Tunnel**][netgear_sshtunnel_doc], letting us connect the NetGear client and server via a secure SSH connection over an untrusted network and access intranet services across firewalls.\n\nNetGear also enables real-time [**JPEG Frame Compression**][netgear_compression_doc] capabilities for boosting performance significantly while sending video-frames over the network in real-time.\n\nFor security, NetGear implements easy access to ZeroMQ's powerful, smart & secure Security Layers that enable [**Strong encryption on data**][netgear_security_doc] and unbreakable authentication between the Server and the Client with the help of custom certificates.\n\n**NetGear as of now seamlessly supports three ZeroMQ messaging patterns:**\n\n- [**`zmq.PAIR`**][zmq-pair] _(ZMQ Pair Pattern)_\n- 
[**`zmq.REQ/zmq.REP`**][zmq-req-rep] _(ZMQ Request/Reply Pattern)_\n- [**`zmq.PUB/zmq.SUB`**][zmq-pub-sub] _(ZMQ Publish/Subscribe Pattern)_\n\nSupported protocols are `tcp` and `ipc`.\n\n### NetGear API Guide:\n\n[**>>> Usage Guide**][netgear-doc]\n\n&nbsp;\n\n&nbsp;\n\n## WebGear\n\n> _WebGear is a powerful [ASGI](https://asgi.readthedocs.io/en/latest/) Video-Broadcaster API ideal for transmitting [Motion-JPEG](https://en.wikipedia.org/wiki/Motion_JPEG)-frames from a single source to multiple recipients via the browser._\n\nWebGear API is built on [**Starlette**](https://www.starlette.io/)'s ASGI application and provides a highly extensible and flexible async wrapper around its complete framework. WebGear can flexibly interact with Starlette's ecosystem of shared middleware, mountable applications, [Response classes](https://www.starlette.io/responses/), [Routing tables](https://www.starlette.io/routing/), [Static Files](https://www.starlette.io/staticfiles/), [Templating engine (with Jinja2)](https://www.starlette.io/templates/), etc.\n\nWebGear API uses an intraframe-only compression scheme under the hood, in which the sequence of video-frames is first encoded as JPEG-DIB (JPEG with Device-Independent Bit compression) and then streamed over HTTP using Starlette's Multipart [Streaming Response](https://www.starlette.io/responses/#streamingresponse) and a [Uvicorn](https://www.uvicorn.org/#quickstart) ASGI Server. This method imposes lower processing and memory requirements, but the quality is not the best, since JPEG compression is not very efficient for motion video.\n\nIn layman's terms, WebGear acts as a powerful **Video Broadcaster** that transmits live video-frames to any web-browser in the network. 
Additionally, WebGear API also provides a special internal wrapper around [VideoGear](https://github.com/abhiTronix/vidgear#videogear), which itself provides internal access to both [CamGear](https://github.com/abhiTronix/vidgear#camgear) and [PiGear](https://github.com/abhiTronix/vidgear#pigear) APIs, thereby granting it the exclusive power of broadcasting frames from any incoming stream. It also allows us to define our custom Server as source to transform frames easily before sending them across the network _(see this [doc][webgear-cs] example)_.\n\n**Below is a snapshot of a WebGear Video Server in action on Chrome browser:**\n\n<p align=\"center\">\n  <img src=\"https://user-images.githubusercontent.com/34266896/211500287-0c12bfdf-2cbb-417a-9f3c-7a8b03ca5b6a.gif\" alt=\"WebGear in action!\" width=\"80%\" />\n  <br>\n  <sub><i>WebGear Video Server at <a href=\"http://localhost:8000/\" title=\"default address\">http://localhost:8000/</a> address.</i></sub>\n</p>\n\n**Code to generate the above result:**\n\n```python\n# import required libraries\nimport uvicorn\nfrom vidgear.gears.asyncio import WebGear\n\n# various performance tweaks\noptions = {\n    \"frame_size_reduction\": 40,\n    \"jpeg_compression_quality\": 80,\n    \"jpeg_compression_fastdct\": True,\n    \"jpeg_compression_fastupsample\": False,\n}\n\n# initialize WebGear app\nweb = WebGear(source=\"foo.mp4\", logging=True, **options)\n\n# run this app on Uvicorn server at address http://localhost:8000/\nuvicorn.run(web(), host=\"localhost\", port=8000)\n\n# close app safely\nweb.shutdown()\n```\n\n### WebGear API Guide:\n\n[**>>> Usage Guide**][webgear-doc]\n\n&nbsp;\n\n&nbsp;\n\n## WebGear_RTC\n\n> _WebGear_RTC is similar to the [WebGear API](https://github.com/abhiTronix/vidgear#webgear) in many aspects but utilizes [WebRTC][webrtc] technology under the hood instead of Motion JPEG, which makes it suitable for building powerful video-streaming solutions for all modern browsers as well as native clients available 
on all major platforms._\n\nWebGear_RTC is implemented with the help of the [**aiortc**][aiortc] library, which is built on top of an asynchronous I/O framework for Web Real-Time Communication (WebRTC) and Object Real-Time Communication (ORTC) and supports many features like SDP generation/parsing, Interactive Connectivity Establishment with half-trickle and mDNS support, DTLS key and certificate generation, DTLS handshake, etc.\n\nWebGear_RTC can handle [multiple consumers][webgear_rtc-mc] seamlessly and provides native support for the ICE _(Interactive Connectivity Establishment)_ protocol, STUN _(Session Traversal Utilities for NAT)_, and TURN _(Traversal Using Relays around NAT)_ servers that help us to seamlessly establish direct media connection with the remote peers for uninterrupted data flow. It also allows us to define our custom streaming class with a suitable source to transform frames easily before sending them across the network _(see this [doc][webgear_rtc-cs] example)_.\n\nWebGear_RTC API works in conjunction with [**Starlette**][starlette]'s ASGI application and provides easy access to its complete framework. 
WebGear_RTC can also flexibly interact with Starlette's ecosystem of shared middleware, mountable applications, [Response classes](https://www.starlette.io/responses/), [Routing tables](https://www.starlette.io/routing/), [Static Files](https://www.starlette.io/staticfiles/), [Templating engine (with Jinja2)](https://www.starlette.io/templates/), etc.\n\nAdditionally, WebGear_RTC API also provides a special internal wrapper around [VideoGear](https://github.com/abhiTronix/vidgear#videogear), which itself provides internal access to both [CamGear](https://github.com/abhiTronix/vidgear#camgear) and [PiGear](https://github.com/abhiTronix/vidgear#pigear) APIs.\n\n**Below is a snapshot of a WebGear_RTC Media Server in action on Chrome browser:**\n\n<p align=\"center\">\n  <img src=\"https://user-images.githubusercontent.com/34266896/211502451-6dc1fb24-2472-4e95-b38e-cab252071cc7.gif\" alt=\"WebGear_RTC in action!\" width=\"80%\" />\n  <br>\n  <sub><i>WebGear_RTC Video Server at <a href=\"http://localhost:8000/\" title=\"default address\">http://localhost:8000/</a> address.</i></sub>\n</p>\n\n**Code to generate the above result:**\n\n```python\n# import required libraries\nimport uvicorn\nfrom vidgear.gears.asyncio import WebGear_RTC\n\n# various performance tweaks\noptions = {\n    \"frame_size_reduction\": 30,\n}\n\n# initialize WebGear_RTC app\nweb = WebGear_RTC(source=\"foo.mp4\", logging=True, **options)\n\n# run this app on Uvicorn server at address http://localhost:8000/\nuvicorn.run(web(), host=\"localhost\", port=8000)\n\n# close app safely\nweb.shutdown()\n```\n\n### WebGear_RTC API Guide:\n\n[**>>> Usage Guide**][webgear_rtc-doc]\n\n&nbsp;\n\n&nbsp;\n\n## NetGear_Async\n\n<p align=\"center\">\n  <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/zmq_asyncio.png\" alt=\"NetGear_Async\" width=\"70%\"/>\n</p>\n\n> _NetGear_Async can generate the same performance as [NetGear API](https://github.com/abhiTronix/vidgear#netgear) at about 
one-third the memory consumption, and also provide complete server-client handling with various options to use variable protocols/patterns similar to NetGear, but lacks in terms of flexibility, as it supports only a few of [NetGear's Exclusive Modes][netgear-exm]._\n\nNetGear_Async is built on [`zmq.asyncio`][asyncio-zmq], and powered by a high-performance asyncio event loop called [**`uvloop`**][uvloop] to achieve unmatchable high-speed and lag-free video streaming over the network with minimal resource constraints. NetGear_Async can transfer thousands of frames in just a few seconds without causing any significant load on your system.\n\nNetGear_Async provides complete server-client handling and options to use variable protocols/patterns similar to [NetGear API](https://github.com/abhiTronix/vidgear#netgear). Furthermore, NetGear_Async allows us to define our custom Server as source to transform frames easily before sending them across the network _(see this [doc][netgear_async-cs] example)_.\n\nNetGear_Async now supports additional [**bidirectional data transmission**][btm_netgear_async] between receiver (client) and sender (server) while transferring video-frames. 
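\n\nAs a rough sketch, a bare-bones NetGear_Async client might look like the following _(a hedged example: it assumes a NetGear_Async server is already running on the same machine with default settings)_:\n\n```python\n# import required libraries\nfrom vidgear.gears.asyncio import NetGear_Async\nimport cv2\nimport asyncio\n\n# define and launch client with `receive_mode` enabled\nclient = NetGear_Async(receive_mode=True, logging=True).launch()\n\n\nasync def main():\n    # loop over client's asynchronous frame generator\n    async for frame in client.recv_generator():\n\n        # {do something with the received frame here}\n\n        # Show output window\n        cv2.imshow(\"Output Frame\", frame)\n        cv2.waitKey(1)\n\n        # sleep(0) to yield control between frames\n        await asyncio.sleep(0)\n\n\nif __name__ == \"__main__\":\n    # set event loop to client's\n    asyncio.set_event_loop(client.loop)\n    try:\n        # run the main task until complete\n        client.loop.run_until_complete(main())\n    except (KeyboardInterrupt, SystemExit):\n        pass\n\n    # close output window\n    cv2.destroyAllWindows()\n\n    # safely close client\n    client.close(skip_loop=True)\n```\n\n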
Users can easily build complex applications such as [Real-Time Video Chat][rtvc] in just a few lines of code.\n\nNetGear_Async as of now supports all four ZeroMQ messaging patterns:\n\n- [**`zmq.PAIR`**][zmq-pair] _(ZMQ Pair Pattern)_\n- [**`zmq.REQ/zmq.REP`**][zmq-req-rep] _(ZMQ Request/Reply Pattern)_\n- [**`zmq.PUB/zmq.SUB`**][zmq-pub-sub] _(ZMQ Publish/Subscribe Pattern)_\n- [**`zmq.PUSH/zmq.PULL`**][zmq-pull-push] _(ZMQ Push/Pull Pattern)_\n\nSupported protocols are `tcp` and `ipc`.\n\n### NetGear_Async API Guide:\n\n[**>>> Usage Guide**][netgear_async-doc]\n\n&nbsp;\n\n&nbsp;\n\n# Contributions\n\n<div align=\"center\">\n   <h3>\ud83d\udc51 Contributor Hall of Fame \ud83d\udc51</h3><br>\n   <a href=\"https://github.com/abhiTronix/vidgear/graphs/contributors\">\n    <img src=\"https://contributors-img.web.app/image?repo=abhiTronix/vidgear\"/><br><br>\n  </a>\n  <p><i>We're happy to meet new contributors\ud83d\udc97</i></p><br>\n</div>\n\nWe welcome your contributions to help us improve and extend this project. If you want to get involved with VidGear development, check out the **[Contribution Guidelines \u25b6\ufe0f][contribute]**\n\nWe're offering support for VidGear on [**Gitter Community Channel**](https://gitter.im/vidgear/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge). Come and join the conversation over there!\n\n&nbsp;\n\n&nbsp;\n\n# Donations\n\n<div align=\"center\">\n   <img src=\"https://abhitronix.github.io/vidgear/latest/assets/images/help_us.png\" alt=\"Help VidGear\" width=\"50%\" />\n   <p><i>VidGear is free and open source and will always remain so. \u2764\ufe0f</i></p>\n</div>\n\nIt is something I am doing with my own free time. But so much more needs to be done, and I need your help to do this. 
For just the price of a cup of coffee, you can make a difference \ud83d\ude42\n\n<a href='https://ko-fi.com/W7W8WTYO' target='_blank'><img height='36' style='border:0px;height:36px;' src='https://cdn.ko-fi.com/cdn/kofi1.png?v=3' border='0' alt='Buy Me a Coffee at ko-fi.com' /></a>\n\n&nbsp;\n\n&nbsp;\n\n# Citation\n\nHere is a Bibtex entry you can use to cite this project in a publication:\n\n[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.8332548.svg)](https://doi.org/10.5281/zenodo.8332548)\n\n```BibTeX\n@software{vidgear,\n  author       = {Abhishek Thakur and\n                  Zoe Papakipos and\n                  Christian Clauss and\n                  Christian Hollinger and\n                  Ian Max Andolina and\n                  Vincent Boivin and\n                  Kyle Ahn and\n                  freol35241 and\n                  Benjamin Lowe and\n                  Micka\u00ebl Schoentgen and\n                  Renaud Bouckenooghe and\n                  Ibtsam Ahmad},\n  title        = {abhiTronix/vidgear: VidGear Stable v0.3.2},\n  month        = sep,\n  year         = 2023,\n  publisher    = {Zenodo},\n  version      = {vidgear-0.3.2},\n  doi          = {10.5281/zenodo.8332548},\n  url          = {https://doi.org/10.5281/zenodo.8332548}\n}\n```\n\n&nbsp;\n\n&nbsp;\n\n# Copyright\n\n**Copyright (c) abhiTronix 2019**\n\nThis library is released under the **[Apache 2.0 License][license]**.\n\n<!--\nBadges\n-->\n\n[appveyor]: https://img.shields.io/appveyor/ci/abhitronix/vidgear.svg?style=for-the-badge&logo=appveyor\n[codecov]: https://img.shields.io/codecov/c/github/abhiTronix/vidgear/testing?logo=codecov&style=for-the-badge\n[github-cli]: 
https://img.shields.io/github/actions/workflow/status/abhiTronix/vidgear/.github/workflows/ci_linux.yml?style=for-the-badge&logo=data:image/svg%2bxml;base64,PHN2ZyB3aWR0aD0iNDgiIGhlaWdodD0iNDgiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PHBhdGggY2xpcC1ydWxlPSJldmVub2RkIiBkPSJNMTAgMWE5IDkgMCAwMTkgOSA5IDkgMCAwMS05IDkgOSA5IDAgMDEtOS05IDkgOSAwIDAxOS05ek0yMyAxOWE2IDYgMCAxMTAgMTIgNiA2IDAgMDEwLTEyek0yMyAzNWE2IDYgMCAxMTAgMTIgNiA2IDAgMDEwLTEyeiIgc3Ryb2tlPSJ2YXIoLS1jb2xvci1tYXJrZXRpbmctaWNvbi1wcmltYXJ5LCAjMjA4OEZGKSIgc3Ryb2tlLXdpZHRoPSIyIiBzdHJva2UtbGluZWNhcD0icm91bmQiIHN0cm9rZS1saW5lam9pbj0icm91bmQiLz48cGF0aCBjbGlwLXJ1bGU9ImV2ZW5vZGQiIGQ9Ik00MSAzNWE2IDYgMCAxMTAgMTIgNiA2IDAgMDEwLTEyeiIgc3Ryb2tlPSJ2YXIoLS1jb2xvci1tYXJrZXRpbmctaWNvbi1zZWNvbmRhcnksICM3OUI4RkYpIiBzdHJva2Utd2lkdGg9IjIiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIvPjxwYXRoIGQ9Ik0yNS4wMzcgMjMuNjA3bC0zLjA3IDMuMDY1LTEuNDktMS40ODUiIHN0cm9rZT0idmFyKC0tY29sb3ItbWFya2V0aW5nLWljb24tcHJpbWFyeSwgIzIwODhGRikiIHN0cm9rZS13aWR0aD0iMiIgc3Ryb2tlLWxpbmVjYXA9InJvdW5kIiBzdHJva2UtbGluZWpvaW49InJvdW5kIi8+PHBhdGggY2xpcC1ydWxlPSJldmVub2RkIiBkPSJNNDEgMTlhNiA2IDAgMTEwIDEyIDYgNiAwIDAxMC0xMnoiIHN0cm9rZT0idmFyKC0tY29sb3ItbWFya2V0aW5nLWljb24tcHJpbWFyeSwgIzIwODhGRikiIHN0cm9rZS13aWR0aD0iMiIgc3Ryb2tlLWxpbmVjYXA9InJvdW5kIiBzdHJva2UtbGluZWpvaW49InJvdW5kIi8+PHBhdGggZD0iTTQzLjAzNiAyMy42MDdsLTMuMDY5IDMuMDY1LTEuNDktMS40ODVNNyA2LjgxMmExIDEgMCAwMTEuNTMzLS44NDZsNS4xMTMgMy4yMmExIDEgMCAwMS0uMDA2IDEuNjk3bC01LjExMyAzLjE3QTEgMSAwIDAxNyAxMy4yMDNWNi44MTN6TTkgMTl2MTVjMCAzLjg2NiAzLjE3NyA3IDcgN2gxIiBzdHJva2U9InZhcigtLWNvbG9yLW1hcmtldGluZy1pY29uLXByaW1hcnksICMyMDg4RkYpIiBzdHJva2Utd2lkdGg9IjIiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIvPjxwYXRoIGQ9Ik0xNi45NDkgMjZhMSAxIDAgMTAwLTJ2MnpNOCAxOS4wMzVBNi45NjUgNi45NjUgMCAwMDE0Ljk2NSAyNnYtMkE0Ljk2NSA0Ljk2NSAwIDAxMTAgMTkuMDM1SDh6TTE0Ljk2NSAyNmgxLjk4NHYtMmgtMS45ODR2MnoiIGZpbGw9InZhcigtLWNvbG9yLW1hcmtldGluZy1pY29uLXByaW1hcnksICMyMDg4RkYpIi8+PHBhdGggZD0iTTI5LjA1NSAyNWg1Ljk
0NCIgc3Ryb2tlPSJ2YXIoLS1jb2xvci1tYXJrZXRpbmctaWNvbi1wcmltYXJ5LCAjMjA4OEZGKSIgc3Ryb2tlLXdpZHRoPSIyIiBzdHJva2UtbGluZWNhcD0icm91bmQiIHN0cm9rZS1saW5lam9pbj0icm91bmQiLz48cGF0aCBmaWxsLXJ1bGU9ImV2ZW5vZGQiIGNsaXAtcnVsZT0iZXZlbm9kZCIgZD0iTTIxIDQwYTEgMSAwIDExLS4wMDEgMi4wMDFBMSAxIDAgMDEyMSA0MHpNMjUgNDBhMSAxIDAgMTEtLjAwMSAyLjAwMUExIDEgMCAwMTI1IDQweiIgZmlsbD0idmFyKC0tY29sb3ItbWFya2V0aW5nLWljb24tc2Vjb25kYXJ5LCAjNzlCOEZGKSIvPjxwYXRoIGQ9Ik0zNC4wMDUgNDEuMDA3bC0xLjAxMy4wMzMiIHN0cm9rZT0idmFyKC0tY29sb3ItbWFya2V0aW5nLWljb24tc2Vjb25kYXJ5LCAjNzlCOEZGKSIgc3Ryb2tlLXdpZHRoPSIyIiBzdHJva2UtbGluZWNhcD0icm91bmQiLz48L3N2Zz4=\n[prs-badge]: https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=for-the-badge&logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAgCAYAAABzenr0AAABC0lEQVRYhdWVPQoCMRCFX6HY2ghaiZUXsLW0EDyBrbWtN/EUHsHTWFnYyCL4gxibVZZlZzKTnWz0QZpk5r0vIdkF/kBPAMOKeddE+CQPKoc5Yt5cTjBMdQSwDQToWgBJAn3jmhqgltapAV6E6b5U17MGGAUaUj07TficMfIBZDV6vxowBm1BP9WbSQE4o5h9IjPJmy73TEPDDxVmoZdQrQ5jRhly9Q8tgMUXkIIWn0oG4GYQfAXQzz1PGoCiQndM7b4RgJay/h7zBLT3hASgoKjamQJMreKf0gfuAGyYtXEIAKcL/Dss15iq6ohXghozLYiAMxPuACwtIT4yeQUxAaLrZwAoqGRKGk7qDSYTfYQ8LuYnAAAAAElFTkSuQmCC\n[twitter-badge]: https://img.shields.io/badge/Tweet-Now-blue.svg?style=for-the-badge&logo=twitter\n[azure-badge]: https://img.shields.io/azure-devops/build/abhiuna12/942b3b13-d745-49e9-8d7d-b3918ff43ac2/2/testing?logo=azure-pipelines&style=for-the-badge\n[pypi-badge]: https://img.shields.io/pypi/v/vidgear.svg?style=for-the-badge&logo=pypi\n[gitter-bagde]: https://img.shields.io/badge/Chat-Gitter-blueviolet.svg?style=for-the-badge&logo=gitter\n[coffee-badge]: https://abhitronix.github.io/img/vidgear/orange_img.png\n[kofi-badge]: https://www.ko-fi.com/img/githubbutton_sm.svg\n[black-badge]: https://img.shields.io/badge/code%20style-black-000000.svg?style=for-the-badge&logo=github\n\n<!--\nInternal URLs\n-->\n\n[release]: https://github.com/abhiTronix/vidgear/releases/latest\n[pypi]: https://pypi.org/project/vidgear/\n[gitter]: 
https://gitter.im/vidgear/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge\n[twitter-intent]: https://twitter.com/intent/tweet?url=https%3A%2F%2Fabhitronix.github.io%2Fvidgear&via%20%40abhi_una12&text=Checkout%20VidGear%20-%20A%20High-Performance%20Video-Processing%20Python%20Framework.&hashtags=vidgear%20%23videoprocessing%20%23python%20%23threaded%20%23asyncio\n[coffee]: https://www.buymeacoffee.com/2twOXFvlA\n[kofi]: https://ko-fi.com/W7W8WTYO\n[license]: https://github.com/abhiTronix/vidgear/blob/master/LICENSE\n[github-flow]: https://github.com/abhiTronix/vidgear/actions?query=workflow%3A%22Run+Linux+CI-Tests+for+vidgear%22\n[azure-pipeline]: https://dev.azure.com/abhiuna12/public/_build?definitionId=2\n[app]: https://ci.appveyor.com/project/abhiTronix/vidgear\n[code]: https://codecov.io/gh/abhiTronix/vidgear\n[btm_netgear_async]: https://abhitronix.github.io/vidgear/latest/gears/netgear_async/advanced/bidirectional_mode/\n[rtvc]: https://abhitronix.github.io/vidgear/latest/gears/netgear_async/advanced/bidirectional_mode/#using-bidirectional-mode-for-video-frames-transfer\n[test-4k]: https://github.com/abhiTronix/vidgear/blob/e0843720202b0921d1c26e2ce5b11fadefbec892/vidgear/tests/benchmark_tests/test_benchmark_playback.py#L65\n[bs_script_dataset]: https://github.com/abhiTronix/vidgear/blob/testing/scripts/bash/prepare_dataset.sh\n[faq]: https://abhitronix.github.io/vidgear/latest/help/get_help/#frequently-asked-questions\n[contribute]: https://abhitronix.github.io/vidgear/latest/contribution\n[rtsp-ex]: https://abhitronix.github.io/vidgear/latest/help/writegear_ex/#using-writegears-compression-mode-for-rtsprtp-live-streaming\n[doc-vidgear-purpose]: https://abhitronix.github.io/vidgear/latest/help/motivation/#why-is-vidgear-a-thing\n[live-stream]: https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/usage/#using-compression-mode-for-live-streaming\n[live-audio-doc]: 
https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/usage/#using-compression-mode-with-live-audio-input\n[piping-live-videos]: https://abhitronix.github.io/vidgear/latest/gears/camgear/usage/#using-camgear-with-streaming-websites\n[ffmpeg-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/advanced/ffmpeg_install/\n[youtube-doc]: https://abhitronix.github.io/vidgear/latest/gears/camgear/usage/#using-camgear-with-youtube-videos\n[tqm-doc]: https://abhitronix.github.io/vidgear/latest/bonus/TQM/#threaded-queue-mode\n[camgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/camgear/overview/\n[stabilizer-doc]: https://abhitronix.github.io/vidgear/latest/gears/stabilizer/overview/\n[stabilizer-doc-ex]: https://abhitronix.github.io/vidgear/latest/gears/videogear/usage/#using-videogear-with-video-stabilizer-backend\n[videogear-doc]: https://abhitronix.github.io/vidgear/latest/gears/videogear/overview/\n[pigear-doc]: https://abhitronix.github.io/vidgear/latest/gears/pigear/overview/\n[cm-writegear-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/overview/\n[ncm-writegear-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/non_compression/overview/\n[screengear-doc]: https://abhitronix.github.io/vidgear/latest/gears/screengear/overview/\n[streamgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/streamgear/introduction/\n[writegear-doc]: https://abhitronix.github.io/vidgear/latest/gears/writegear/introduction/\n[netgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/overview/\n[webgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/webgear/overview/\n[webgear_rtc-doc]: https://abhitronix.github.io/vidgear/latest/gears/webgear_rtc/overview/\n[netgear_async-doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear_async/overview/\n[drop35]: https://github.com/abhiTronix/vidgear/issues/99\n[custom-command-doc]: 
https://abhitronix.github.io/vidgear/latest/gears/writegear/compression/advanced/cciw/\n[advanced-webgear-doc]: https://abhitronix.github.io/vidgear/latest/gears/webgear/advanced/\n[netgear_bidata_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/bidirectional_mode/\n[netgear_compression_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/compression/\n[netgear_security_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/secure_mode/\n[netgear_multi_server_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/multi_server/\n[netgear_multi_client_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/multi_client/\n[netgear_sshtunnel_doc]: https://abhitronix.github.io/vidgear/latest/gears/netgear/advanced/ssh_tunnel/\n[netgear-exm]: https://abhitronix.github.io/vidgear/latest/gears/netgear/overview/#modes-of-operation\n[stabilize_webgear_doc]: https://abhitronix.github.io/vidgear/latest/gears/webgear/advanced/#using-webgear-with-real-time-video-stabilization-enabled\n[netgear_async-cs]: https://abhitronix.github.io/vidgear/latest/gears/netgear_async/usage/#using-netgear_async-with-a-custom-sourceopencv\n[installation]: https://abhitronix.github.io/vidgear/latest/installation/\n[gears]: https://abhitronix.github.io/vidgear/latest/gears\n[switch_from_cv]: https://abhitronix.github.io/vidgear/latest/switch_from_cv/\n[ss-mode-doc]: https://abhitronix.github.io/vidgear/latest/gears/streamgear/ssm/#overview\n[rtf-mode-doc]: https://abhitronix.github.io/vidgear/latest/gears/streamgear/rtfm/#overview\n[webgear-cs]: https://abhitronix.github.io/vidgear/latest/gears/webgear/advanced/#using-webgear-with-a-custom-sourceopencv\n[webgear_rtc-cs]: https://abhitronix.github.io/vidgear/latest/gears/webgear_rtc/advanced/#using-webgear_rtc-with-a-custom-sourceopencv\n[webgear_rtc-mc]: 
https://abhitronix.github.io/vidgear/latest/gears/webgear_rtc/advanced/#using-webgear_rtc-as-real-time-broadcaster\n[docs]: https://abhitronix.github.io/vidgear\n\n<!--\nExternal URLs\n-->\n\n[asyncio-zmq]: https://pyzmq.readthedocs.io/en/latest/api/zmq.asyncio.html\n[uvloop]: https://github.com/MagicStack/uvloop\n[streamlink]: https://streamlink.github.io/\n[aiortc]: https://aiortc.readthedocs.io/en/latest/\n[pyscreenshot]: https://github.com/ponty/pyscreenshot\n[uvloop-ns]: https://github.com/MagicStack/uvloop/issues/14\n[ffmpeg]: https://www.ffmpeg.org/\n[flake8]: https://flake8.pycqa.org/en/latest/\n[dxcam]: https://github.com/ra1nty/DXcam\n[black]: https://github.com/psf/black\n[pytest]: https://docs.pytest.org/en/latest/\n[opencv-writer]: https://docs.opencv.org/master/dd/d9e/classcv_1_1VideoWriter.html#ad59c61d8881ba2b2da22cff5487465b5\n[opencv-windows]: https://www.learnopencv.com/install-opencv3-on-windows/\n[opencv-linux]: https://www.pyimagesearch.com/2018/05/28/ubuntu-18-04-how-to-install-opencv/\n[opencv-pi]: https://www.pyimagesearch.com/2018/09/26/install-opencv-4-on-your-raspberry-pi/\n[starlette]: https://www.starlette.io/\n[uvicorn]: http://www.uvicorn.org/\n[daphne]: https://github.com/django/daphne/\n[hypercorn]: https://pgjones.gitlab.io/hypercorn/\n[prs]: http://makeapullrequest.com\n[opencv]: https://github.com/opencv/opencv\n[picamera]: https://github.com/waveform80/picamera\n[pafy]: https://github.com/mps-youtube/pafy\n[pyzmq]: https://github.com/zeromq/pyzmq\n[zmq]: https://zeromq.org/\n[mss]: https://github.com/BoboTiG/python-mss\n[pip]: https://pip.pypa.io/en/stable/installing/\n[opencv-vc]: https://docs.opencv.org/master/d8/dfe/classcv_1_1VideoCapture.html#a57c0e81e83e60f36c83027dc2a188e80\n[ov5647-picam]: https://github.com/techyian/MMALSharp/doc/OmniVision-OV5647-Camera-Module\n[imx219-picam]: https://github.com/techyian/MMALSharp/doc/Sony-IMX219-Camera-Module\n[opencv-vw]: 
https://docs.opencv.org/3.4/d8/dfe/classcv_1_1VideoCapture.html\n[yt_dlp]: https://github.com/yt-dlp/yt-dlp\n[numpy]: https://github.com/numpy/numpy\n[zmq-pair]: https://learning-0mq-with-pyzmq.readthedocs.io/en/latest/pyzmq/patterns/pair.html\n[zmq-req-rep]: https://learning-0mq-with-pyzmq.readthedocs.io/en/latest/pyzmq/patterns/client_server.html\n[zmq-pub-sub]: https://learning-0mq-with-pyzmq.readthedocs.io/en/latest/pyzmq/patterns/pubsub.html\n[zmq-pull-push]: https://learning-0mq-with-pyzmq.readthedocs.io/en/latest/pyzmq/patterns/pushpull.html#push-pull\n[picamera2]:https://github.com/raspberrypi/picamera2\n[picamera-setting]: https://picamera.readthedocs.io/en/release-1.13/quickstart.html\n[webrtc]: https://webrtc.org/\n\n\n",
    "bugtrack_url": null,
    "license": "Apache License 2.0",
    "summary": "High-performance cross-platform Video Processing Python framework powerpacked with unique trailblazing features.",
    "version": "0.3.3",
    "project_urls": {
        "Bug Reports": "https://github.com/abhiTronix/vidgear/issues",
        "Changelog": "https://abhitronix.github.io/vidgear/latest/changelog/",
        "Documentation": "https://abhitronix.github.io/vidgear",
        "Funding": "https://ko-fi.com/W7W8WTYO",
        "Homepage": "https://abhitronix.github.io/vidgear",
        "Source": "https://github.com/abhiTronix/vidgear"
    },
    "split_keywords": [
        "opencv",
        " multithreading",
        " ffmpeg",
        " picamera2",
        " starlette",
        " mss",
        " pyzmq",
        " dxcam",
        " aiortc",
        " uvicorn",
        " uvloop",
        " yt-dlp",
        " asyncio",
        " dash",
        " hls",
        " video processing",
        " video stabilization",
        " computer vision",
        " video streaming",
        " raspberrypi",
        " youtube",
        " twitch",
        " webrtc"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "35762ad80ef6427a6e79473a2b72ef4db52fbc70a7ef312ead7a0088e731fb0e",
                "md5": "c94f8475f7fd976c7397f146ca7992b6",
                "sha256": "b5939b23b67bc1af6b21a78f58b771917b75f51b18207b75911a998a614208b1"
            },
            "downloads": -1,
            "filename": "vidgear-0.3.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "c94f8475f7fd976c7397f146ca7992b6",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 122033,
            "upload_time": "2024-06-22T19:12:02",
            "upload_time_iso_8601": "2024-06-22T19:12:02.911430Z",
            "url": "https://files.pythonhosted.org/packages/35/76/2ad80ef6427a6e79473a2b72ef4db52fbc70a7ef312ead7a0088e731fb0e/vidgear-0.3.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "bcce43c5eb89f9ec6d10ffe6c938c40bbec6ca18fd45954b83682269cabea689",
                "md5": "a01cb8a25c646e988d7dac1bbc0ae680",
                "sha256": "4f25c74d6e65a54e26d12fb518423432ed16950fc081665896fc6640ec12f24e"
            },
            "downloads": -1,
            "filename": "vidgear-0.3.3.tar.gz",
            "has_sig": false,
            "md5_digest": "a01cb8a25c646e988d7dac1bbc0ae680",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 138094,
            "upload_time": "2024-06-22T19:12:05",
            "upload_time_iso_8601": "2024-06-22T19:12:05.764995Z",
            "url": "https://files.pythonhosted.org/packages/bc/ce/43c5eb89f9ec6d10ffe6c938c40bbec6ca18fd45954b83682269cabea689/vidgear-0.3.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-06-22 19:12:05",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "abhiTronix",
    "github_project": "vidgear",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "appveyor": true,
    "lcname": "vidgear"
}
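The `digests` entries above let a downloaded release file be verified before installation. A minimal Python sketch, assuming the wheel has already been downloaded to a local path (the helper name and the path are illustrative; the expected digest is copied verbatim from the wheel entry in the metadata above):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in chunks and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Expected sha256 for vidgear-0.3.3-py3-none-any.whl, taken from the
# "digests" block of the release metadata above.
WHEEL_SHA256 = "b5939b23b67bc1af6b21a78f58b771917b75f51b18207b75911a998a614208b1"

# Hypothetical local path; adjust to wherever the wheel was downloaded.
# assert sha256_of("vidgear-0.3.3-py3-none-any.whl") == WHEEL_SHA256
```

Note that `pip` performs this same digest check automatically when installing from PyPI; manual verification is mainly useful for files fetched out-of-band.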
        