From 6e383e99e1fe784c832a36c8fe6dea1a24dd0725 Mon Sep 17 00:00:00 2001 From: abhiTronix Date: Sun, 22 Aug 2021 23:05:14 +0530 Subject: [PATCH 01/11] =?UTF-8?q?=E2=8F=AA=EF=B8=8F=20Docs:=20Reverted=20U?= =?UTF-8?q?I=20change=20in=20CSS.?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- docs/overrides/assets/stylesheets/custom.css | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/overrides/assets/stylesheets/custom.css b/docs/overrides/assets/stylesheets/custom.css index 62d2fbac2..af2beccd0 100755 --- a/docs/overrides/assets/stylesheets/custom.css +++ b/docs/overrides/assets/stylesheets/custom.css @@ -189,6 +189,7 @@ th { overflow: hidden; } .md-version__current { + text-transform: uppercase; font-weight: bolder; } .md-typeset .task-list-control .task-list-indicator::before { From d260320755e860294abd106f23f23e90d8918695 Mon Sep 17 00:00:00 2001 From: abhiTronix Date: Mon, 23 Aug 2021 10:03:40 +0530 Subject: [PATCH 02/11] =?UTF-8?q?=F0=9F=92=84=20Docs:=20New=20assets=20and?= =?UTF-8?q?=20typos=20fixed.?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- README.md | 13 +++++----- docs/gears/camgear/advanced/source_params.md | 13 +++++++--- docs/gears/camgear/usage.md | 2 +- docs/gears/netgear_async/overview.md | 2 +- docs/gears/netgear_async/usage.md | 2 +- docs/gears/stabilizer/usage.md | 2 +- docs/gears/streamgear/introduction.md | 24 ++++++++---------- docs/gears/streamgear/rtfm/overview.md | 8 +++--- docs/gears/streamgear/ssm/overview.md | 10 +++++--- docs/gears/webgear/advanced.md | 2 +- docs/gears/webgear_rtc/advanced.md | 2 +- docs/gears/webgear_rtc/overview.md | 2 +- docs/gears/writegear/introduction.md | 2 +- docs/overrides/assets/images/stream_tweak.png | Bin 0 -> 44390 bytes docs/switch_from_cv.md | 2 +- vidgear/gears/asyncio/netgear_async.py | 2 +- vidgear/gears/asyncio/webgear_rtc.py | 2 +- 17 files changed, 47 insertions(+), 43 deletions(-) create mode 100644 docs/overrides/assets/images/stream_tweak.png diff --git a/README.md b/README.md index f4eadc415..b45a5dec1 100644 --- a/README.md +++ b/README.md @@ -418,7 +418,7 @@ In addition to this, WriteGear also provides flexible access to [**OpenCV's Vide * **Compression Mode:** In this mode, WriteGear utilizes powerful [**FFmpeg**][ffmpeg] inbuilt encoders to encode lossless multimedia files. This mode provides us the ability to exploit almost any parameter available within FFmpeg, effortlessly and flexibly, and while doing that it robustly handles all errors/warnings quietly. **You can find more about this mode [here ➶][cm-writegear-doc]** - * **Non-Compression Mode:** In this mode, WriteGear utilizes basic [**OpenCV's inbuilt VideoWriter API**][opencv-vw] tools. This mode also supports all parameters manipulation available within VideoWriter API, but it lacks the ability to manipulate encoding parameters and other important features like video compression, audio encoding, etc. **You can learn about this mode [here ➶][ncm-writegear-doc]** + * **Non-Compression Mode:** In this mode, WriteGear utilizes basic [**OpenCV's inbuilt VideoWriter API**][opencv-vw] tools. This mode also supports all parameter transformations available within OpenCV's VideoWriter API, but it lacks the ability to manipulate encoding parameters and other important features like video compression, audio encoding, etc. 
**You can learn about this mode [here ➶][ncm-writegear-doc]**

 ### WriteGear API Guide:

@@ -449,10 +449,9 @@ SteamGear also creates a Manifest file _(such as MPD in-case of DASH)_ or a Mast

 **StreamGear primarily works in two Independent Modes for transcoding which serves different purposes:**

-  * **Single-Source Mode:** In this mode, StreamGear transcodes entire video/audio file _(as opposed to frames by frame)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well, when you're transcoding lossless long-duration videos(with audio) for streaming and required no extra efforts or interruptions. But on the downside, the provided source cannot be changed or manipulated before sending onto FFmpeg Pipeline for processing. ***Learn more about this mode [here ➶][ss-mode-doc]***
-
-  * **Real-time Frames Mode:** In this mode, StreamGear directly transcodes video-frames _(as opposed to a entire file)_ into a sequence of multiple smaller chunks/segments for streaming. In this mode, StreamGear supports real-time [`numpy.ndarray`](https://numpy.org/doc/1.18/reference/generated/numpy.ndarray.html#numpy-ndarray) frames, and process them over FFmpeg pipeline. But on the downside, audio has to added manually _(as separate source)_ for streams. ***Learn more about this mode [here ➶][rtf-mode-doc]***
+  * **Single-Source Mode:** In this mode, StreamGear **transcodes the entire video file** _(as opposed to frame-by-frame)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well when you're transcoding long-duration lossless videos (with audio) for streaming that requires no interruptions. But on the downside, the provided source cannot be flexibly manipulated or transformed before sending onto FFmpeg Pipeline for processing. ***Learn more about this mode [here ➶][ss-mode-doc]***
+  * **Real-time Frames Mode:** In this mode, StreamGear directly **transcodes frame-by-frame** _(as opposed to an entire video file)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well when you desire to flexibly manipulate or transform [`numpy.ndarray`](https://numpy.org/doc/1.18/reference/generated/numpy.ndarray.html#numpy-ndarray) frames in real-time before sending them onto FFmpeg Pipeline for processing. But on the downside, audio has to be added manually _(as separate source)_ for streams. ***Learn more about this mode [here ➶][rtf-mode-doc]***

 ### StreamGear API Guide:

@@ -507,7 +506,7 @@ WebGear API works on [**Starlette**](https://www.starlette.io/)'s ASGI applicati

 WebGear API uses an intraframe-only compression scheme under the hood where the sequence of video-frames are first encoded as JPEG-DIB (JPEG with Device-Independent Bit compression) and then streamed over HTTP using Starlette's Multipart [Streaming Response](https://www.starlette.io/responses/#streamingresponse) and a [Uvicorn](https://www.uvicorn.org/#quickstart) ASGI Server. This method imposes lower processing and memory requirements, but the quality is not the best, since JPEG compression is not very efficient for motion video.

-In layman's terms, WebGear acts as a powerful **Video Broadcaster** that transmits live video-frames to any web-browser in the network. 
Additionally, WebGear API also provides a special internal wrapper around [VideoGear](#videogear), which itself provides internal access to both [CamGear](#camgear) and [PiGear](#pigear) APIs, thereby granting it exclusive power of broadcasting frames from any incoming stream. It also allows us to define our custom Server as source to manipulate frames easily before sending them across the network(see this [doc][webgear-cs] example). +In layman's terms, WebGear acts as a powerful **Video Broadcaster** that transmits live video-frames to any web-browser in the network. Additionally, WebGear API also provides a special internal wrapper around [VideoGear](#videogear), which itself provides internal access to both [CamGear](#camgear) and [PiGear](#pigear) APIs, thereby granting it exclusive power of broadcasting frames from any incoming stream. It also allows us to define our custom Server as source to transform frames easily before sending them across the network(see this [doc][webgear-cs] example). **Below is a snapshot of a WebGear Video Server in action on Chrome browser:** @@ -558,7 +557,7 @@ web.shutdown() WebGear_RTC is implemented with the help of [**aiortc**][aiortc] library which is built on top of asynchronous I/O framework for Web Real-Time Communication (WebRTC) and Object Real-Time Communication (ORTC) and supports many features like SDP generation/parsing, Interactive Connectivity Establishment with half-trickle and mDNS support, DTLS key and certificate generation, DTLS handshake, etc. -WebGear_RTC can handle [multiple consumers][webgear_rtc-mc] seamlessly and provides native support for ICE _(Interactive Connectivity Establishment)_ protocol, STUN _(Session Traversal Utilities for NAT)_, and TURN _(Traversal Using Relays around NAT)_ servers that help us to easily establish direct media connection with the remote peers for uninterrupted data flow. It also allows us to define our custom Server as a source to manipulate frames easily before sending them across the network(see this [doc][webgear_rtc-cs] example). +WebGear_RTC can handle [multiple consumers][webgear_rtc-mc] seamlessly and provides native support for ICE _(Interactive Connectivity Establishment)_ protocol, STUN _(Session Traversal Utilities for NAT)_, and TURN _(Traversal Using Relays around NAT)_ servers that help us to easily establish direct media connection with the remote peers for uninterrupted data flow. It also allows us to define our custom Server as a source to transform frames easily before sending them across the network(see this [doc][webgear_rtc-cs] example). WebGear_RTC API works in conjunction with [**Starlette**][starlette]'s ASGI application and provides easy access to its complete framework. WebGear_RTC can also flexibly interact with Starlette's ecosystem of shared middleware, mountable applications, [Response classes](https://www.starlette.io/responses/), [Routing tables](https://www.starlette.io/routing/), [Static Files](https://www.starlette.io/staticfiles/), [Templating engine(with Jinja2)](https://www.starlette.io/templates/), etc. @@ -615,7 +614,7 @@ web.shutdown() NetGear_Async is built on [`zmq.asyncio`][asyncio-zmq], and powered by a high-performance asyncio event loop called [**`uvloop`**][uvloop] to achieve unmatchable high-speed and lag-free video streaming over the network with minimal resource constraints. NetGear_Async can transfer thousands of frames in just a few seconds without causing any significant load on your system. 
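
As a quick taste of how little code this takes, below is a minimal sender-side sketch (the `test.mp4` source path is purely an illustrative assumption; a matching receiver is simply created with `receive_mode=True` on the client machine):

```python
# Minimal NetGear_Async sender sketch (illustrative only)
import asyncio

from vidgear.gears.asyncio import NetGear_Async

# initialize and launch the asyncio frame server with a suitable source
server = NetGear_Async(source="test.mp4", logging=True).launch()

if __name__ == "__main__":
    # run the server's streaming task on its own event loop
    asyncio.set_event_loop(server.loop)
    try:
        server.loop.run_until_complete(server.task)
    except (KeyboardInterrupt, SystemExit):
        pass
    finally:
        # safely terminate the server
        server.close(skip_loop=True)
```
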
-NetGear_Async provides complete server-client handling and options to use variable protocols/patterns similar to [NetGear API](#netgear). Furthermore, NetGear_Async allows us to define our custom Server as source to manipulate frames easily before sending them across the network(see this [doc][netgear_Async-cs] example). +NetGear_Async provides complete server-client handling and options to use variable protocols/patterns similar to [NetGear API](#netgear). Furthermore, NetGear_Async allows us to define our custom Server as source to transform frames easily before sending them across the network(see this [doc][netgear_Async-cs] example). NetGear_Async now supports additional [**bidirectional data transmission**][btm_netgear_async] between receiver(client) and sender(server) while transferring video-frames. Users can easily build complex applications such as like [Real-Time Video Chat][rtvc] in just few lines of code. diff --git a/docs/gears/camgear/advanced/source_params.md b/docs/gears/camgear/advanced/source_params.md index 01cb9287d..0c91b9cdb 100644 --- a/docs/gears/camgear/advanced/source_params.md +++ b/docs/gears/camgear/advanced/source_params.md @@ -20,17 +20,22 @@ limitations under the License. # Source Tweak Parameters for CamGear API -  +
+ Source Tweak Parameters +

 ## Overview

-The [`options`](../../params/#options) dictionary parameter in CamGear, gives user the ability to alter various **Source Tweak Parameters** available within [OpenCV's VideoCapture Class](https://docs.opencv.org/master/d8/dfe/classcv_1_1VideoCapture.html#a57c0e81e83e60f36c83027dc2a188e80). These tweak parameters can be used to manipulate input source Camera-Device properties _(such as its brightness, saturation, size, iso, gain etc.)_ seamlessly. Thereby, All Source Tweak Parameters supported by CamGear API are disscussed in this document.
+The [`options`](../../params/#options) dictionary parameter in CamGear gives the user the ability to alter various parameters available within [OpenCV's VideoCapture Class](https://docs.opencv.org/master/d8/dfe/classcv_1_1VideoCapture.html#a57c0e81e83e60f36c83027dc2a188e80).
+
+These tweak parameters can be used to transform input Camera-Source properties _(such as its brightness, saturation, size, iso, gain etc.)_ seamlessly. All parameters supported by CamGear API are discussed in this document.

 &nbsp;

-!!! quote ""
-    ### Exclusive CamGear Parameters
+### Exclusive CamGear Parameters
+
+!!! quote ""

     In addition to Source Tweak Parameters, CamGear also provides some exclusive attributes for its [`options`](../../params/#options) dictionary parameters. These attributes are as follows:

diff --git a/docs/gears/camgear/usage.md b/docs/gears/camgear/usage.md
index f40d4f10f..d2b377c13 100644
--- a/docs/gears/camgear/usage.md
+++ b/docs/gears/camgear/usage.md
@@ -186,7 +186,7 @@ stream.stop()

 ## Using CamGear with Variable Camera Properties

-CamGear API also flexibly support various **Source Tweak Parameters** available within [OpenCV's VideoCapture API](https://docs.opencv.org/master/d4/d15/group__videoio__flags__base.html#gaeb8dd9c89c10a5c63c139bf7c4f5704d). These tweak parameters can be used to manipulate input source Camera-Device properties _(such as its brightness, saturation, size, iso, gain etc.)_ seamlessly, and can be easily applied in CamGear API through its `options` dictionary parameter by formatting them as its attributes. The complete usage example is as follows:
+CamGear API also flexibly supports various **Source Tweak Parameters** available within [OpenCV's VideoCapture API](https://docs.opencv.org/master/d4/d15/group__videoio__flags__base.html#gaeb8dd9c89c10a5c63c139bf7c4f5704d). These tweak parameters can be used to transform input source Camera-Device properties _(such as its brightness, saturation, size, iso, gain etc.)_ seamlessly, and can be easily applied in CamGear API through its `options` dictionary parameter by formatting them as its attributes. The complete usage example is as follows:

 !!! tip "All the supported Source Tweak Parameters can be found [here ➶](../advanced/source_params/#source-tweak-parameters-for-camgear-api)"

diff --git a/docs/gears/netgear_async/overview.md b/docs/gears/netgear_async/overview.md
index 162ff784a..fc2e505cf 100644
--- a/docs/gears/netgear_async/overview.md
+++ b/docs/gears/netgear_async/overview.md
@@ -30,7 +30,7 @@ limitations under the License.

 NetGear_Async is built on [`zmq.asyncio`](https://pyzmq.readthedocs.io/en/latest/api/zmq.asyncio.html), and powered by a high-performance asyncio event loop called [**`uvloop`**](https://github.com/MagicStack/uvloop) to achieve unmatchable high-speed and lag-free video streaming over the network with minimal resource constraints. NetGear_Async can transfer thousands of frames in just a few seconds without causing any significant load on your system.

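
In practice, plugging a custom Server-side source into NetGear_Async boils down to handing it an asynchronous frame generator through its configuration; a rough sketch, assuming OpenCV and a webcam at device index 0 (the linked usage doc carries the authoritative example):

```python
# Rough sketch of a NetGear_Async custom frame source (illustrative only)
import asyncio

import cv2

from vidgear.gears.asyncio import NetGear_Async

# initialize the Server without any built-in source
server = NetGear_Async(source=None, logging=True)


async def my_frame_generator():
    # open any valid video feed (a webcam at index 0 is assumed here)
    stream = cv2.VideoCapture(0)
    while True:
        grabbed, frame = stream.read()
        if not grabbed:
            break
        # {transform the frame here before it goes onto the network}
        yield frame
        await asyncio.sleep(0)
    stream.release()


if __name__ == "__main__":
    # register the custom generator as the server's source, then launch it
    server.config["generator"] = my_frame_generator()
    server.launch()
    asyncio.set_event_loop(server.loop)
    try:
        server.loop.run_until_complete(server.task)
    except (KeyboardInterrupt, SystemExit):
        pass
    finally:
        server.close(skip_loop=True)
```
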
-NetGear_Async provides complete server-client handling and options to use variable protocols/patterns similar to [NetGear API](../../netgear/overview/). Furthermore, NetGear_Async allows us to define our custom Server as source to manipulate frames easily before sending them across the network(see this [doc](../usage/#using-netgear_async-with-a-custom-sourceopencv) example).
+NetGear_Async provides complete server-client handling and options to use variable protocols/patterns similar to [NetGear API](../../netgear/overview/). Furthermore, NetGear_Async allows us to define our custom Server as source to transform frames easily before sending them across the network(see this [doc](../usage/#using-netgear_async-with-a-custom-sourceopencv) example).

 NetGear_Async now supports additional [**bidirectional data transmission**](../advanced/bidirectional_mode) between receiver(client) and sender(server) while transferring frames. Users can easily build complex applications such as like [Real-Time Video Chat](../advanced/bidirectional_mode/#using-bidirectional-mode-for-video-frames-transfer) in just few lines of code.

diff --git a/docs/gears/netgear_async/usage.md b/docs/gears/netgear_async/usage.md
index f0a123657..0220d1252 100644
--- a/docs/gears/netgear_async/usage.md
+++ b/docs/gears/netgear_async/usage.md
@@ -223,7 +223,7 @@ if __name__ == "__main__":

 ## Using NetGear_Async with a Custom Source(OpenCV)

-NetGear_Async allows you to easily define your own custom Source at Server-end that you want to use to manipulate your frames before sending them onto the network.
+NetGear_Async allows you to easily define your own custom Source at Server-end that you want to use to transform your frames before sending them onto the network.

 Let's implement a bare-minimum example with a Custom Source using NetGear_Async API and OpenCV:

diff --git a/docs/gears/stabilizer/usage.md b/docs/gears/stabilizer/usage.md
index 65167fe37..fd423ba41 100644
--- a/docs/gears/stabilizer/usage.md
+++ b/docs/gears/stabilizer/usage.md
@@ -145,7 +145,7 @@ stream.release()

 ## Using Stabilizer with Variable Parameters

-Stabilizer class provide certain [parameters](../params/) which you can use to manipulate its internal properties. The complete usage example is as follows:
+Stabilizer class provides certain [parameters](../params/) which you can use to tweak its internal properties. The complete usage example is as follows:

 ```python
 # import required libraries
diff --git a/docs/gears/streamgear/introduction.md b/docs/gears/streamgear/introduction.md
index 205e73f0c..027b1a0d3 100644
--- a/docs/gears/streamgear/introduction.md
+++ b/docs/gears/streamgear/introduction.md
@@ -39,7 +39,7 @@ SteamGear currently supports [**MPEG-DASH**](https://www.encoding.com/mpeg-dash/

 SteamGear also creates a Manifest file _(such as MPD in-case of DASH)_ or a Master Playlist _(such as M3U8 in-case of Apple HLS)_ besides segments that describe these segment information _(timing, URL, media characteristics like video resolution and adaptive bit rates)_ and is provided to the client before the streaming session.

-!!! tip "For streaming with older traditional protocols such as RTMP, RTSP/RTP you could use [WriteGear](../../writegear/introduction/) API instead."
+!!! alert "For streaming with older traditional protocols such as RTMP, RTSP/RTP you could use [WriteGear](../../writegear/introduction/) API instead."

 &nbsp;
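
To make this transcoding flow concrete, here is a minimal Single-Source Mode sketch that turns one video file into DASH chunks plus their MPD manifest (the `foo.mp4` and `dash_out.mpd` names are illustrative assumptions):

```python
# Minimal StreamGear Single-Source Mode sketch (illustrative only)
from vidgear.gears import StreamGear

# point -video_source at any valid video file to activate Single-Source Mode
stream_params = {"-video_source": "foo.mp4"}

# describe where the DASH manifest (and its segments) should be written
streamer = StreamGear(output="dash_out.mpd", logging=True, **stream_params)

# transcode the whole source into streamable chunks plus the manifest
streamer.transcode_source()

# safely terminate the StreamGear instance
streamer.terminate()
```
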
@@ -52,10 +52,17 @@ SteamGear also creates a Manifest file _(such as MPD in-case of DASH)_ or a Mast

 * StreamGear **MUST** requires FFmpeg executables for its core operations. Follow these dedicated [Platform specific Installation Instructions ➶](../ffmpeg_install/) for its installation.

-  * :warning: StreamGear API will throw **RuntimeError**, if it fails to detect valid FFmpeg executables on your system.
+  * :warning: StreamGear API will throw **RuntimeError**, if it fails to detect valid FFmpeg executable on your system.

   * It is advised to enable logging _([`logging=True`](../params/#logging))_ on the first run for easily identifying any runtime errors.

+!!! tip "Useful Links"
+
+    - Checkout [this detailed blogpost](https://ottverse.com/mpeg-dash-video-streaming-the-complete-guide/) on how MPEG-DASH works.
+    - Checkout [this detailed blogpost](https://ottverse.com/hls-http-live-streaming-how-does-it-work/) on how HLS works.
+    - Checkout [this detailed blogpost](https://ottverse.com/hls-http-live-streaming-how-does-it-work/) for HLS vs. MPEG-DASH comparison.
+
+
 &nbsp;

 ## Mode of Operations

@@ -68,9 +75,9 @@ StreamGear primarily operates in following independent modes for transcoding:

     Rather, you can enable live-streaming in Real-time Frames Mode by using using exclusive [`-livestream`](../params/#a-exclusive-parameters) attribute of `stream_params` dictionary parameter in StreamGear API. Checkout [this usage example](../rtfm/usage/#bare-minimum-usage-with-live-streaming) for more information.

-- [**Single-Source Mode**](../ssm/overview): In this mode, StreamGear transcodes entire video/audio file _(as opposed to frames by frame)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well, when you're transcoding lossless long-duration videos(with audio) for streaming and required no extra efforts or interruptions. But on the downside, the provided source cannot be changed or manipulated before sending onto FFmpeg Pipeline for processing.
+- [**Single-Source Mode**](../ssm/overview): In this mode, StreamGear **transcodes the entire video file** _(as opposed to frame-by-frame)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well when you're transcoding long-duration lossless videos (with audio) for streaming that requires no interruptions. But on the downside, the provided source cannot be flexibly manipulated or transformed before sending onto FFmpeg Pipeline for processing.

-- [**Real-time Frames Mode**](../rtfm/overview): In this mode, StreamGear directly transcodes video-frames _(as opposed to a entire file)_, into a sequence of multiple smaller chunks/segments for streaming. In this mode, StreamGear supports real-time [`numpy.ndarray`](https://numpy.org/doc/1.18/reference/generated/numpy.ndarray.html#numpy-ndarray) frames, and process them over FFmpeg pipeline. But on the downside, audio has to added manually _(as separate source)_ for streams.
+- [**Real-time Frames Mode**](../rtfm/overview): In this mode, StreamGear directly **transcodes frame-by-frame** _(as opposed to an entire video file)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well when you desire to flexibly manipulate or transform [`numpy.ndarray`](https://numpy.org/doc/1.18/reference/generated/numpy.ndarray.html#numpy-ndarray) frames in real-time before sending them onto FFmpeg Pipeline for processing.
But on the downside, audio has to be added manually _(as separate source)_ for streams.

 &nbsp;

@@ -125,15 +132,6 @@ from vidgear.gears import StreamGear

 ## Recommended Players

-!!! tip "Useful Links"
-
-    - Checkout [this detailed blogpost](https://ottverse.com/mpeg-dash-video-streaming-the-complete-guide/) on how MPEG-DASH works.
-
-    - Checkout [this detailed blogpost](https://ottverse.com/hls-http-live-streaming-how-does-it-work/) on how HLS works.
-
-    - Checkout [this detailed blogpost](https://ottverse.com/hls-http-live-streaming-how-does-it-work/) for HLS vs. MPEG-DASH comparsion.
-
-
 === "GUI Players"
     - [x] **[MPV Player](https://mpv.io/):** _(recommended)_ MPV is a free, open source, and cross-platform media player. It supports a wide variety of media file formats, audio and video codecs, and subtitle types.
     - [x] **[VLC Player](https://www.videolan.org/vlc/releases/3.0.0.html):** VLC is a free and open source cross-platform multimedia player and framework that plays most multimedia files as well as DVDs, Audio CDs, VCDs, and various streaming protocols.
diff --git a/docs/gears/streamgear/rtfm/overview.md b/docs/gears/streamgear/rtfm/overview.md
index d1c07ab82..0f8649233 100644
--- a/docs/gears/streamgear/rtfm/overview.md
+++ b/docs/gears/streamgear/rtfm/overview.md
@@ -29,13 +29,13 @@ limitations under the License.

 ## Overview

-When no valid input is received on [`-video_source`](../../params/#a-exclusive-parameters) attribute of [`stream_params`](../../params/#supported-parameters) dictionary parameter, StreamGear API activates this mode where it directly transcodes real-time [`numpy.ndarray`](https://numpy.org/doc/1.18/reference/generated/numpy.ndarray.html#numpy-ndarray) video-frames _(as opposed to a entire file)_ into a sequence of multiple smaller chunks/segments for streaming.
+When no valid input is received on [`-video_source`](../../params/#a-exclusive-parameters) attribute of [`stream_params`](../../params/#supported-parameters) dictionary parameter, StreamGear API activates this mode where it directly transcodes real-time [`numpy.ndarray`](https://numpy.org/doc/1.18/reference/generated/numpy.ndarray.html#numpy-ndarray) video-frames _(as opposed to an entire video file)_ into a sequence of multiple smaller chunks/segments for adaptive streaming.

-SteamGear supports both [**MPEG-DASH**](https://www.encoding.com/mpeg-dash/) _(Dynamic Adaptive Streaming over HTTP, ISO/IEC 23009-1)_ and [**Apple HLS**](https://developer.apple.com/documentation/http_live_streaming) _(HTTP Live Streaming)_ with this mode.
+This mode works exceptionally well when you desire to flexibly manipulate or transform video-frames in real-time before sending them onto FFmpeg Pipeline for processing. But on the downside, StreamGear **DOES NOT** automatically map video-source's audio to generated streams with this mode. You need to manually assign separate audio-source through [`-audio`](../../params/#a-exclusive-parameters) attribute of `stream_params` dictionary parameter.

-In this mode, StreamGear **DOES NOT** automatically maps video-source audio to generated streams. You need to manually assign separate audio-source through [`-audio`](../../params/#a-exclusive-parameters) attribute of `stream_params` dictionary parameter.
+StreamGear supports both [**MPEG-DASH**](https://www.encoding.com/mpeg-dash/) _(Dynamic Adaptive Streaming over HTTP, ISO/IEC 23009-1)_ and [**Apple HLS**](https://developer.apple.com/documentation/http_live_streaming) _(HTTP Live Streaming)_ with this mode.
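
A bare-bones sketch of this frame-by-frame flow, assuming frames read through CamGear from a webcam at index 0 and an illustrative `dash_out.mpd` output (audio would still have to be attached separately through the `-audio` attribute):

```python
# Rough Real-time Frames Mode sketch (illustrative only)
from vidgear.gears import CamGear, StreamGear

# open any valid video source (a webcam at index 0 is assumed here)
stream = CamGear(source=0).start()

# with no -video_source given, StreamGear activates Real-time Frames Mode
streamer = StreamGear(output="dash_out.mpd", logging=True)

while True:
    frame = stream.read()
    if frame is None:
        break
    # {flexibly manipulate/transform the frame here}
    streamer.stream(frame)

# safely close the video source and terminate the streamer
stream.stop()
streamer.terminate()
```
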
-This mode provide [`stream()`](../../../../bonus/reference/streamgear/#vidgear.gears.streamgear.StreamGear.stream) function for directly trancoding video-frames into streamable chunks over the FFmpeg pipeline.
+For this mode, StreamGear API provides exclusive [`stream()`](../../../../bonus/reference/streamgear/#vidgear.gears.streamgear.StreamGear.stream) method for directly transcoding video-frames into streamable chunks.

 &nbsp;

diff --git a/docs/gears/streamgear/ssm/overview.md b/docs/gears/streamgear/ssm/overview.md
index 99d985a04..b71ba4084 100644
--- a/docs/gears/streamgear/ssm/overview.md
+++ b/docs/gears/streamgear/ssm/overview.md
@@ -21,18 +21,20 @@ limitations under the License.

 # StreamGear API: Single-Source Mode
- StreamGear Flow Diagram -
StreamGear API's generalized workflow
+ Single-Source Mode Flow Diagram +
Single-Source Mode generalized workflow

 ## Overview

-In this mode, StreamGear transcodes entire video/audio file _(as opposed to frames by frame)_ into a sequence of multiple smaller chunks/segments for streaming. This mode works exceptionally well, when you're transcoding lossless long-duration videos(with audio) for streaming and required no extra efforts or interruptions. But on the downside, the provided source cannot be changed or manipulated before sending onto FFmpeg Pipeline for processing.
+In this mode, StreamGear transcodes the entire audio-video file _(as opposed to frame-by-frame)_ into a sequence of multiple smaller chunks/segments for adaptive streaming.
+
+This mode works exceptionally well when you're transcoding long-duration lossless video files (with audio) for streaming that requires no interruptions. But on the downside, the provided source cannot be flexibly manipulated or transformed before sending onto FFmpeg Pipeline for processing.

 SteamGear supports both [**MPEG-DASH**](https://www.encoding.com/mpeg-dash/) _(Dynamic Adaptive Streaming over HTTP, ISO/IEC 23009-1)_ and [**Apple HLS**](https://developer.apple.com/documentation/http_live_streaming) _(HTTP Live Streaming)_ with this mode.

-This mode provide [`transcode_source()`](../../../../bonus/reference/streamgear/#vidgear.gears.streamgear.StreamGear.transcode_source) function to process audio-video files into streamable chunks.
+For this mode, StreamGear API provides exclusive [`transcode_source()`](../../../../bonus/reference/streamgear/#vidgear.gears.streamgear.StreamGear.transcode_source) method to easily process audio-video files into streamable chunks.

 This mode can be easily activated by assigning suitable video path as input to [`-video_source`](../../params/#a-exclusive-parameters) attribute of [`stream_params`](../../params/#stream_params) dictionary parameter, during StreamGear initialization.

diff --git a/docs/gears/webgear/advanced.md b/docs/gears/webgear/advanced.md
index a29751622..ff7aea9dd 100644
--- a/docs/gears/webgear/advanced.md
+++ b/docs/gears/webgear/advanced.md
@@ -74,7 +74,7 @@ web.shutdown()
 !!! new "New in v0.2.1"
     This example was added in `v0.2.1`.

-WebGear allows you to easily define your own custom Source that you want to use to manipulate your frames before sending them onto the browser.
+WebGear allows you to easily define your own custom Source that you want to use to transform your frames before sending them onto the browser.

 !!! warning "JPEG Frame-Compression and all of its [performance enhancing attributes](../usage/#performance-enhancements) are disabled with a Custom Source!"

diff --git a/docs/gears/webgear_rtc/advanced.md b/docs/gears/webgear_rtc/advanced.md
index 99e094c61..5ca6f1624 100644
--- a/docs/gears/webgear_rtc/advanced.md
+++ b/docs/gears/webgear_rtc/advanced.md
@@ -64,7 +64,7 @@ web.shutdown()

 ## Using WebGear_RTC with a Custom Source(OpenCV)

-WebGear_RTC allows you to easily define your own Custom Media Server with a custom source that you want to use to manipulate your frames before sending them onto the browser.
+WebGear_RTC allows you to easily define your own Custom Media Server with a custom source that you want to use to transform your frames before sending them onto the browser.

Let's implement a bare-minimum example with a Custom Source using WebGear_RTC API and OpenCV: diff --git a/docs/gears/webgear_rtc/overview.md b/docs/gears/webgear_rtc/overview.md index df6d7cb39..d7087f9aa 100644 --- a/docs/gears/webgear_rtc/overview.md +++ b/docs/gears/webgear_rtc/overview.md @@ -34,7 +34,7 @@ limitations under the License. WebGear_RTC is implemented with the help of [**aiortc**](https://aiortc.readthedocs.io/en/latest/) library which is built on top of asynchronous I/O framework for Web Real-Time Communication (WebRTC) and Object Real-Time Communication (ORTC) and supports many features like SDP generation/parsing, Interactive Connectivity Establishment with half-trickle and mDNS support, DTLS key and certificate generation, DTLS handshake, etc. -WebGear_RTC can handle [multiple consumers](../../webgear_rtc/advanced/#using-webgear_rtc-as-real-time-broadcaster) seamlessly and provides native support for ICE _(Interactive Connectivity Establishment)_ protocol, STUN _(Session Traversal Utilities for NAT)_, and TURN _(Traversal Using Relays around NAT)_ servers that help us to easily establish direct media connection with the remote peers for uninterrupted data flow. It also allows us to define our custom Server as a source to manipulate frames easily before sending them across the network(see this [doc](../../webgear_rtc/advanced/#using-webgear_rtc-with-a-custom-sourceopencv) example). +WebGear_RTC can handle [multiple consumers](../../webgear_rtc/advanced/#using-webgear_rtc-as-real-time-broadcaster) seamlessly and provides native support for ICE _(Interactive Connectivity Establishment)_ protocol, STUN _(Session Traversal Utilities for NAT)_, and TURN _(Traversal Using Relays around NAT)_ servers that help us to easily establish direct media connection with the remote peers for uninterrupted data flow. It also allows us to define our custom Server as a source to transform frames easily before sending them across the network(see this [doc](../../webgear_rtc/advanced/#using-webgear_rtc-with-a-custom-sourceopencv) example). WebGear_RTC API works in conjunction with [**Starlette**](https://www.starlette.io/) ASGI application and can also flexibly interact with Starlette's ecosystem of shared middleware, mountable applications, [Response classes](https://www.starlette.io/responses/), [Routing tables](https://www.starlette.io/routing/), [Static Files](https://www.starlette.io/staticfiles/), [Templating engine(with Jinja2)](https://www.starlette.io/templates/), etc. diff --git a/docs/gears/writegear/introduction.md b/docs/gears/writegear/introduction.md index 5deb1010e..2e5d7e609 100644 --- a/docs/gears/writegear/introduction.md +++ b/docs/gears/writegear/introduction.md @@ -45,7 +45,7 @@ WriteGear primarily operates in following modes: * [**Compression Mode**](../compression/overview/): In this mode, WriteGear utilizes powerful **FFmpeg** inbuilt encoders to encode lossless multimedia files. This mode provides us the ability to exploit almost any parameter available within FFmpeg, effortlessly and flexibly, and while doing that it robustly handles all errors/warnings quietly. -* [**Non-Compression Mode**](../non_compression/overview/): In this mode, WriteGear utilizes basic **OpenCV's inbuilt VideoWriter API** tools. This mode also supports all parameters manipulation available within VideoWriter API, but it lacks the ability to manipulate encoding parameters and other important features like video compression, audio encoding, etc. 
+* [**Non-Compression Mode**](../non_compression/overview/): In this mode, WriteGear utilizes basic **OpenCV's inbuilt VideoWriter API** tools. This mode also supports all parameter transformations available within OpenCV's VideoWriter API, but it lacks the ability to manipulate encoding parameters and other important features like video compression, audio encoding, etc.   diff --git a/docs/overrides/assets/images/stream_tweak.png b/docs/overrides/assets/images/stream_tweak.png new file mode 100644 index 0000000000000000000000000000000000000000..6a956fd32cfdd318f61ac68ecbaa9617c5b0cd88 GIT binary patch literal 44390 zcmXtfRaBcz*LEOCaRL0+}(>qad#28)?PZODdfi*_-$a%I9txE&Cho z3uGeD00Kd=`ua&yYw`M8taE7D%{|>~Z?xTIZ6&m2tPXEzbhyQGwBtr%FV{ygw}?GO zvKhtwb;6d6nGMJaKY>$bn+25d27I*LO1>EQ zt(>y@+e0EM_*@VqtOrqqg!q$FeQ5ymXh1j!6Ly?L{w@ z;2a43npg1^*f}jipA8)7INiH^8yxWnfkXkRD_!{3u}rT)DW#XsTF6=7P(Ga6ElCUO zLvme*k)UPAXAtQMP{NYTb{e==pu1ek(nrpR1Iw#Y9}){nDC4LQlG2$pgm}h@QN_Q= zLx_bWpr^W1u~NJpf+>>aX_A<*3jYXXvR~Mzxe=EjP@v!sq*jv1{vJ>uws%@||A}be z@f`ZPXH&$ZI%HM`pUI^vD?$_u_KA>BxOgjI<`2V9;FU2exd25RtQ+&TzGt9GjbHc_Znw{4#1*uoCvko&evf zCB#eu#B%9{3!97Rj|879V!}c6S*weT^zmOEvPco89UsLZBPAB9&{HQVa(VDTZUU*xm5J*jKT zYN8^o2Z4Oo9PFn^#H&@np0K^K?b*UA@DsEs9hbwg9wK2+`i4*Dk1I(6P{u4df|4F= zfNfiQy~fp!E+_=-2k@SJ7vt?_e#~=m{4<-|ldDiwN&ubt_sHW@Kz&nWe+?NtjQxQg zB)qmz$jdnY#fhTFGEKUaE_V0ZH8#`=AV}W%_~?7~FfcS^bMNEiRCB7Su0DP#2$>om z&-Xt#I4Ejt{Ag`$T?B&}K2_AzXgvM#S$<)BJKHHlP2u}!#1Y5V>JDxmI#4*1+FWY4rZQ#?@-u8hobn>oO8Y@j2|p_hGVX}yRNq(^Ez z^b}+YY(_2t*p{m16uIn;PS%*pS*^i3CUudtD{@+qKHB_u9MEBGb6!Xu$bWkOghiuU zw`+9!_vy_IT7(HMK%c+=oD+`q z1F1G02;kxzmpvZt2HG3XGUD*L`iC3wnev(OnFGj-cm0})$0e#ft;M0c_SESkr8<*oXA674U|M2c2okAf$(=qa6So6J5sl$s4w+SR&5+*;z z6(_G+Q`&^}_JLI1*_ELlob5BQ9!-PtNu)AYTQ!9wBlNeC|6TE96k$AJ<2F@}bTeN#3_gk%)>mR?bhGBr0-2mQ;NTphxRGtFb!TR@ar>qfcTCR- z=)`fodz-?Tz~0WLzt$}jsXLGxN9FcnOhQM%v?o|Z;%qaaR!C|!dc-7R(qlO2-!aT# zc9i*xj?BLaT7!b>$nz`*qMOzhxql<#{3r1#8nU(BezY^92^SzwlKT8P!?yG38q>AL zTQv@<}J&xj1Sb$8||_pY78=+Al-Rar&!JLhK~} zIOcX94rn)kGxId+HQS>kDKSgaWuT^aq`KMRgP|UmeF$a$=0f*e2HlK`_wU+{6TaV1d)x;eD0p|8&n%&A?O6op4c~j|R(l z2N1M>vFO&J7>jr%Lc>#j6=aJUbtfEWSilLl;yoIee+M7N5U9j;l{r(V*6r-cYXDTcqWjf7RQMXxm+adr&gXXE3hH03RU11r2;~mr5&Pm1SJl^&-Zhk|81X6m8zv> z*|#bjv2NZ%mevJvRnD;pe>a;A5M0$<^TBVdnT#!`BG}_Zx+eFe7{*h(NbninycQ<= zq58KXDQq_S$*wZuaJMKx>i|o4m{`x2PC6Xd4m0spO7dQq5Y}8#oU-dg#nIfbhTCSVC#;gqsoB9_jLl_Fsv1K#OI zI1Pu-Nk`M<{O3zf*lnS529FHGnG?Ei`Xs;KjnEBJ^HyJ6n$uP7JXw@K+WW|FUBb z4r@?*TNjP*Xa=ubWy>!4jtqVuHs}gSobV*?Y{p?S$LNz;4$4(dympZ>v-Sp%xS>j)5%Y(Zc>dh zIRE9M_5e)nIfH{5F=3Ro9mL7eUF^cd&!ZwIZV`^IS_o>s$ z7BI63F7eR;`tfNST|E2yPC4T-x2hZ z%{#i0^FiYWcQWld_?EEdo(KRWUDO^cZfVFS4pY}M;&1I?o_zs`aymb(8`G^R?M*;x zG!plR?)U;a$lISChqDCLD%+0VeTMmZfirz54OM2pkQ!8OO#~@rGV$PM{i*{yCnQx< zO_3!y?nS=3AUkP)8I*(0p}E-Du)6jWN1KFNmjSR?>qj z>D-V0$5=6tgxH)h_ZIqip4OS{ceB#;S1Mm}z!{}={Jw~m;b&X7An)8-zcO1eE9Kp8CH5$986$CXw$%|+s9 z^mt60g}1e|iF~ssk(yG(iftZSV9Esay-X>HFsVe7DDTg45}L*5rS;leU8>4=tW^ zG~Vr~xoTlk=bT^}W&HN=Xoq1ePwDF7Gfhgz1bWF=G3j3_HVEfKGA?sJGW{#C`GK(L zetN;%LX!lSCXbUyugmNO3agUltrpy1kXEnj%mZ!MBCn#%yhC!j77YK)Cd0 z8vpvC6!(Ezy3a{6!(su5lR}#-95>AKVnpyYt)=3k^z(W+jeqli65w5t(DpheVHklUW4pxT2A(#4AUj!lduQJA#!IrMYC_$Zcw5F?WL_H?dN|_OP3IWy zK4rhi#+n~{_lF*^85vnws&pGz6-V(^RahL{($uJ0=;sx!>Z?%}$#0R{LqM8(Sc1I2 zB^IYuMUIMwaTch5qFQ}->WxH=OZ8a~LKi^VZ##jpb>7cZAp=DLbjplPE3)y^VZF0H z2h(Mk^&RU&Zni)0L{9@Tw~8lD5Hx`^iAgQCcAgjZ<->YtN$d-!|4`-}8Lx%>8q81o z!yC?5rZfRZ4RhPEBFE|gVhdTts4=0foP9DtdLZ`l-|p3}f?%9mI*{;C(sz2R*$>|A z^g)d%Fmg|Y*t)GS?07HC>4;CJ8?wV~2W;o`B%JlCLK`UNThoSEwxt*~<`f5WCV%}A>oPQ)!7J#^l=T>f$<~eYjc~DQK<<~5n2$Q7w{)H2eZSqG 
z*+yi<&<03RPgw&^5(>Es&SV;}cE5B@q6bD0|98|Bg8H(}B%$#h%ESD^L6({ z-t4L;zJ+=^;xFut>IMC*PLeqLMR-1M8rRV3tZHIoID`u5C62R{iy*-bM^2}j-|-rn z3wKL1bN~`SvDSw+$OTp(>dW2(Iduv2j;rmmapJkD^!N;wk>{PPeCg!{coC6Z>0YMJs-&>uY+kGC1b_!|1e(kBKCEuuZ`RDX9}JOMM6$oTy&467*V;x zRje`WdhaK?O{2<}$2|$1oviju@j#gxC9_w7GEaXpWAy}XLI*-`dV$?ZgxyO!4`62m zx=O_1+^?(9L9Rgc-3Y$@3_ES!{KF|$suY}4yPW*@V+k4{YRd00+}}&PhcSR79C(Uq zEcB|Vi-I-^1Bv$Re6U~vu@!yD1Z+P|wVlPz8TwSb^E|hCYk@m2-wyorp0EnjH`H&E zy4U~7c1Q6*RUC3}jK99Uj+6AYZ9)yf_leXAbw{3bv1V!Lmycd`_#;3>o%pwCcKp3A zA!n0eA@#@YB(+CAWAMuJp_#0bbYdM<{>OV?rC`n{LXRkEwtYGl!fJg8vs45peYiDY zj*W_dA+F7Q;X#EBTZ$dwJi(jdJmlLyb4~v8&*<;mHP<=(^y{AgA_tmeM?cx~08en-NI&1Zm{Qh)t` z;OZ=G9L57}`3nUDG#<-N&Z}N~{B=iP-mR~5Xiic}7;}vfr?USTj~nIvxi-co=MHG+ z43Fd!Cv8puO{B6gakm$68~L979wd(awtVwlY*VZ|eBjbgrB_HVBOO`D84!4f-vUTq zS$m5fsGyLMO3y=9FW|1G^p#U~yidl=ArJD0A`(;)zK&sAY9g?z9}~2XjG8Ww)w1UQ zz`wwSq>uc{v7k_Z^aK@7?K=K`MD?NNRba-w`LMfHT^v> z{~^57a+%LNrDz@+y&j^GTxKm2LkX~8N~dlmi?Lp6DJzDjeR3y=Snuf%((Uct{B<3Uwvb5f zu>5<_1wgoDTE$sDdXtaTzl4rO*g3C|#3(a+xOtvs#q|jN0^X3UM(p{7#LYjxBqh3F z(-W$I?jMmMyg4K-i3;=C*M@zzU-7D$2x6CU>9+ zpNzr96Ik{0gu7D>)*w3NwXgoF?RY5TE`x8NSaGBfI^Ez=->9&>e&f`hZU%}jrZ2`} zC)e+Ly9;|{ume_K5Ep)= zuWRC_VEuEb!1Z2{4Z?+5x z6Ch5($@E?->p*h-;)X-($@gmnVu2tz=jhmO=|3q@RzNoQUY(sz?1Y~86VsQTYtsQksflYZ4l3U}&j#H{8IhBH=U6p&u|-CKgFCe5I+Xf{%u56RC&z`?CE0vLksr zceq^gE5^eP4P5ob`^MSX&9ND?+OAHVOaTe&#@iXwex&T!K;Fwk+IO+icD})9gsOUM z$&yMzCsi`r&V3fed@BgGrVQBl18!rwxs-3^a<*q}xDX$Wcz`+8FHQ>{T%|g84|0@))&W1aAgOA5e_`W4X`r8z z_yS36eMuW)h@U`rpG+KJh|Izz=$W;@jhB>IlEhN@_M7bdRgovo-@aKly~m||n_vq{ zJ2La25cz+!#eB)GN!ulTTZsNWCM#je$gPQgAjkDuJS(Z|MeJ5$sx@rk)yigF3yP8c z7k?(O#FV=sBb&Xs#-o>tc>xJCq5D5%Kl!13(s&Zh-Na@h3#Qq4v#`E~rt*|40A-RP z5ra1+CRh`up=qWAP;4T_VXPAL#hPp9DN2pFf|JNYL>{)gQK`3JArnpVBbokF+hiQ|f$uo!Hj zJt$o5#=m_Z2*aFl1748)4!ZX}TwTU_(dU^5l|iljxl)jB-Ur;B?_@aZ4ihB=%g|rB zNAUa*h6$3aE78n*M8e1_p9lwRBs1kDT%^;nm6=%AN5T2sz)Z{uGlc4&VpG{7d#Bpk zV62&G(%vxr|EIIWuKJ2XXZJWkj=F3Z`;6S5dvC8}nFt_*>(?*}Th_Leu zH5<69M8Q+nnm_Z(2IxECT-GPcOus~PQ2q34U~pP10a=8T!KIroxH*Z)E~M2Gh>mOf z@ebhKeDwI&=p_;1A98#ou;-mvY-6ib-aMqP^~0iQZ$ueK8wf(K6kPTE?wsT*xSG{Y zm>eyosiKQ}xIOMKw$Eg;Aq)~$g|OC9wOx9(5y%tx4opeLq(8|Ng=>b#jEU>z>E(2? 
zRR3-ZViu@sp)12s%#)%FgWX!*!Fml&A-25TG zS=m;42fJDI3C*)L-}1|SNec8^s5L*{7IBT<^+}Idw+=&2Zv4m^R0JofLteP9a!mBi zKjJwheP9V$km-~lXVI`-ppVB$Vvr%@Y(MS()@Ziu#?i7(kRtPwzrmKoYP_w;_2?Uq z^Y3jg>uF}~QTa6Q$#XPX7J%ZwkLr+H!x7}Hqz4Z<3BB47rSN7e{+jw zG+IaQSBdZiBgRuHYpH4Rf|PxE*8H#Q@&N!^W;H%~zR+3fg6#&Q~52Ri* z8pO-l)q38&sc(?NFvEh0uFYY*UR4+4mhP!T3g~YZ_U2Xj&=`P_c|Qy^?DwrWa!W6{ z2a99CY5Z?D=8I=~R%d*~vQSgYs*>%e6Sz7hqHOO@O<#%?8w6`a=T7T4@=0W9F;}?k z3x01aBY>uIqupY45_VMm|V;uc21BAVkaMes-omIp3Al z83o-1am5{9oCFn5)pN7SP024KiCWjcVwnvjGDLEqcywyq3Q~Dc@a?x%Oz@}ZBtIm^ zBVhv2+XH{jdF14fh}5PT`3wIYr0{R&858z(;RoDUgYNt?*XRb7`&T*^zE*L(z>FAE zk>t9=o$eIx<8c;k@0g34PhbxMK!V0E*#db1&v&&5!LbaM(JWL3g#8+@w0V~QGI2jJ z`cEsi``Q#^Zmfwm%pmIds-2YAVpx`jKk+;7i6bFz&5P{D>!ovq1<>It;g+bw?jagy z8^Kphrir}pmD3Q1^akDRfQB8AXV_@2AQ+fdn9P*t>Q?&C2gcJl-JeaWHI0I5EA zI8NfZc^Tne9n=dcM%R++W>(@DBF+hL>5rSm&AB)Z8h6~Jv)T=Gh~CowWdWAE{C4y2 z(6YIM3=mQSMRBi%nV zi;$=5B{l!6awCjm^VS0G>W;SaK{oTp+cW3c&zAfKeS7EMct_P29ZE1xV7=}H_kh;O zUIIQ_g}2rElZ^uolbTD!RgA6~;mvBRu2U*OG8!uabmk}pdnrQ;Sp3UVoxAv zwG8%a5W!JVrR8TOVc~x>0_%`?oxmnzwlypSwx;NP|7r4$Vw^Ai6xwins!EIniwopP z0G0vQC=(n3TE}0mssbvylOxLnNAOH)9lzri<7AfY8}~5(N?^An`z{xU zAXqn);l!YnVVbJ4m;)lqJikU~(9o{wajr+)A-3N;e<<76A+Pn|~kRb9pLf|#SULrS0@`&t^ zG;LwF#t?44Bt$GTfgObMv%Mi97-AcL6#_Z=+}&(DX$uN5JPWut-KAkuMdYp`?uX0) zcSA-3nEQ~a^>w?&daDBnSzQl;FU%Hs<(p@_FIo+s>^xT{Olv1MGTTOFav=$xgl#M= zrwpa5SFr{PDx!9ORX+Shl~)rzgQUm|cEp$x@CHS4O95aIIqMQ1G2@cjnP9pkTAK8e4=PkJ&zJC%V{z8BOho%(WV26 z21Q-4?(j~OLYFXvPvQL0FKv8%Y(W_|k?sg3N2nMcyf`R&r;m3RGj1oFEB0+i$ywv4 z8$E)~pWpaXpgO?}S7%`@QLgK0O$ZVmWIy zG5+5zCQ8j5cE@Irr1Eejd*#^g8o^}@2fP;E#-ZAUe{H^t6W@F^mX@~3;j{WsXGi@U z2c`Dc<#abWY%^TUR!qXr0^ktG*ef3*2BJyz%lyIVEY_4#QXa|yVKj}_p>Rzq8bBg9 zF6+y^dS+EA%-pe0+)@w%? zcf(cx7;BCQbXC~F^PsF$%hP8*TjKe4YbJa(Yb``rhL3UJImsGi*V+6%=fI3kIVv<+ z746?$Oj((OaHNDWre_8?17!@@bQ`0ir@r8JIxaY=*-E;eZx3Mlt{AyX65x;>L4b@e zU%qeQOJz|4&dV!X6=hai@T8h!iDQm%HG$-@ZHs7FM+hPRgRt&*Xve}Jolf06ah_d_ z{djChi}U?{X?a|gKa;Y0ws1A!APOv#kv66@#Ao$=^!DFj1>CSYEBnuNTha-yCj2pA z%?1!y8Uk3$i!atKe0A^ud3}&;Mv`rw2EZ7TxV~vip8TV-yU0x?a5E+e)+JE~IFS5p z{vob8;{m!4XW+;$$kcCGv`9vlT6YYB-f}W>3mn_de_h!{t2YP~)yxlN&&Np8nf}{& zYir{g95}0%JM4t}`RvH8lS$8amxbK6BuwEO-n){wH_O4q&szZAU5+VYMw?m8#wm8)J%Z}kc;XUE+x6A&XIKb`e+gmX^;bcjHbtQ%` zi3dN({)j_|%&=m3GV+7N6kjosv#m!d00I`(>dk2OB#LGn;!{0q7!%>R8S`3A)Zc}=ufM}{b^+6*BmOZVmqP*SDTeX;*2d0ulT7T-_E(nSMaPb zk}Qs>kmG0|ES*BW@WKUjR~!ok(l2~$v8zx}@vS$M1mTe4baB3>hbvRD^L<>orNv*2 zobBq$nH5@(wj_U(7%2)kNb(O8MNr z6-1EFyMD(oyYVK@;r<48BX-3ts4`6?aKkf#rEHOFP12mUZ}xY>xjc+J(NHik5aSlP zaSfDE1Ll#1Lj?|SPvQ~IsMX1q(*l51uRMUCRvMPZ zxhzI9i`&V@l`nsVGwz}d;{h7%_4~pJaej}RtjM312sSX}H~cQo`{}%pX7u8Fm~j$9 z%6s#)`}0P47m)%}#gcFpKR^?24$@?Z!gj3?9HaCf`#hJIHQILx+os1yL-5uv)jy7$ zAUbK)pd``K{m+cxYfcBEPecOWmFXo~p`_dLZkECqtKnF9s>lCaV$8;I!aNwa_B#4< zlf#+?4SwNL^~^@4X`4A)FzxdqxxtOk{H9oBjoIkc@k$%ZEmMS!BniWMMbqjB*j73~ z^m89iZBjBeSGXtcYIQ+^b>cV6vIV+G1yGPI=`F$RRd&Q-(Ua#B)YRITAd|YDfj@(f zGw{XG=cw4%KbU;<@4O#3l_!(9F7Gbdp{ISGp=~#1SbQ}T1tv$g4QX?3`&ob&XLPyYN_r0o2;T@-Khvl--v-g%H=1gyr|Im@>77bm7)S0(Dq28061CwVI*YAc*NqXCmd~$9&5_biyyvsgjc{$C; zd*~HJnsJ54?+@1rFjsn|M--KWZ&&jte_%YNI@-3oosH|2j_pC~j_fESbtwbgUhS;j zqn>^3RgGZQMq9>Ebwovu7p7G>j~`)VUi|d<-WexHqv8)5g>%h&UWJiaNDbXdL*HfK z959oU+;P#F9$X+K)}!jg)Y}duaSs(X#b3gT@z8@mg;)oUN=0g zF9NzWRD%xs@RMB6qu!7wqal^JF~3C3a;(K<%CT9c%85?YO?$veXmgQSp|wRJgx2hr@`)%Qkvf%qdMmF za7A>wV|W$3&?Jmm4KDYm3KRmOUN`Rv;Nk!ip(-Pd_Rhm_M~&JIC0DseIu#x}UTvJCsSy7J9HfzedWRS~w# zc)5)bdSgu3?c*vNm{o%#5bHFvHC<+7x_29|Pyuro5v4R@nC^OheI@-RcrF#~irF-U z&CqV3rD9^l-L`#f%em2cN0D&Hq9Wq(4S1qkEXl%?@Vg~4%v1PL38FfUZVw6ba=GbE zh>jVc31z&=M7Aetbuuf+8_ITqp4rV<)2}0$UqDeoT3U0OfL1Rp 
zjp?!ZDF%6?1(sX@uHDE4POXd9dyy(QuTl4joxlNrfhb@tYgFKaGXWwcfXNUF8Uo|NR7UXUWzKtl)jnF^9Icq6m)OrxMq!WRgBZ z{fs=$si8qLFAWW548jaYyoMD~FY0k;HR6OhNC(EqY4MQ z^E=PFE4h<*7le$2OiSV}6LO8Bt$9sdRXCYhJpJB%NvQQIb`_NM_BliX0`}Q{tOFS9 z`NqOtRMgE+UgtBn=T)3H(7AHH7da%BsyUw1v`keeaoR=B3JtS(Ik+LF75> z^K&!ZaNGk)4qFO}dV%Vj#4nR>e`Fc9jH3FDY&&!mn_8?6XhB~6Wed9^dGficj-hSs z-s$*u?BiNVE1%B)xA1yMpV&>RuocNE%-7DH_cORodlMR*HQ!wRH0_fVaU?YWF9VEp zLU+hPcTEx5WR0+?zh?~j5~>W%zDS+5LmV8K)-go>2&{XuOOaQ^qOD;;LcU%7&bUX=#1#LY<>A_mDOWnPDY*}0|Mw0bOt(vtaH`%WpG?mV z2K}AGLWq#5DiRB@o6^uwo?>bAt~cF9U2GC&LKmK*z8v75Y*u2%qDddHqGTL?Lsq3D z?mV)>`u;egnc)|_iGj%qA}q5ym@(PJ0Q|?__);{p9}>5lq@{x?+wF5)E278^7{P9-!nh`td|NB)Cms`yUp#MypDGlK-8*y_uqPcngU^$HXo`B%dXP=uYx<{E?=YuUL%7qSZ+Z0-- z0Q6kIE~awfRr;Fw->05k?#1H*M8}3Njt2Q}T@1y8$>M0D*Cc{g!(NMwSn(peNH~?O z??#bS{QI7wljESp^j&8|UF%F~cMJ?X-~D_{FM{0#w@yc_iV_|RB9=irIa59(4EbG6 z9TCdzg}RxnL~R3qP!5!trz*Y)Q(T|aw>a93+aui6nCzGvyiI;hBTDu-X3jMKnHh@- z8!xZP$|49FE+w;=&(#NbmXZ4Su-3-x7wxsmVe=@tH1`U-+UB5*1TKeF(JHmd6Ptpo z!Y2YG%fhD*QdswuzYesxV4orWa-!jc7PdCcf=aFS<&U&-^F1;sA$ArueN7~SydEuA zvmD$MSo3Hq>~a+KcH{NJ$1UrEiZswR)47VI(JT7f!DWE9v%!gob+V%nRk=Br4P0D~OhfRA-2a(5oeHp;c2+szXZ)9;e!kvS9l0GfF@nDahqU&e?b?V|-?Axa z_1R@ywmbXiAcAga`4b5`m2152n;P+;FnD>3OsB;$E$1&p(Qvd|Tg8@aF&w*lWe(@oQpXeHM3<>#N# z5<=T>*$7^%O!_hnW`}>O^lX@k%i>@Re=}Q=q>6@a$AIWjK|~bD6@(}JYF=ge3vz*I zo%ytt@fra&6cTiXzP*cS%c>%78C><&JM{X>vv!>S&BdxV}!n8%Ql}gLgLh z+#E;V-CDN-NUqIr4wR*EN!qX94pXIzH^)UvF`G$V`_W-%7M<{^It>jCY12@>b5$P` z%z5jhn%T1EhjX9J(W0Liwj%|kn5-j34IdkUFY zDJYSZ)zOmIi1-+FgUr~12&>IQhxgO&qHX4`JG{$;2eP}e$BSJo-%L^SdF~>T)+0ag zLH43$(&3U*!Kpzda@8{FiEqJ(jxz=m!IC*%CE8cM+=9ad2^Cjcf!OxE8jm<+aQRZ`<_ZB;|Ca!Mn5{HmLnK&XCy>3ni%}7^Sh3VEp_#SdcN;u!9Fwrn;3T`Uf@fMYlA~r;7RM( z(_d-B9#K|dd~gp$te@Q7)LSffcWF+czAn_C(uY$#+JyaJyYp{l8xa>z0)VexzR?un zh1+X3znODianotJ+s&GZbSAw;{^y*uA}cfUtt+t(Ba-4j#McuB@Zyu0Umd+>|AJ`! z8HBtKfHFV&PpJr&J#(4rd+!%JBf@v(d)#ckw;SQeFZ$@ycoL6}!sq>OUdvAjE=h?K zK;arY1dMw{m3vW~6i@T{Wv>B`K6rYahC42=Y*5s*ea{C}+E`DXhMAOgI3I6157d)fC6%U&wegbiyGkPh?NpiL?g+SW@lL8O-pWkP4%=;!^O!|E?r;WZ zG#bNhN%S3qCY@Jsyp{cLz5f)1CodqacOY!r?JcMy(l?ts3B+I-)7;q(g8hncI zn1TKdXkXK^bMz+CsTd;g+9C5oWTUJu>5A}OF`H|?8@Ml+!*XBE&jE}MM=4WH7*D@) z^6eAyz6xhm_5{4X_s2^6s~?OWNFI}^>1gZSVlMz6e86qIhg!8#)>5_D=h#{SV8eAT zqhd`Nk$~q6EZ!2ev^ep5==YXxx812Abm||&xc=+qU5R&1YdM;8x2{vrX;F zxJ&bLD~AX}e?!*89T+FBLiB8)_6EKte=CX5C!`R$ofjzbT9WF!GKDPHDOBH@sV|Lz z4vf2qaJr_^sZ7@M(%SPZ3hc58(NV*o+7$Lpr)Nq{qJq!0e)XTXVa4(u2j@AGDOWvH zR8;hKJ$TS;Q!`g2y1?f72YdA(>ipuS7)5KzdbBT>QO-d&_n(%YzQ^BXnkh}n(6(OAbjro8^TBu3Q5unTgEozPkyZAad>yr>sb zN&0dZ@bB|R)mOE}z7jQ&ZDfRqBhC+EMsVDr(KYAV4XjV69Czowq;Z?pP#ls0_KFV? zzlynv5nNU83AXG)D4vUEV9AT$sRwq-u7M;YNo%EQc^ndV18TlqX}3Obl6bQk0L-2f z3(C8X$V^s|`)zbt?|0Obs2a+M$Y-DPiB<6`zy>nD)%&;A>xirRRhu$2B3@&9~p=AGacf8`oc+4 zkqkuhjzI}NGW05yx#MP7h4~KQH0 zzD{Oi8$ITy@pS)<1*~`&<@Zb;}g1ft0aEIV7!QCOa z6WkqwySuvucXxNUFr7Cuzq1xAUAMaH)Y*G~#|(XV$0I<1p%CUw14Kd;kxjn^uK;eC zLH{Z%N6siXM2ywd^{`?&D&52If?)lH#{uXrXa0OY)hio-LS!}KC1CbrhzNJh*JJgn zw`gZz%!Tx`trQ$k6Zr2U%I&Iw4O5}Vuo~jVlhfefGUXgWmk>6l8d$5LKAtRs%q6;P zE&Js`9X&)Q3&DKSGa9c%MqB893v?^N?UK+ypw1$&4BknQ&Y&fYcc90*&DBAX2HCzP z{hcW{;IL&@8XVh=$Jp|AepOTu0hbyr_WO+pGM|cf&oNAO8*pqH1Lv_)VaOOmmv$-= zNRQRT8raqITc=syrhteEOTxj(WmVVT)|YkLS|KzZ=7wWCc!(n``aRz5RcHZqUH1Y! 
zhhbtkFMHTt(YO+S7VjP)igp3vw4V0w5&!Mv6;?t~Eu@LdjkY{tXccwa1n<#`Y3896 zfwC|J_|QOzoG#D3*w}yskO)%bJy*Z@Wn3}^q}!G~8Y_K5($8iiiNx=GEb7FfvZK^GGGgPad8WbE7#OsRaf6#$mD*N>_rm(tb%x^~MHv5f}0qJ)0MYSJs`62xwTr9Ut*KWKHTcGODZgD6N^oJK#g4Oc}0An&*tLBR{Yc2%YT)@z7MR&>f#B+q;Y!^who&Z z1J<$I+kF9iZa@oPXP$s?G0j>-YSvQd!B(%UC!+-u51O+BI0~C9)~#JO>#bDQtTwX& zKE1FMlpi`WNFd4w4cipP9I;GT=aPHLEogIusL(I5E1o!N z&WE*8!MCA|Rf^@Ih7_W+9ji0Zw<4;dmSd@w?u;!^ly$=RMxI6Xuji43-z2=;UP zmFv^rtpMamc*4QNs>gs>NPGrFEzm}q`@kJzW4o8g^>`>b`d-j8&FpshW}9b&%E9Bm z#MWi|t=v~c4Q>Sb3-}Klp`Q2}z=G#O$9lV}*1Eu%J$?uwhw)E%>7pQ#N>B z_lgol7;V5F(=-FeCDXPYjLRPWw&Q+7zvi|ZCADg*v}NDD_X;>xODpY>zp$j8yk2%y z@iKwTKnkmIT*dj(IbRQ#jKTCa;I9VGKvS=5<~#U8rmQ;T&~^jbJ%M68;^J-+S2pHD zQ{=5tuU25m{|+c}UfU?v7OR&z=B?^;-;?r_S({`>|)VjLR8RtVFmlS1%~FAQ3)nEn7h^Sh2fRh>ZYaa@ESVbj9t zKE@RpYLq!~#q%H7w5?CY)e2m5`PUDCN(;MpOn9C`hzW;=ozAEz=WB@EEJZR5*B!7)iX@;-1-Vi_xcPS0^@S)^T2~p9&~_neMhelr z`_HTPkCJI2Tq8KWaxnQVRH{GUYVXCQf4vs2aBOLjlNi%e*n4dG;jR#HdFK=0HA96A z!cz)C`1b114pZS14V3w=wXmM);Aw-|czya*)9;!j!k9^ps%bS@jkTLWkR_~F@|gW(_)%qibb3GWzl~zK2W>+`vAV4b2#$e~ z6vl9Nqa3DI6E_L5luue!0x5H`8)&g@8MW6m zvd0CFLDJobwav_1vqO_vB()u(3PDQi0l~${b-g2|KSL^@(j?<2vQBNOmH^BC#MezO ztl9=L2(;EEugAeR1+6ywuS&i|U~KKSjjJwN?uI*}8i%D8GaVAh7<3~md2(BGSi0uS ziMe}rzU{X_hhU<+(^MIWLqTY} ziQ^wDx%!ST^wZYnouF%5vA|&tf5tg1?5n5~b|Vs*_%K^xiZcSO_xk@sH~iCU$6b;I z>sP@N6KiF4)(7>m?XbnO;%u~R>1^AdO<7oQ4`DnnT8hr9cYocyB~XpxZTkS#I8PZ7 z;wW;^JQyS~6WcfqJ^O==T{O!b*e1ez%r|@rx8A6(3L9Lr+TW9uu*Rc}OdIw>yMl75 zxFEeoc}X^{h_Pq^T<->qG|4FS%Dpr?K~s-StD#7!*^*9Or1Ihzx@!lfEB@~H&rvgz zn|LO{0eZfZo_~dKa)04-53e|%{$>&hWhv&Y)8R=C!cy$-S-A1#5Fp^)8$S082-PO2 zNxyR;gf#}=tRcCvrPvH`__^IpmtG&YEaw%EQok%QQy*VwMv`rHA(^)M`?zIu0@_+u z8O#pov8(9P>)0cXi5F||!cT#>*R3?fyCV8>#LHYQOsg6(R|on>JsYTCS`|feqYguM zK0V4h%6vX?5svH|HCmn1Y|l4!^dsXFvARlQjZVv=(_KYfXVlf;bmPwfb1kcyq`uY2 zu!nS$M0=KNbGac_rRJp~?R$EBknArM`oz2S>bLuf8*+Y!lMD$|+g$jY(2n=-kfd(; zYk5cx+Q+LWL;w8UyQ6exT!4#@tabTHQ${tcl=@gli{>0b=Ci@Bfdf8df5;=C0h*Wy z2^lK~WTT2WrR$ksPj)!Hx1u>%^;&qy13o@*Km)4Fv@#2a)MTJ%lx%Y<_+o8yV9rhy zA13locc^cm={u_y^X`Z&DUPWf6eq?_i-Nw{7H({>Rw!(;Eb>8?k0tm6+i2>$h^D{& zNTl`)CW9nxJ8+hVS&C?O$sVx|;(jebLmEc2UlQ(oNK8vpzY(*t;(0mAc=e0BNv~td zVAn}VwZ2)k?*jBtG^XHOm!tgViYAVdPDT9|_#%tqkVaqAiL@x0h@PrKPW}2#D1W=b zDB=x8Hwi5Xx^zgW_8IzY}4>mzo_UsS{#>T0taT zO)^f0tb$YmHG>Y-?(b`@buvqJBdwc;!Sugl-vy^GE-h$R`8>L56d0n@=@$+ z^fY@osc)V)386ay&-f;1XV>UDLNrXCgF^>m0~B;28Y_;ynZ&?98-dMCdo2hg4o*3x zS+nl)!(5>dLOCY@`cZA~aI6Rl>IO_q`Sw;-3HQz46R=r6I6K$M0&;S5Z+tr0>hvez z7`D81EuLg1 zC?tipKfjGe)0pL`1=>!loty)2f@*nd5wL&|il;l}Mv<@7!D^}1Bjx(>y3S;%BM?LIoL}(6ACLP;C?-V6CiLuyHD-Ncp!{O3iF9#=@cx*tM!%` z$|2^fi+*j7$*35phMhr#*gO~AV?){2o4<0|q8LfKGp_A0kN0wwEu$OBp}RieP?Pa! 
zM}H_OPryGpLFFQXpNRw8#vvoC&C3y74(^CLWTxxY9{wT^6Q|a)LX85j^fsJX^>CGa z>WXk&w!~AI^xFM2mCQTr`3S6Jegm?DwJbFIQ-jc~6B*04=lsH`Phz4J#Qd~b;C$MR ziN&ue-Wu1B;^J?*&Ttwpe6}l1$DXa1W1{Jn#^878t^@tm>Se6W0ycCQR!q85u5Bd_g-1UdWd~PQ*?7 z@&iblr!s&fpjL^jaczUJR+`lLTN{n>RsXC-7X{Fy<5cm8h$5jY9(w<%`wAv-wQq|) z(Kf`y!P+K4*xKUAD~fM7>slkWSpReO)#^HRh7I5E>XlQF^o#Y-%8KcNf$b%toEV9B_0<94V*f6)N}4q)Aq7ECSIIM=v~^Mvg_l=i)~@ePFY z6=)1f%WXmo4%KZR(+fNW}85NW~U-E|6`kMCm1C_NmK zsIT?K!`*ohA@y-uVn$)Ir%Rp%641JXCHl}zOiH^8+Lp2VWLUYZc8BJRHT{y6)7WP| zlkRLBKe-J4v zQ>_-6+J=6X`XzAD97SC(7@}zjgk_c#lBh?2vox!}LJ6p8Qe*Jzeun-EZp)O3wI+dN zAb*^r@oU-MnVt9dWUzEMPZ7X;3tFk-II<^M<8-2-Cb=kbXTg86TgcnYU3XbB-RY8C zr*o)^`-{9tl5wm$r=XjaaCk^lv0LU3zCZViytl*&HDWv%VFhmDC?E49OorJ2uS;2- z!ia{NxzOu(;Z`%6+!bB4*c906O|q_sq-xr*3&I=8GCw_S!jJP6w&w@&L~(XrgTwnJ zPlH&g1Bt>AYefl?mv`NAVyAN>l;b4WkSnL|3u(VGD`)}kwjTLw8)ITe(Qy513FabgZ%3Jd%R2r)Oj6dmCx!e=C-{#~D1$yq~QfjHV665O8ED z%wxaS+`STb2lK5%!0gv}2AA@_z$XNsPevPu+~SDcuJ($t_C32e)5cm)nElkM#F2)>!%EB%8hWn5YaEzKEX; zi8R`OJJx;;dZLnTBX=qTw&Nknpm=>S%DN0HjO#>aerTe0$ZmZq-BDNUkNn+NZEmo;h5y^hZ;wj4T zD81aaFFc&~2Ncr%{bao)d4cyi2GXVAAw->-r>QPAB@#yxH9~Pm7{TXHN1uN@^?n_$ zRMFeD%V3KLo~*?1yac3}?fH!n{l?xBZ8>g@xYO@Hl671(9!r#qE;;}+vO!{EB9EH7 zy6t?WPTToHm7ZSC*@?S*>33TIJaM89WWt+u+FbNpPUe36Sgh316W9qQ)y-x!=uK<1 zNk&rIf8sF>@sT5oz!x>fIz)G(;rn7@Vp4O<>n6c^w%2Ou-EpR;NUU>mnvX%tn(=19 z5;-6=+N}3IQ_kPU%2^(EV(s!BYMe@b5oVmX*=iSXu3v+?wOgP;qely*eC3{Ao2lw(DSI zq@kf<9ZkTO1&D^T-ewv0gVO&%Hj8$2zwk6~QmegL7wwbxic~}|P-F9*OyHrBW<}=q zP82cG5w@6J1l?O4aNvzOiDTA}^0!_f@}=Dd^7s99a}Mx$zq+pcDB4FOm)Y8#CfCG%YSr`ioJ5Po6$^Z^vC!(v zG2no=(2zwcV#n;37V{-HH3F~LArmn?sSYZw$A|GP$_2v(w_|5EKh`k*7kGZ=BvjX} z^lFvG^>qO{=cZ7d7P_qL)aUjKes1F)N%TD7fN;Bhc5Z|T>tuXrg8Zq^MSG{WIzc2)Np^pQtV z2=q44bz`g4s5^*4Ar+5IW4Hd&sK3h&WYpu@+uQT?-1if4si~+5gPfOS7X)XDx1x6-m`0g2$=4EJ+z;RVml z4OfS{sK0qma=)JScPBpvPc~RC43|}PkV^~v<&68`>l8nkkSHlD8ODvS)#6|hAJ!8r z6JVbAj){e(j_-XpZAP!vc$Ny_Yv}H-tbu^!vvQ%#ajoG%csgJ)@Ng3m5s?@wOlMiR zy}dn!hl5*qIA4AIL$A}S&FB4cXX~)w-v28v5BgEL16_odUB5xB=ae}9m>5mo2R>Zx zWqxirbaZsMKHC=$% zF0yRod9@;aT&qU-J`>)WBNV58%irIhtuGkS6M*HOU;(&Zxbeqoz&wDpu(04`cCpqX zU#?MS3S`_w!V<%*1m~tVXd2gDR*$u;8|I%1Ue;3WNPsW@9!!r2$ zs}R1lJXh$~_b%OK_dd-OCVzf_*`}Vzs8xj6f6->QEu6}1B4szACc*-v_E7|QUyi}F zlvxs;(X*_l<^-{Vkh?rMb=T7c>jwY=&+lgEv(?JK6H6!4C(0+$DW_3uQlwpPOW+an0p2eg{!mW6Z&SAi>9(U1 z@R-H7(3f78rXS9<`bUg}z%*`#!#OsfCSbWKK( zt(e$AH30Ly?_slCxU(=f4-KGNu1>mMetUac0lKIROF#fCI*rrbundU&`tPJVitgF2 z+~vqihQ&`dI+s@^la7r+oiElOhn>u~KY!!`+`qQ7+K4&8+(;VDn|D(OsFgL;oPk!V zjtAo-ntl41{6Mpe20>Uwc5~#CUkiTbUx$Lji}YO}&Lk?z!AUI0K_YS{g%o&SVkh}O zw5dnO=C$o6NxNOowt>Tp;Ys|szXfaMcZnB(fKv4ppKSx~z>AqYVbs+hR11z+plS+L7uZ%&Mo+f4Y|J|{H+jj5zA>oseqsfH-4t0LG@0GS6jnb-yz47 zbt={h*po)5CAO1jZw7H~iUMa@p3D|aK4Iy)`5iT5 z+S&t-M>PK@QVNiW^&yyE7hcy!6M|+rUK(DVPlprJobGq*zF?$8zF;KP)#m+dAL>c| zTbbG8tLAvQxQfOl;e|>)hG^b5J7r2T@=t3Kp!1<@H(`FAPTo_$ZnmRbB>dkq6fB8U z_P9i*0jA`Lb9XxN3umx=61QxbJcVAE=$NJJvq(H=G1yNt9HtCNI~y-s5bi3X2bJ1X z&cU}E39ed?7#JAdEN)j@Wxyt9x&iDFe7U)~#6Cdoo!tBA;PB9>?e9Dp0D5zyhz5oR z{O^o_S-?iMv@;;I>(6lAu!(_k!i!Rq$EqtBC1ZAB+WIujPifQKRT2W|dE7z#hZ&A( zpls{)f4Rjo7Z8g#8l->Y=^rh)DuB_DVFmD^Z~ly@-mH{Wnd-2j2zBZxkz6)fVx=X| ztB@-m7W`8^RzS9StDRbk#9Cws`s4Y4x9+>g<3%Rr6E(=jJVwiGGRd+~F7YJCK}?ZC z2GFk(!8Ce+bK=;n8&5zRflfMwqlpKWKc0150K0`=e z!?__+vd#^)P-tqY)}-to$4K~Tezsa*4DT2Jbu)O>O23(^CR!05SYR>_>qam;dW@$q z)jDSLhXq`=JD=_qdtN5aCnko?{&AtVEuB2v-=BXxUv=F8&fo_RK;TFt>)Rb5`MQpo$5;lLNMdgo8Jw9|uxZ&;9Zce;vczWk4f zlW?t1av|Bl9zYa9=|e|}3PWJ$dw0Idq3B!w!z!VUTs|pT;7!4Tbo`kT$obqS3+n3X z&UgDkOlGr~{duRO$}ip;!`#&@s;{rglvs3y=W)BGSF@oxHq`FkQYz-9azQ~& z2Z0qR1xOx@Ln{uhUFwoMT()v^71uiC$81pQpB5lKT?ly8-(7ABC|-Hro>uMmzh1;U 
[GIT binary patch data for new image file `docs/overrides/assets/images/stream_tweak.png` (44390 bytes) omitted]

diff --git a/docs/switch_from_cv.md b/docs/switch_from_cv.md
index 3bda0e224..e01aa7b2c 100644
--- a/docs/switch_from_cv.md
+++ b/docs/switch_from_cv.md
@@ -43,7 +43,7 @@ Switching OpenCV with VidGear APIs is fairly painless process, and will just req
 VidGear employs OpenCV at its backend and enhances its existing capabilities even further by introducing many new state-of-the-art functionalities such as:
 
 - [x] Accelerated [Multi-Threaded](../bonus/TQM/#what-does-threaded-queue-mode-exactly-do) Performance.
-- [x] Out-of-the-box support for OpenCV API.
+- [x] Out-of-the-box support for OpenCV APIs.
 - [x] Real-time [Stabilization](../gears/stabilizer/overview/) ready.
 - [x] Lossless hardware enabled video [encoding](../gears/writegear/compression/usage/#using-compression-mode-with-hardware-encoders) and [transcoding](../gears/streamgear/rtfm/usage/#usage-with-hardware-video-encoder).
 - [x] Inherited multi-backend support for various video sources and devices.
diff --git a/vidgear/gears/asyncio/netgear_async.py b/vidgear/gears/asyncio/netgear_async.py
index 3547d3d44..e6abce228 100755
--- a/vidgear/gears/asyncio/netgear_async.py
+++ b/vidgear/gears/asyncio/netgear_async.py
@@ -54,7 +54,7 @@ class NetGear_Async:
    system. NetGear_Async provides complete server-client handling and options to use variable
    protocols/patterns similar to NetGear API. Furthermore, NetGear_Async allows us to define
-    our custom Server as source to manipulate frames easily before sending them across the network.
+    our custom Server as a source to transform frames easily before sending them across the network.
 
    NetGear_Async now supports additional [**bidirectional data transmission**](../advanced/bidirectional_mode) between
    receiver(client) and sender(server) while transferring frames. Users can easily build complex applications such as
    [Real-Time Video Chat](../advanced/bidirectional_mode/#using-bidirectional-mode-for-video-frames-transfer) in just a few lines of code.
diff --git a/vidgear/gears/asyncio/webgear_rtc.py b/vidgear/gears/asyncio/webgear_rtc.py
index c8b6efc72..bdb59a688 100644
--- a/vidgear/gears/asyncio/webgear_rtc.py
+++ b/vidgear/gears/asyncio/webgear_rtc.py
@@ -287,7 +287,7 @@ class WebGear_RTC:
    WebGear_RTC can handle multiple consumers seamlessly and provides native support for ICE
    (Interactive Connectivity Establishment) protocol, STUN (Session Traversal Utilities for NAT), and TURN
    (Traversal Using Relays around NAT) servers that help us to easily establish direct media connection with the remote
    peers for uninterrupted data flow. It also allows us to define our custom Server
-    as a source to manipulate frames easily before sending them across the network(see this doc example).
+    as a source to transform frames easily before sending them across the network (see this doc example).
 
    WebGear_RTC API works in conjunction with Starlette ASGI application and can also flexibly interact with Starlette's
    ecosystem of shared middleware, mountable applications, Response classes, Routing tables, Static Files,
    Templating engine(with Jinja2), etc.

From a82d5af0848978b1f84fc128551c78648867d6b5 Mon Sep 17 00:00:00 2001
From: abhiTronix
Date: Mon, 23 Aug 2021 22:51:23 +0530
Subject: =?UTF-8?q?=F0=9F=9A=B8=20Docs:=20Added=20`pip`=20up?=
 =?UTF-8?q?date=20instructions.?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/installation/pip_install.md    | 29 +++++++++++++++++++++++++++++
 docs/installation/source_install.md | 29 +++++++++++++++++++++++++++++
 2 files changed, 58 insertions(+)

diff --git a/docs/installation/pip_install.md b/docs/installation/pip_install.md
index 8d6af3436..1b9b66663 100644
--- a/docs/installation/pip_install.md
+++ b/docs/installation/pip_install.md
@@ -29,6 +29,32 @@ limitations under the License.
 
 When installing VidGear with [pip](https://pip.pypa.io/en/stable/installing/), you need to check manually if the following dependencies are installed:
 
+??? alert "Latest `pip` Recommended"
+
+    It is advised to install the latest `pip` version before installing vidgear to avoid any undesired errors. Python comes with an [`ensurepip`](https://docs.python.org/3/library/ensurepip.html#module-ensurepip) module[^1], which can easily install `pip` in any Python environment.
+
+    === "Linux"
+
+        ```sh
+        python -m ensurepip --upgrade
+
+        ```
+
+    === "MacOS"
+
+        ```sh
+        python -m ensurepip --upgrade
+
+        ```
+
+    === "Windows"
+
+        ```sh
+        py -m ensurepip --upgrade
+
+        ```
+
+
 ### Core Prerequisites
 
 * #### OpenCV
@@ -50,6 +76,7 @@ When installing VidGear with [pip](https://pip.pypa.io/en/stable/installing/), y
     pip install opencv-python
     ```
 
+
 ### API Specific Prerequisites
 
 * #### FFmpeg
@@ -162,3 +189,5 @@ pip install vidgear-0.2.2-py3-none-any.whl[asyncio]
 ```
 
 &nbsp;
+
+[^1]: The `ensurepip` module was added to the Python standard library in Python 3.4.
\ No newline at end of file
diff --git a/docs/installation/source_install.md b/docs/installation/source_install.md
index f71a97e09..d8b299be4 100644
--- a/docs/installation/source_install.md
+++ b/docs/installation/source_install.md
@@ -31,6 +31,32 @@ When installing VidGear from source, FFmpeg and Aiortc are the only two API spec
 !!! question "What about the rest of the dependencies?"
     Any other python dependencies _(Core/API specific)_ will be automatically installed based on your OS specifications.
 
+
+??? alert "Latest `pip` Recommended"
+
+    It is advised to install the latest `pip` version before installing vidgear to avoid any undesired errors. Python comes with an [`ensurepip`](https://docs.python.org/3/library/ensurepip.html#module-ensurepip) module[^1], which can easily install `pip` in any Python environment.
+
+    === "Linux"
+
+        ```sh
+        python -m ensurepip --upgrade
+
+        ```
+
+    === "MacOS"
+
+        ```sh
+        python -m ensurepip --upgrade
+
+        ```
+
+    === "Windows"
+
+        ```sh
+        py -m ensurepip --upgrade
+
+        ```
 
 ### API Specific Prerequisites
 
@@ -123,3 +149,6 @@ pip install git+git://github.com/abhiTronix/vidgear@testing#egg=vidgear[asyncio]
 ```
 
 &nbsp;
+
+
+[^1]: The `ensurepip` module was added to the Python standard library in Python 3.4.

From 3ab426f015656082a3150291d8b129565d4de187 Mon Sep 17 00:00:00 2001
From: abhiTronix
Date: Tue, 31 Aug 2021 10:24:02 +0530
Subject: =?UTF-8?q?=E2=9A=A1=EF=B8=8F=20VidGear=20Core:=20Vi?=
 =?UTF-8?q?rtually=20isolated=20API=20specific=20dependencies=20(Fixes=20#?=
 =?UTF-8?q?242)?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

- ⚡️ New behavior to virtually isolate optional API specific dependencies by silencing `ImportError` on all VidGear API imports.
- 🎨 Implemented algorithm to cache all imports on startup but silence any `ImportError` on a missing optional dependency.
- ⚠️ Now `ImportError` will be raised only when a certain API specific dependency is missing during the given API's initialization.
- ✨ New `import_dependency_safe` to import the specified dependency safely with the `importlib` module.
- ⚡️ Replaced all API imports with `import_dependency_safe`.
- ⚡️ Added support for relative imports in `import_dependency_safe`.
- ✨ Implemented `error` parameter: by default (`error = raise`) a meaningful `ImportError` is raised if a dependency is missing, with `error = log` only a warning is logged, and with `error = silent` the failure is ignored quietly. If a dependency is present but older than specified, an error is still raised.
- ✨ Implemented behavior that if a dependency is present but older than the specified `min_version`, an error is always raised.
- ✨ Implemented `custom_message` to display a custom message on error instead of the default one.
- 🔥 Removed redundant `logger_handler`, `mkdir_safe`, `retrieve_best_interpolation`, `capPropId` helper functions from the asyncio package.
- 🎨 Relatively imported helper functions from the non-asyncio package.
- ⚡️ Implemented separate `import_core_dependency` function to import and check for a specified core dependency. `ImportError` will be raised immediately if a core dependency is not found.
- Setup.py:
  - 🔥 Removed version check on certain dependencies.
  - ⏪️ Re-added `aiortc` to auto-install latest version.
- Docs:
  - 📝 Updated URL and context for the CamGear example.
  - ✏️ Fixed typos in usage examples.
  - 💡 Updated code comments.
  - 📝 Other minor updates.
- CI: 👷 Imported correct `logger_handler` for asyncio tests.
- Codecov: ☂️ Added `__init__.py` to ignore.
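As a rough sketch of the lazy-import pattern these changes introduce (the `SomeGear` class below is purely hypothetical; only `import_dependency_safe` and its parameters come from this patch), an API now caches its optional dependency at import time and raises only when it is actually initialized:

```python
# minimal illustration of the new safe-import pattern (hypothetical SomeGear API)
from vidgear.gears.helper import import_dependency_safe

# cache optional dependency at import time; returns None instead of raising
simplejpeg = import_dependency_safe("simplejpeg", error="silent", min_version="1.6.1")


class SomeGear:
    def __init__(self):
        # raise a meaningful ImportError only when this API is actually initialized;
        # passing an empty name is a no-op, so nothing happens if the import succeeded
        import_dependency_safe(
            "simplejpeg" if simplejpeg is None else "", min_version="1.6.1"
        )
        # safe to use `simplejpeg` beyond this point
```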
--- codecov.yml | 1 + docs/gears/camgear/usage.md | 6 +- .../netgear/advanced/bidirectional_mode.md | 6 +- docs/gears/netgear/advanced/compression.md | 4 +- docs/gears/netgear/advanced/multi_client.md | 12 +- docs/gears/netgear/advanced/multi_server.md | 15 +- docs/gears/netgear/advanced/secure_mode.md | 6 +- docs/gears/netgear/advanced/ssh_tunnel.md | 6 +- docs/gears/screengear/usage.md | 4 +- docs/gears/webgear/advanced.md | 8 +- docs/gears/webgear_rtc/advanced.md | 2 +- setup.py | 12 +- vidgear/gears/__init__.py | 164 +++++++++++++++++- vidgear/gears/asyncio/__init__.py | 1 + vidgear/gears/asyncio/__main__.py | 2 +- vidgear/gears/asyncio/helper.py | 109 +----------- vidgear/gears/asyncio/netgear_async.py | 34 ++-- vidgear/gears/asyncio/webgear.py | 33 ++-- vidgear/gears/asyncio/webgear_rtc.py | 54 +++--- vidgear/gears/camgear.py | 14 +- vidgear/gears/helper.py | 150 ++++++++++++++-- vidgear/gears/netgear.py | 56 +++--- vidgear/gears/pigear.py | 33 ++-- vidgear/gears/screengear.py | 20 ++- vidgear/gears/stabilizer.py | 4 +- vidgear/gears/streamgear.py | 4 +- vidgear/gears/videogear.py | 3 + vidgear/gears/writegear.py | 10 +- .../asyncio_tests/test_helper.py | 2 +- .../asyncio_tests/test_netgear_async.py | 2 +- .../asyncio_tests/test_webgear.py | 2 +- .../asyncio_tests/test_webgear_rtc.py | 2 +- 32 files changed, 494 insertions(+), 287 deletions(-) diff --git a/codecov.yml b/codecov.yml index 544e750c8..7cef6987d 100644 --- a/codecov.yml +++ b/codecov.yml @@ -30,5 +30,6 @@ ignore: - "vidgear/tests" - "docs" - "scripts" + - "vidgear/gears/__init__.py" #trivial - "vidgear/gears/asyncio/__main__.py" #trivial - "setup.py" \ No newline at end of file diff --git a/docs/gears/camgear/usage.md b/docs/gears/camgear/usage.md index d2b377c13..0583fe944 100644 --- a/docs/gears/camgear/usage.md +++ b/docs/gears/camgear/usage.md @@ -66,7 +66,7 @@ stream.stop() ## Using Camgear with Streaming Websites -CamGear API provides direct support for piping video streams from various popular streaming services like [Twitch](https://www.twitch.tv/), [Livestream](https://livestream.com/), [Dailymotion](https://www.dailymotion.com/live), and [many more ➶](https://streamlink.github.io/plugin_matrix.html#plugins). All you have to do is to provide the desired Video's URL to its `source` parameter, and enable the [`stream_mode`](../params/#stream_mode) parameter. The complete usage example is as follows: +CamGear API provides direct support for piping video streams from various popular streaming services like [Twitch](https://www.twitch.tv/), [Vimeo](https://vimeo.com/), [Dailymotion](https://www.dailymotion.com), and [many more ➶](https://streamlink.github.io/plugin_matrix.html#plugins). All you have to do is to provide the desired Video's URL to its `source` parameter, and enable the [`stream_mode`](../params/#stream_mode) parameter. The complete usage example is as follows: !!! bug "Bug in OpenCV's FFmpeg" To workaround a [**FFmpeg bug**](https://github.com/abhiTronix/vidgear/issues/133#issuecomment-638263225) that causes video to freeze frequently, You must always use [GStreamer backend](../params/#backend) for Livestreams _(such as Twitch URLs)_. 
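For instance, a minimal sketch of piping a livestream through the GStreamer backend could look like the following (assuming an OpenCV build compiled with GStreamer support; the Twitch channel URL below is only a placeholder):

```python
# hypothetical livestream example: CamGear with Stream Mode and the GStreamer backend
from vidgear.gears import CamGear
import cv2

stream = CamGear(
    source="https://www.twitch.tv/your_channel",  # placeholder livestream URL
    stream_mode=True,  # enable Stream Mode for streaming websites
    backend=cv2.CAP_GSTREAMER,  # force GStreamer backend as advised above
    logging=True,
).start()

while True:
    frame = stream.read()
    if frame is None:
        break
    cv2.imshow("Output", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cv2.destroyAllWindows()
stream.stop()
```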
@@ -90,10 +90,10 @@ import cv2 options = {"STREAM_RESOLUTION": "720p"} # Add any desire Video URL as input source -# for e.g https://www.dailymotion.com/video/x7xsoud +# for e.g https://vimeo.com/151666798 # and enable Stream Mode (`stream_mode = True`) stream = CamGear( - source="https://www.dailymotion.com/video/x7xsoud", + source="https://vimeo.com/151666798", stream_mode=True, logging=True, **options diff --git a/docs/gears/netgear/advanced/bidirectional_mode.md b/docs/gears/netgear/advanced/bidirectional_mode.md index 6ea1df99a..3f8ac47ac 100644 --- a/docs/gears/netgear/advanced/bidirectional_mode.md +++ b/docs/gears/netgear/advanced/bidirectional_mode.md @@ -36,7 +36,7 @@ This mode can be easily activated in NetGear through `bidirectional_mode` attrib   -!!! danger "Important" +!!! danger "Important Information regarding Bidirectional Mode" * In Bidirectional Mode, `zmq.PAIR`(ZMQ Pair) & `zmq.REQ/zmq.REP`(ZMQ Request/Reply) are **ONLY** Supported messaging patterns. Accessing this mode with any other messaging pattern, will result in `ValueError`. @@ -69,7 +69,7 @@ This mode can be easily activated in NetGear through `bidirectional_mode` attrib   -## Method Parameters +## Exclusive Parameters To send data bidirectionally, NetGear API provides two exclusive parameters for its methods: @@ -364,7 +364,7 @@ server.close() In this example we are going to implement a bare-minimum example, where we will be sending video-frames _(3-Dimensional numpy arrays)_ of the same Video bidirectionally at the same time, for testing the real-time performance and synchronization between the Server and the Client using this(Bidirectional) Mode. -!!! tip "This feature is great for building applications like Real-Time Video Chat." +!!! tip "This example is useful for building applications like Real-Time Video Chat." !!! info "We're also using [`reducer()`](../../../../../bonus/reference/helper/#vidgear.gears.helper.reducer--reducer) method for reducing frame-size on-the-go for additional performance." diff --git a/docs/gears/netgear/advanced/compression.md b/docs/gears/netgear/advanced/compression.md index d09bf2070..21e6c00c8 100644 --- a/docs/gears/netgear/advanced/compression.md +++ b/docs/gears/netgear/advanced/compression.md @@ -49,9 +49,9 @@ Frame Compression is enabled by default in NetGear, and can be easily controlled   -## Supported Attributes +## Exclusive Attributes -For implementing Frame Compression, NetGear API currently provide following attribute for its [`options`](../../params/#options) dictionary parameter to leverage performance with Frame Compression: +For implementing Frame Compression, NetGear API currently provide following exclusive attribute for its [`options`](../../params/#options) dictionary parameter to leverage performance with Frame Compression: * `jpeg_compression`: _(bool/str)_ This internal attribute is used to activate/deactivate JPEG Frame Compression as well as to specify incoming frames colorspace with compression. Its usage is as follows: diff --git a/docs/gears/netgear/advanced/multi_client.md b/docs/gears/netgear/advanced/multi_client.md index b8ab39be1..9c7e6016f 100644 --- a/docs/gears/netgear/advanced/multi_client.md +++ b/docs/gears/netgear/advanced/multi_client.md @@ -37,7 +37,7 @@ The supported patterns for this mode are Publish/Subscribe (`zmq.PUB/zmq.SUB`) a   -!!! danger "Multi-Clients Mode Requirements" +!!! 
danger "Important Information regarding Multi-Clients Mode" * A unique PORT address **MUST** be assigned to each Client on the network using its [`port`](../../params/#port) parameter. @@ -45,6 +45,8 @@ The supported patterns for this mode are Publish/Subscribe (`zmq.PUB/zmq.SUB`) a * Patterns `1` _(i.e. Request/Reply `zmq.REQ/zmq.REP`)_ and `2` _(i.e. Publish/Subscribe `zmq.PUB/zmq.SUB`)_ are the only supported pattern values for this Mode. Therefore, calling any other pattern value with is mode will result in `ValueError`. + * Multi-Clients and Multi-Servers exclusive modes **CANNOT** be enabled simultaneously, Otherwise NetGear API will throw `ValueError`. + * The [`address`](../../params/#address) parameter value of each Client **MUST** exactly match the Server.   @@ -71,12 +73,10 @@ The supported patterns for this mode are Publish/Subscribe (`zmq.PUB/zmq.SUB`) a ## Usage Examples -!!! alert "Important Information" +!!! alert "Important" * ==Frame/Data transmission will **NOT START** until all given Client(s) are connected to the Server.== - * Multi-Clients and Multi-Servers exclusive modes **CANNOT** be enabled simultaneously, Otherwise NetGear API will throw `ValueError`. - * For sake of simplicity, in these examples we will use only two unique Clients, but the number of these Clients can be extended to **SEVERAL** numbers depending upon your Network bandwidth and System Capabilities. @@ -85,7 +85,9 @@ The supported patterns for this mode are Publish/Subscribe (`zmq.PUB/zmq.SUB`) a ### Bare-Minimum Usage -In this example, we will capturing live video-frames from a source _(a.k.a Servers)_ with a webcam connected to it. Afterwards, those captured frame will be transferred over the network to a two independent system _(a.k.a Client)_ at the same time, and will be displayed in Output Window at real-time. All this by using this Multi-Clients Mode in NetGear API. +In this example, we will capturing live video-frames from a source _(a.k.a Server)_ with a webcam connected to it. Afterwards, those captured frame will be sent over the network to two independent system _(a.k.a Clients)_ using this Multi-Clients Mode in NetGear API. Finally, both Clients will be displaying recieved frames in Output Windows in real time. + +!!! tip "This example is useful for building applications like Real-Time Video Broadcasting to multiple clients in local network." #### Server's End diff --git a/docs/gears/netgear/advanced/multi_server.md b/docs/gears/netgear/advanced/multi_server.md index 8d6dc61c1..d0fa4caaa 100644 --- a/docs/gears/netgear/advanced/multi_server.md +++ b/docs/gears/netgear/advanced/multi_server.md @@ -35,7 +35,7 @@ The supported patterns for this mode are Publish/Subscribe (`zmq.PUB/zmq.SUB`) a   -!!! danger "Multi-Servers Mode Requirements" +!!! danger "Important Information regarding Multi-Servers Mode" * A unique PORT address **MUST** be assigned to each Server on the network using its [`port`](../../params/#port) parameter. @@ -43,6 +43,8 @@ The supported patterns for this mode are Publish/Subscribe (`zmq.PUB/zmq.SUB`) a * Patterns `1` _(i.e. Request/Reply `zmq.REQ/zmq.REP`)_ and `2` _(i.e. Publish/Subscribe `zmq.PUB/zmq.SUB`)_ are the only supported values for this Mode. Therefore, calling any other pattern value with is mode will result in `ValueError`. + * Multi-Servers and Multi-Clients exclusive modes **CANNOT** be enabled simultaneously, Otherwise NetGear API will throw `ValueError`. 
+ * The [`address`](../../params/#address) parameter value of each Server **MUST** exactly match the Client.   @@ -68,23 +70,24 @@ The supported patterns for this mode are Publish/Subscribe (`zmq.PUB/zmq.SUB`) a ## Usage Examples -!!! alert "Important Information" +!!! alert "Example Assumptions" * For sake of simplicity, in these examples we will use only two unique Servers, but, the number of these Servers can be extended to several numbers depending upon your system hardware limits. - * All of Servers will be transferring frames to a single Client system at the same time, which will be displaying received frames as a montage _(multiple frames concatenated together)_. + * All of Servers will be transferring frames to a single Client system at the same time, which will be displaying received frames as a live montage _(multiple frames concatenated together)_. * For building Frames Montage at Client's end, We are going to use `imutils` python library function to build montages, by concatenating together frames recieved from different servers. Therefore, Kindly install this library with `pip install imutils` terminal command. - * Multi-Servers and Multi-Clients exclusive modes **CANNOT** be enabled simultaneously, Otherwise NetGear API will throw `ValueError`. -   ### Bare-Minimum Usage -In this example, we will capturing live video-frames on two independent sources _(a.k.a Servers)_, each with a webcam connected to it. Then, those frames will be transferred over the network to a single system _(a.k.a Client)_ at the same time, and will be displayed as a real-time montage. All this by using this Multi-Servers Mode in NetGear API. +In this example, we will capturing live video-frames on two independent sources _(a.k.a Servers)_, each with a webcam connected to it. Afterwards, these frames will be sent over the network to a single system _(a.k.a Client)_ using this Multi-Servers Mode in NetGear API in real time, and will be displayed as a live montage. + + +!!! tip "This example is useful for building applications like Real-Time Security System with multiple cameras." #### Client's End diff --git a/docs/gears/netgear/advanced/secure_mode.md b/docs/gears/netgear/advanced/secure_mode.md index 1fee3c798..200ebd92a 100644 --- a/docs/gears/netgear/advanced/secure_mode.md +++ b/docs/gears/netgear/advanced/secure_mode.md @@ -48,7 +48,7 @@ Secure mode supports the two most powerful ZMQ security layers:   -!!! danger "Secure Mode Requirements" +!!! danger "Important Information regarding Secure Mode" * The `secure_mode` attribute value at the Client's end **MUST** match exactly the Server's end _(i.e. **IronHouse** security layer is only compatible with **IronHouse**, and **NOT** with **StoneHouse**)_. @@ -83,9 +83,9 @@ Secure mode supports the two most powerful ZMQ security layers:   -## Supported Attributes +## Exclusive Attributes -For implementing Secure Mode, NetGear API currently provide following attribute for its [`options`](../../params/#options) dictionary parameter: +For implementing Secure Mode, NetGear API currently provide following exclusive attribute for its [`options`](../../params/#options) dictionary parameter: * `secure_mode` (_integer_) : This attribute activates and sets the ZMQ security Mechanism. Its possible values are: `1`(_StoneHouse_) & `2`(_IronHouse_), and its default value is `0`(_Grassland(no security)_). 
Its usage is as follows: diff --git a/docs/gears/netgear/advanced/ssh_tunnel.md b/docs/gears/netgear/advanced/ssh_tunnel.md index 150d79fa8..8b0d41a5c 100644 --- a/docs/gears/netgear/advanced/ssh_tunnel.md +++ b/docs/gears/netgear/advanced/ssh_tunnel.md @@ -80,12 +80,12 @@ SSH Tunnel Mode requires [`pexpect`](http://www.noah.org/wiki/pexpect) or [`para   -## Supported Attributes +## Exclusive Attributes !!! warning "All these attributes will work on Server end only whereas Client end will simply discard them." -For implementing SSH Tunneling Mode, NetGear API currently provide following attribute for its [`options`](../../params/#options) dictionary parameter: +For implementing SSH Tunneling Mode, NetGear API currently provide following exclusive attribute for its [`options`](../../params/#options) dictionary parameter: * **`ssh_tunnel_mode`** (_string_) : This attribute activates SSH Tunneling Mode and sets the fully specified `"@:"` SSH URL for tunneling at Server end. Its usage is as follows: @@ -138,7 +138,7 @@ For implementing SSH Tunneling Mode, NetGear API currently provide following att ## Usage Example -??? alert "Assumptions for this Example" +???+ alert "Assumptions for this Example" In this particular example, we assume that: diff --git a/docs/gears/screengear/usage.md b/docs/gears/screengear/usage.md index c8ccfa8ec..dea324021 100644 --- a/docs/gears/screengear/usage.md +++ b/docs/gears/screengear/usage.md @@ -122,7 +122,7 @@ from vidgear.gears import ScreenGear import cv2 # open video stream with defined parameters with monitor at index `1` selected -stream = ScreenGear(monitor=1, logging=True, **options).start() +stream = ScreenGear(monitor=1, logging=True).start() # loop over while True: @@ -167,7 +167,7 @@ from vidgear.gears import ScreenGear import cv2 # open video stream with defined parameters and `mss` backend for extracting frames. -stream = ScreenGear(backend="mss", logging=True, **options).start() +stream = ScreenGear(backend="mss", logging=True).start() # loop over while True: diff --git a/docs/gears/webgear/advanced.md b/docs/gears/webgear/advanced.md index ff7aea9dd..507f42fc5 100644 --- a/docs/gears/webgear/advanced.md +++ b/docs/gears/webgear/advanced.md @@ -42,7 +42,7 @@ Let's implement a bare-minimum example using WebGear, where we will be sending [ ```python # import required libraries import uvicorn -from vidgear.gears.asyncio import WebGear_RTC +from vidgear.gears.asyncio import WebGear # various performance tweaks and enable grayscale input options = { @@ -53,8 +53,8 @@ options = { "jpeg_compression_fastupsample": True, } -# initialize WebGear_RTC app and change its colorspace to grayscale -web = WebGear_RTC( +# initialize WebGear app and change its colorspace to grayscale +web = WebGear( source="foo.mp4", colorspace="COLOR_BGR2GRAY", logging=True, **options ) @@ -248,7 +248,7 @@ WebGear natively supports ASGI middleware classes with Starlette for implementin !!! new "New in v0.2.2" This example was added in `v0.2.2`. -!!! info "All supported middlewares can be [here ➶](https://www.starlette.io/middleware/)" +!!! 
info "All supported middlewares can be found [here ➶](https://www.starlette.io/middleware/)" For this example, let's use [`CORSMiddleware`](https://www.starlette.io/middleware/#corsmiddleware) for implementing appropriate [CORS headers](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS) to outgoing responses in our application in order to allow cross-origin requests from browsers, as follows: diff --git a/docs/gears/webgear_rtc/advanced.md b/docs/gears/webgear_rtc/advanced.md index 5ca6f1624..da0887954 100644 --- a/docs/gears/webgear_rtc/advanced.md +++ b/docs/gears/webgear_rtc/advanced.md @@ -262,7 +262,7 @@ WebGear_RTC also natively supports ASGI middleware classes with Starlette for im !!! new "New in v0.2.2" This example was added in `v0.2.2`. -!!! info "All supported middlewares can be [here ➶](https://www.starlette.io/middleware/)" +!!! info "All supported middlewares can be found [here ➶](https://www.starlette.io/middleware/)" For this example, let's use [`CORSMiddleware`](https://www.starlette.io/middleware/#corsmiddleware) for implementing appropriate [CORS headers](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS) to outgoing responses in our application in order to allow cross-origin requests from browsers, as follows: diff --git a/setup.py b/setup.py index 9185b83db..af187226f 100644 --- a/setup.py +++ b/setup.py @@ -91,11 +91,9 @@ def latest_version(package_name): install_requires=[ "pafy{}".format(latest_version("pafy")), "mss{}".format(latest_version("mss")), - "numpy{}".format( - "<=1.19.5" if sys.version_info[:2] < (3, 7) else "" - ), # dropped support for 3.6.x legacies + "numpy", "youtube-dl{}".format(latest_version("youtube-dl")), - "streamlink{}".format(latest_version("streamlink")), + "streamlink", "requests", "pyzmq{}".format(latest_version("pyzmq")), "simplejpeg{}".format(latest_version("simplejpeg")), @@ -119,12 +117,8 @@ def latest_version(package_name): "aiohttp", "uvicorn{}".format(latest_version("uvicorn")), "msgpack_numpy", + "aiortc{}".format(latest_version("aiortc")), ] - + ( - ["aiortc{}".format(latest_version("aiortc"))] - if (platform.system() != "Windows") - else [] - ) + ( ( ["uvloop{}".format(latest_version("uvloop"))] diff --git a/vidgear/gears/__init__.py b/vidgear/gears/__init__.py index 766af43ce..5fb7f212a 100644 --- a/vidgear/gears/__init__.py +++ b/vidgear/gears/__init__.py @@ -1,8 +1,168 @@ # import the necessary packages -from .pigear import PiGear +import sys +import types +import logging +import importlib +from distutils.version import LooseVersion + +# define custom logger +FORMAT = "%(name)s :: %(levelname)s :: %(message)s" +logging.basicConfig(format=FORMAT) +logger = logging.getLogger("VidGear CORE") +logger.propagate = False + + +def get_module_version(module=None): + """ + ## get_module_version + + Retrieves version of specified module + + Parameters: + name (ModuleType): module of datatype `ModuleType`. + + **Returns:** version of specified module as string + """ + # check if module type is valid + assert not (module is None) and isinstance( + module, types.ModuleType + ), "[VidGear CORE:ERROR] :: Invalid module!" 
+ + # get version from attribute + version = getattr(module, "__version__", None) + # retry if failed + if version is None: + # some modules uses a capitalized attribute name + version = getattr(module, "__VERSION__", None) + # raise if still failed + if version is None: + raise ImportError( + "[VidGear CORE:ERROR] :: Can't determine version for module: `{}`!".format( + module.__name__ + ) + ) + return str(version) + + +def import_core_dependency( + name, pkg_name=None, custom_message=None, version=None, mode="gte" +): + """ + ## import_core_dependency + + Imports specified core dependency. By default(`error = raise`), if a dependency is missing, + an ImportError with a meaningful message will be raised. Also, If a dependency is present, + but version is different than specified, an error is raised. + + Parameters: + name (string): name of dependency to be imported. + pkg_name (string): (Optional) package name of dependency(if different `pip` name). Otherwise `name` will be used. + custom_message (string): (Optional) custom Import error message to be raised or logged. + version(string): (Optional) required minimum/maximum version of the dependency to be imported. + mode(boolean): (Optional) Possible values "gte"(greater then equal), "lte"(less then equal), "exact"(exact). Default is "gte". + + **Returns:** `None` + """ + # check specified parameters + assert name and isinstance( + name, str + ), "[VidGear CORE:ERROR] :: Kindly provide name of the dependency." + + # extract name in case of relative import + sub_class = "" + name = name.strip() + if name.startswith("from"): + name = name.split(" ") + name, sub_class = (name[1].strip(), name[-1].strip()) + + # check mode of operation + assert mode in ["gte", "lte", "exact"], "[VidGear CORE:ERROR] :: Invalid mode!" + + # specify package name of dependency(if defined). Otherwise use name + install_name = pkg_name if not (pkg_name is None) else name + + # create message + msg = ( + custom_message + if not (custom_message is None) + else "Failed to find its core dependency '{}'. Install it with `pip install {}` command.".format( + name, install_name + ) + ) + # try importing dependency + try: + module = importlib.import_module(name) + if sub_class: + module = getattr(module, sub_class) + except ImportError: + # raise + raise ImportError(msg) from None + + # check if minimum required version + if not (version) is None: + # Handle submodules + parent_module = name.split(".")[0] + if parent_module != name: + # grab parent module + module_to_get = sys.modules[parent_module] + else: + module_to_get = module + + # extract version + module_version = get_module_version(module_to_get) + # verify + if mode == "exact": + if LooseVersion(module_version) != LooseVersion(version): + # create message + msg = "Unsupported version '{}' found. Vidgear requires '{}' dependency with exact version '{}' installed!".format( + module_version, parent_module, version + ) + # raise + raise ImportError(msg) + elif mode == "lte": + if LooseVersion(module_version) > LooseVersion(version): + # create message + msg = "Unsupported version '{}' found. Vidgear requires '{}' dependency installed with older version '{}' or smaller!".format( + module_version, parent_module, version + ) + # raise + raise ImportError(msg) + else: + if LooseVersion(module_version) < LooseVersion(version): + # create message + msg = "Unsupported version '{}' found. 
Vidgear requires '{}' dependency installed with newer version '{}' or greater!".format( + module_version, parent_module, version + ) + # raise + raise ImportError(msg) + return module + + +# import core dependencies +import_core_dependency( + "cv2", + pkg_name="opencv-python", + version="3", + custom_message="Failed to find core dependency '{}'. Install it with `pip install opencv-python` command.", +) +import_core_dependency( + "numpy", + version="1.19.5" + if sys.version_info[:2] < (3, 7) + else None, # dropped support for 3.6.x legacies + mode="lte", +) +import_core_dependency( + "colorlog", +) +import_core_dependency("requests") +import_core_dependency("from tqdm import tqdm", pkg_name="tqdm") + +# import all APIs from .camgear import CamGear -from .netgear import NetGear +from .pigear import PiGear from .videogear import VideoGear +from .netgear import NetGear from .writegear import WriteGear from .screengear import ScreenGear from .streamgear import StreamGear diff --git a/vidgear/gears/asyncio/__init__.py b/vidgear/gears/asyncio/__init__.py index 2b6a0e845..0febd8916 100644 --- a/vidgear/gears/asyncio/__init__.py +++ b/vidgear/gears/asyncio/__init__.py @@ -1,3 +1,4 @@ +# import all APIs from .webgear import WebGear from .webgear_rtc import WebGear_RTC from .netgear_async import NetGear_Async diff --git a/vidgear/gears/asyncio/__main__.py b/vidgear/gears/asyncio/__main__.py index 67b07704e..cd8bcef52 100644 --- a/vidgear/gears/asyncio/__main__.py +++ b/vidgear/gears/asyncio/__main__.py @@ -19,7 +19,7 @@ """ if __name__ == "__main__": - # import libs + # import neccessary libs import yaml import argparse diff --git a/vidgear/gears/asyncio/helper.py b/vidgear/gears/asyncio/helper.py index a88611b02..ab6abcad6 100755 --- a/vidgear/gears/asyncio/helper.py +++ b/vidgear/gears/asyncio/helper.py @@ -21,7 +21,6 @@ # Contains all the support functions/modules required by Vidgear Asyncio packages # import the necessary packages - import os import cv2 import sys @@ -37,51 +36,8 @@ from requests.adapters import HTTPAdapter from requests.packages.urllib3.util.retry import Retry - -def logger_handler(): - """ - ## logger_handler - - Returns the logger handler - - **Returns:** A logger handler - """ - # logging formatter - formatter = ColoredFormatter( - "%(bold_cyan)s%(asctime)s :: %(bold_blue)s%(name)s%(reset)s :: %(log_color)s%(levelname)s%(reset)s :: %(message)s", - datefmt="%H:%M:%S", - reset=False, - log_colors={ - "INFO": "bold_green", - "DEBUG": "bold_yellow", - "WARNING": "bold_purple", - "ERROR": "bold_red", - "CRITICAL": "bold_red,bg_white", - }, - ) - # check if VIDGEAR_LOGFILE defined - file_mode = os.environ.get("VIDGEAR_LOGFILE", False) - # define handler - handler = log.StreamHandler() - if file_mode and isinstance(file_mode, str): - file_path = os.path.abspath(file_mode) - if (os.name == "nt" or os.access in os.supports_effective_ids) and os.access( - os.path.dirname(file_path), os.W_OK - ): - file_path = ( - os.path.join(file_path, "vidgear.log") - if os.path.isdir(file_path) - else file_path - ) - handler = log.FileHandler(file_path, mode="a") - formatter = log.Formatter( - "%(asctime)s :: %(name)s :: %(levelname)s :: %(message)s", - datefmt="%H:%M:%S", - ) - - handler.setFormatter(formatter) - return handler - +# import helper packages +from ..helper import logger_handler, mkdir_safe # define logger logger = log.getLogger("Helper Asyncio") @@ -113,67 +69,6 @@ def send(self, request, **kwargs): return super().send(request, **kwargs) -def mkdir_safe(dir, logging=False): - """ - ## 
mkdir_safe - - Safely creates directory at given path. - - Parameters: - logging (bool): enables logging for its operations - - """ - try: - os.makedirs(dir) - if logging: - logger.debug("Created directory at `{}`".format(dir)) - except OSError as e: - if e.errno != errno.EEXIST: - raise - if logging: - logger.debug("Directory already exists at `{}`".format(dir)) - - -def capPropId(property, logging=True): - """ - ## capPropId - - Retrieves the OpenCV property's Integer(Actual) value from string. - - Parameters: - property (string): inputs OpenCV property as string. - logging (bool): enables logging for its operations - - **Returns:** Resultant integer value. - """ - integer_value = 0 - try: - integer_value = getattr(cv2, property) - except Exception as e: - if logging: - logger.exception(str(e)) - logger.critical("`{}` is not a valid OpenCV property!".format(property)) - return None - return integer_value - - -def retrieve_best_interpolation(interpolations): - """ - ## retrieve_best_interpolation - Retrieves best interpolation for resizing - - Parameters: - interpolations (list): list of interpolations as string. - **Returns:** Resultant integer value of found interpolation. - """ - if isinstance(interpolations, list): - for intp in interpolations: - interpolation = capPropId(intp, logging=False) - if not (interpolation is None): - return interpolation - return None - - def create_blank_frame(frame=None, text="", logging=False): """ ## create_blank_frame diff --git a/vidgear/gears/asyncio/netgear_async.py b/vidgear/gears/asyncio/netgear_async.py index e6abce228..ddb10b993 100755 --- a/vidgear/gears/asyncio/netgear_async.py +++ b/vidgear/gears/asyncio/netgear_async.py @@ -18,25 +18,31 @@ =============================================== """ # import the necessary packages - import cv2 import sys -import zmq import numpy as np import asyncio import inspect import logging as log -import msgpack import string import secrets import platform -import zmq.asyncio -import msgpack_numpy as m from collections import deque -from .helper import logger_handler +# import helper packages +from ..helper import logger_handler, import_dependency_safe + +# import additional API(s) from ..videogear import VideoGear +# safe import critical Class modules +zmq = import_dependency_safe("zmq", error="silent", min_version="4.0") +if not (zmq is None): + import zmq.asyncio +msgpack = import_dependency_safe("msgpack", error="silent") +m = import_dependency_safe("msgpack_numpy", error="silent") +uvloop = import_dependency_safe("uvloop", error="silent") + # define logger logger = log.getLogger("NetGear_Async") logger.propagate = False @@ -120,6 +126,10 @@ def __init__( time_delay (int): time delay (in sec) before start reading the frames. options (dict): provides ability to alter Tweak Parameters of NetGear_Async, CamGear, PiGear & Stabilizer. """ + # raise error(s) for critical Class imports + import_dependency_safe("zmq" if zmq is None else "", min_version="4.0") + import_dependency_safe("msgpack" if msgpack is None else "") + import_dependency_safe("msgpack_numpy" if m is None else "") # enable logging if specified self.__logging = logging @@ -284,14 +294,12 @@ def __init__( if sys.version_info[:2] >= (3, 8): asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy()) else: - try: - # import library - import uvloop - - # Latest uvloop eventloop is only available for UNIX machines & python>=3.7. + if not (uvloop is None): + # Latest uvloop eventloop is only available for UNIX machines. 
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy()) - except ImportError: - pass + else: + # log if not present + import_dependency_safe("uvloop", error="log") # Retrieve event loop and assign it self.loop = asyncio.get_event_loop() diff --git a/vidgear/gears/asyncio/webgear.py b/vidgear/gears/asyncio/webgear.py index bf7f2b2a6..be6e3f634 100755 --- a/vidgear/gears/asyncio/webgear.py +++ b/vidgear/gears/asyncio/webgear.py @@ -18,32 +18,38 @@ =============================================== """ # import the necessary packages - import os import cv2 import sys import asyncio import inspect -import simplejpeg import numpy as np import logging as log from collections import deque -from starlette.routing import Mount, Route -from starlette.responses import StreamingResponse -from starlette.templating import Jinja2Templates -from starlette.staticfiles import StaticFiles -from starlette.applications import Starlette -from starlette.middleware import Middleware +from os.path import expanduser +# import helper packages from .helper import ( reducer, - logger_handler, generate_webdata, create_blank_frame, - retrieve_best_interpolation, ) +from ..helper import logger_handler, retrieve_best_interpolation, import_dependency_safe + +# import additional API(s) from ..videogear import VideoGear +# safe import critical Class modules +starlette = import_dependency_safe("starlette", error="silent") +if not (starlette is None): + from starlette.routing import Mount, Route + from starlette.responses import StreamingResponse + from starlette.templating import Jinja2Templates + from starlette.staticfiles import StaticFiles + from starlette.applications import Starlette + from starlette.middleware import Middleware +simplejpeg = import_dependency_safe("simplejpeg", error="silent", min_version="1.6.1") + # define logger logger = log.getLogger("WebGear") logger.propagate = False @@ -100,6 +106,11 @@ def __init__( time_delay (int): time delay (in sec) before start reading the frames. options (dict): provides ability to alter Tweak Parameters of WebGear, CamGear, PiGear & Stabilizer. 
""" + # raise error(s) for critical Class imports + import_dependency_safe("starlette" if starlette is None else "") + import_dependency_safe( + "simplejpeg" if simplejpeg is None else "", min_version="1.6.1" + ) # initialize global params # define frame-compression handler @@ -227,8 +238,6 @@ def __init__( ) else: # otherwise generate suitable path - from os.path import expanduser - data_path = generate_webdata( os.path.join(expanduser("~"), ".vidgear"), c_name="webgear", diff --git a/vidgear/gears/asyncio/webgear_rtc.py b/vidgear/gears/asyncio/webgear_rtc.py index bdb59a688..3ae39b931 100644 --- a/vidgear/gears/asyncio/webgear_rtc.py +++ b/vidgear/gears/asyncio/webgear_rtc.py @@ -18,7 +18,6 @@ =============================================== """ # import the necessary packages - import os import cv2 import sys @@ -27,33 +26,40 @@ import asyncio import logging as log from collections import deque -from starlette.routing import Mount, Route -from starlette.templating import Jinja2Templates -from starlette.staticfiles import StaticFiles -from starlette.applications import Starlette -from starlette.middleware import Middleware -from starlette.responses import JSONResponse, PlainTextResponse - -from aiortc.rtcrtpsender import RTCRtpSender -from aiortc import ( - RTCPeerConnection, - RTCSessionDescription, - VideoStreamTrack, -) -from aiortc.contrib.media import MediaRelay -from aiortc.mediastreams import MediaStreamError -from av import VideoFrame - +from os.path import expanduser +# import helper packages from .helper import ( reducer, - logger_handler, generate_webdata, create_blank_frame, - retrieve_best_interpolation, ) +from ..helper import logger_handler, retrieve_best_interpolation, import_dependency_safe + +# import additional API(s) from ..videogear import VideoGear +# safe import critical Class modules +starlette = import_dependency_safe("starlette", error="silent") +if not (starlette is None): + from starlette.routing import Mount, Route + from starlette.templating import Jinja2Templates + from starlette.staticfiles import StaticFiles + from starlette.applications import Starlette + from starlette.middleware import Middleware + from starlette.responses import JSONResponse, PlainTextResponse +aiortc = import_dependency_safe("aiortc", error="silent") +if not (aiortc is None): + from aiortc.rtcrtpsender import RTCRtpSender + from aiortc import ( + RTCPeerConnection, + RTCSessionDescription, + VideoStreamTrack, + ) + from aiortc.contrib.media import MediaRelay + from aiortc.mediastreams import MediaStreamError + from av import VideoFrame # aiortc dependency + # define logger logger = log.getLogger("WebGear_RTC") if logger.hasHandlers(): @@ -109,6 +115,9 @@ def __init__( super().__init__() # don't forget this! + # raise error(s) for critical Class import + import_dependency_safe("aiortc" if aiortc is None else "") + # initialize global params self.__logging = logging self.__enable_inf = False # continue frames even when video ends. @@ -329,6 +338,9 @@ def __init__( time_delay (int): time delay (in sec) before start reading the frames. options (dict): provides ability to alter Tweak Parameters of WebGear_RTC, CamGear, PiGear & Stabilizer. 
""" + # raise error(s) for critical Class imports + import_dependency_safe("starlette" if starlette is None else "") + import_dependency_safe("aiortc" if aiortc is None else "") # initialize global params self.__logging = logging @@ -394,8 +406,6 @@ def __init__( ) else: # otherwise generate suitable path - from os.path import expanduser - data_path = generate_webdata( os.path.join(expanduser("~"), ".vidgear"), c_name="webgear_rtc", diff --git a/vidgear/gears/camgear.py b/vidgear/gears/camgear.py index 251425fbe..2736a5729 100644 --- a/vidgear/gears/camgear.py +++ b/vidgear/gears/camgear.py @@ -17,14 +17,15 @@ limitations under the License. =============================================== """ -# import the necessary packages +# import the necessary packages import cv2 import time import queue import logging as log from threading import Thread, Event +# import helper packages from .helper import ( capPropId, logger_handler, @@ -34,6 +35,7 @@ get_supported_resolution, check_gstreamer_support, dimensions_to_resolutions, + import_dependency_safe, ) # define logger @@ -104,8 +106,7 @@ def __init__( video_url = youtube_url_validator(source) if video_url: # import backend library - import pafy - + pafy = import_dependency_safe("pafy") logger.info("Using Youtube-dl Backend") # create new pafy object source_object = pafy.new(video_url, ydl_opts=stream_params) @@ -183,8 +184,9 @@ def __init__( ) else: # import backend library - from streamlink import Streamlink - + Streamlink = import_dependency_safe( + "from streamlink import Streamlink" + ) restore_levelnames() logger.info("Using Streamlink Backend") # check session @@ -442,7 +444,7 @@ def stop(self): if self.__threaded_queue_mode: self.__threaded_queue_mode = False - # indicate that the thread + # indicate that the thread # should be terminated immediately self.__terminate.set() self.__stream_read.set() diff --git a/vidgear/gears/helper.py b/vidgear/gears/helper.py index 1e18dd2f2..f57bb905b 100755 --- a/vidgear/gears/helper.py +++ b/vidgear/gears/helper.py @@ -21,16 +21,18 @@ # Contains all the support functions/modules required by Vidgear packages # import the necessary packages - import os import re import sys +import cv2 +import types import errno import shutil +import importlib +import requests import numpy as np import logging as log import platform -import requests import socket from tqdm import tqdm from contextlib import closing @@ -40,20 +42,6 @@ from requests.adapters import HTTPAdapter from requests.packages.urllib3.util.retry import Retry -try: - # import OpenCV Binaries - import cv2 - - # check whether OpenCV Binaries are 3.x+ - if LooseVersion(cv2.__version__) < LooseVersion("3"): - raise ImportError( - "[Vidgear:ERROR] :: Installed OpenCV API version(< 3.0) is not supported!" - ) -except ImportError: - raise ImportError( - "[Vidgear:ERROR] :: Failed to detect correct OpenCV executables, install it with `pip install opencv-python` command." - ) - def logger_handler(): """ @@ -106,6 +94,134 @@ def logger_handler(): logger.addHandler(logger_handler()) logger.setLevel(log.DEBUG) + +def get_module_version(module=None): + """ + ## get_module_version + + Retrieves version of specified module + + Parameters: + name (ModuleType): module of datatype `ModuleType`. + + **Returns:** version of specified module as string + """ + # check if module type is valid + assert not (module is None) and isinstance( + module, types.ModuleType + ), "[Vidgear:ERROR] :: Invalid module!" 
+ + # get version from attribute + version = getattr(module, "__version__", None) + # retry if failed + if version is None: + # some modules uses a capitalized attribute name + version = getattr(module, "__VERSION__", None) + # raise if still failed + if version is None: + raise ImportError( + "[Vidgear:ERROR] :: Can't determine version for module: `{}`!".format( + module.__name__ + ) + ) + return str(version) + + +def import_dependency_safe( + name, + error="raise", + pkg_name=None, + min_version=None, + custom_message=None, +): + """ + ## import_dependency_safe + + Imports specified dependency safely. By default(`error = raise`), if a dependency is missing, + an ImportError with a meaningful message will be raised. Otherwise if `error = log` a warning + will be logged and on `error = silent` everything will be quit. But If a dependency is present, + but older than specified, an error is raised if specified. + + Parameters: + name (string): name of dependency to be imported. + error (string): raise or Log or silence ImportError. Possible values are `"raise"`, `"log"` and `silent`. Default is `"raise"`. + pkg_name (string): (Optional) package name of dependency(if different `pip` name). Otherwise `name` will be used. + min_version(string): (Optional) required minimum version of the dependency to be imported. + custom_message (string): (Optional) custom Import error message to be raised or logged. + + **Returns:** The imported module, when found and the version is correct(if specified). Otherwise `None`. + """ + # check specified parameters + sub_class = "" + if not name or not isinstance(name, str): + return None + else: + # extract name in case of relative import + name = name.strip() + if name.startswith("from"): + name = name.split(" ") + name, sub_class = (name[1].strip(), name[-1].strip()) + + assert error in [ + "raise", + "log", + "silent", + ], "[Vidgear:ERROR] :: Invalid value at `error` parameter." + + # specify package name of dependency(if defined). Otherwise use name + install_name = pkg_name if not (pkg_name is None) else name + + # create message + msg = ( + custom_message + if not (custom_message is None) + else "Failed to find required dependency '{}'. Install it with `pip install {}` command.".format( + name, install_name + ) + ) + # try importing dependency + try: + module = importlib.import_module(name) + if sub_class: + module = getattr(module, sub_class) + except Exception: + # handle errors. + if error == "raise": + raise ImportError(msg) from None + elif error == "log": + logger.error(msg) + return None + else: + return None + + # check if minimum required version + if not (min_version) is None: + # Handle submodules + parent_module = name.split(".")[0] + if parent_module != name: + # grab parent module + module_to_get = sys.modules[parent_module] + else: + module_to_get = module + # extract version + version = get_module_version(module_to_get) + # verify + if LooseVersion(version) < LooseVersion(min_version): + # create message + msg = """Unsupported version '{}' found. Vidgear requires '{}' dependency installed with version '{}' or greater. + Update it with `pip install -U {}` command.""".format( + parent_module, min_version, version, install_name + ) + # handle errors. 
+ if error == "silent": + return None + else: + # raise + raise ImportError(msg) + + return module + + # set default timer for download requests DEFAULT_TIMEOUT = 3 @@ -1017,7 +1133,7 @@ def check_output(*args, **kwargs): stdout=sp.PIPE, stderr=sp.DEVNULL if not (retrieve_stderr) else sp.PIPE, *args, - **kwargs + **kwargs, ) output, stderr = process.communicate() retcode = process.poll() diff --git a/vidgear/gears/netgear.py b/vidgear/gears/netgear.py index aacad21b0..cba317bac 100644 --- a/vidgear/gears/netgear.py +++ b/vidgear/gears/netgear.py @@ -18,31 +18,36 @@ =============================================== """ # import the necessary packages - import os import cv2 -import zmq import time import string import secrets -import simplejpeg - import numpy as np import logging as log - -from zmq import ssh -from zmq import auth -from zmq.auth.thread import ThreadAuthenticator -from zmq.error import ZMQError from threading import Thread from collections import deque +from os.path import expanduser + +# import helper packages from .helper import ( logger_handler, generate_auth_certificates, check_WriteAccess, check_open_port, + import_dependency_safe, ) +# safe import critical Class modules +zmq = import_dependency_safe("zmq", error="silent", min_version="4.0") +if not (zmq is None): + from zmq import ssh + from zmq import auth + from zmq.auth.thread import ThreadAuthenticator + from zmq.error import ZMQError +simplejpeg = import_dependency_safe("simplejpeg", error="silent", min_version="1.6.1") +paramiko = import_dependency_safe("paramiko", error="silent") + # define logger logger = log.getLogger("NetGear") logger.propagate = False @@ -127,6 +132,12 @@ def __init__( logging (bool): enables/disables logging. options (dict): provides the flexibility to alter various NetGear internal properties. """ + # raise error(s) for critical Class imports + import_dependency_safe("zmq" if zmq is None else "", min_version="4.0") + import_dependency_safe( + "simplejpeg" if simplejpeg is None else "", error="log", min_version="1.6.1" + ) + # enable logging if specified self.__logging = True if logging else False @@ -176,7 +187,7 @@ def __init__( self.__ssh_tunnel_mode = None # handles ssh_tunneling mode state self.__ssh_tunnel_pwd = None self.__ssh_tunnel_keyfile = None - self.__paramiko_present = False + self.__paramiko_present = False if paramiko is None else True # define Multi-Server mode self.__multiserver_mode = False # handles multi-server mode state @@ -197,7 +208,9 @@ def __init__( custom_cert_location = "" # handles custom ZMQ certificates path # define frame-compression handler - self.__jpeg_compression = True # enabled by default for all connections + self.__jpeg_compression = ( + True if not (simplejpeg is None) else False + ) # enabled by default for all connections if simplejpeg is installed self.__jpeg_compression_quality = 90 # 90% quality self.__jpeg_compression_fastdct = True # fastest DCT on by default self.__jpeg_compression_fastupsample = False # fastupsample off by default @@ -286,12 +299,6 @@ def __init__( and isinstance(value, int) and (value in valid_security_mech) ): - # check if installed libzmq version is valid - assert zmq.zmq_version_info() >= ( - 4, - 0, - ), "[NetGear:ERROR] :: ZMQ Security feature is not supported in libzmq version < 4.0." 
- # assign valid mode self.__secure_mode = value elif key == "custom_cert_location" and isinstance(value, str): @@ -328,7 +335,11 @@ def __init__( ) # handle jpeg compression - elif key == "jpeg_compression" and isinstance(value, (bool, str)): + elif ( + key == "jpeg_compression" + and not (simplejpeg is None) + and isinstance(value, (bool, str)) + ): if isinstance(value, str) and value.strip().upper() in [ "RGB", "BGR", @@ -416,8 +427,6 @@ def __init__( ) else: # otherwise auto-generate suitable path - from os.path import expanduser - ( auth_cert_dir, self.__auth_secretkeys_dir, @@ -473,13 +482,6 @@ def __init__( ssh_address, ssh_port ) - # import packages - import importlib - - self.__paramiko_present = ( - True if bool(importlib.util.find_spec("paramiko")) else False - ) - # Handle multiple exclusive modes if enabled if self.__multiclient_mode and self.__multiserver_mode: raise ValueError( diff --git a/vidgear/gears/pigear.py b/vidgear/gears/pigear.py index c322fff8e..975284406 100644 --- a/vidgear/gears/pigear.py +++ b/vidgear/gears/pigear.py @@ -17,16 +17,21 @@ limitations under the License. =============================================== """ - -# import the packages - +# import the necessary packages import cv2 import sys import time import logging as log from threading import Thread -from .helper import capPropId, logger_handler +# import helper packages +from .helper import capPropId, logger_handler, import_dependency_safe + +# safe import critical Class modules +picamera = import_dependency_safe("picamera", error="silent") +if not (picamera is None): + from picamera import PiCamera + from picamera.array import PiRGBArray # define logger logger = log.getLogger("PiGear") @@ -69,22 +74,10 @@ def __init__( time_delay (int): time delay (in sec) before start reading the frames. options (dict): provides ability to alter Source Tweak Parameters. """ - - try: - import picamera - from picamera import PiCamera - from picamera.array import PiRGBArray - except Exception as error: - if isinstance(error, ImportError): - # Output expected ImportErrors. - raise ImportError( - '[PiGear:ERROR] :: Failed to detect Picamera executables, install it with "pip3 install picamera" command.' 
- ) - else: - # Handle any API errors - raise RuntimeError( - "[PiGear:ERROR] :: Picamera API failure: {}".format(error) - ) + # raise error(s) for critical Class imports + import_dependency_safe( + "picamera" if picamera is None else "", + ) # enable logging if specified self.__logging = False diff --git a/vidgear/gears/screengear.py b/vidgear/gears/screengear.py index 004186be3..094dd1d0b 100644 --- a/vidgear/gears/screengear.py +++ b/vidgear/gears/screengear.py @@ -18,20 +18,24 @@ =============================================== """ # import the necessary packages - import cv2 import time import queue import numpy as np import logging as log -from mss import mss -import pyscreenshot as pysct from threading import Thread, Event from collections import deque, OrderedDict -from mss.exception import ScreenShotError -from pyscreenshot.err import FailedBackendError -from .helper import capPropId, logger_handler +# import helper packages +from .helper import import_dependency_safe, capPropId, logger_handler + +# safe import critical Class modules +mss = import_dependency_safe("from mss import mss", error="silent") +if not (mss is None): + from mss.exception import ScreenShotError +pysct = import_dependency_safe("pyscreenshot", error="silent") +if not (pysct is None): + from pyscreenshot.err import FailedBackendError # define logger logger = log.getLogger("ScreenGear") @@ -61,6 +65,10 @@ def __init__( logging (bool): enables/disables logging. options (dict): provides the flexibility to manually set the dimensions of capture screen area. """ + # raise error(s) for critical Class imports + import_dependency_safe("mss.mss" if mss is None else "") + import_dependency_safe("pyscreenshot" if pysct is None else "") + # enable logging if specified: self.__logging = logging if isinstance(logging, bool) else False diff --git a/vidgear/gears/stabilizer.py b/vidgear/gears/stabilizer.py index 91746de1b..2bf34e6eb 100644 --- a/vidgear/gears/stabilizer.py +++ b/vidgear/gears/stabilizer.py @@ -21,12 +21,12 @@ =============================================== """ # import the necessary packages - import cv2 import numpy as np import logging as log from collections import deque +# import helper packages from .helper import logger_handler, check_CV_version, retrieve_best_interpolation # define logger @@ -140,7 +140,7 @@ def __init__( # retrieve best interpolation self.__interpolation = retrieve_best_interpolation( - ["INTER_LINEAR_EXACT", "INTER_LINEAR", "INTER_CUBIC"] + ["INTER_LINEAR_EXACT", "INTER_LINEAR", "INTER_AREA"] ) # define normalized box filter diff --git a/vidgear/gears/streamgear.py b/vidgear/gears/streamgear.py index dfd550901..e1ef5cb33 100644 --- a/vidgear/gears/streamgear.py +++ b/vidgear/gears/streamgear.py @@ -18,7 +18,6 @@ =============================================== """ # import the necessary packages - import os import cv2 import sys @@ -31,6 +30,7 @@ from fractions import Fraction from collections import OrderedDict +# import helper packages from .helper import ( capPropId, dict2Args, @@ -47,6 +47,7 @@ # define logger logger = log.getLogger("StreamGear") +logger.propagate = False logger.addHandler(logger_handler()) logger.setLevel(log.DEBUG) @@ -79,7 +80,6 @@ def __init__( logging (bool): enables/disables logging. stream_params (dict): provides the flexibility to control supported internal parameters and FFmpeg properities. 
""" - # checks if machine in-use is running windows os or not self.__os_windows = True if os.name == "nt" else False # enable logging if specified diff --git a/vidgear/gears/videogear.py b/vidgear/gears/videogear.py index 681a9abdd..bef898dc5 100644 --- a/vidgear/gears/videogear.py +++ b/vidgear/gears/videogear.py @@ -21,7 +21,10 @@ # import the necessary packages import logging as log +# import helper packages from .helper import logger_handler + +# import additional API(s) from .camgear import CamGear # define logger diff --git a/vidgear/gears/writegear.py b/vidgear/gears/writegear.py index 0e26af04c..a6d8e1ba9 100644 --- a/vidgear/gears/writegear.py +++ b/vidgear/gears/writegear.py @@ -18,7 +18,6 @@ =============================================== """ # import the necessary packages - import os import cv2 import sys @@ -26,6 +25,7 @@ import logging as log import subprocess as sp +# import helper packages from .helper import ( capPropId, dict2Args, @@ -46,13 +46,13 @@ class WriteGear: """ - WriteGear handles various powerful Video-Writer Tools that provide us the freedom to do almost anything imaginable with multimedia data. + WriteGear handles various powerful Video-Writer Tools that provide us the freedom to do almost anything imaginable with multimedia data. - WriteGear API provides a complete, flexible, and robust wrapper around FFmpeg, a leading multimedia framework. WriteGear can process real-time frames into a lossless - compressed video-file with any suitable specification (such asbitrate, codec, framerate, resolution, subtitles, etc.). It is powerful enough to perform complex tasks such as + WriteGear API provides a complete, flexible, and robust wrapper around FFmpeg, a leading multimedia framework. WriteGear can process real-time frames into a lossless + compressed video-file with any suitable specification (such asbitrate, codec, framerate, resolution, subtitles, etc.). It is powerful enough to perform complex tasks such as Live-Streaming (such as for Twitch) and Multiplexing Video-Audio with real-time frames in way fewer lines of code. - Best of all, WriteGear grants users the complete freedom to play with any FFmpeg parameter with its exclusive Custom Commands function without relying on any + Best of all, WriteGear grants users the complete freedom to play with any FFmpeg parameter with its exclusive Custom Commands function without relying on any third-party API. In addition to this, WriteGear also provides flexible access to OpenCV's VideoWriter API tools for video-frames encoding without compression. 
diff --git a/vidgear/tests/network_tests/asyncio_tests/test_helper.py b/vidgear/tests/network_tests/asyncio_tests/test_helper.py index 5741146c0..3ff1630e7 100644 --- a/vidgear/tests/network_tests/asyncio_tests/test_helper.py +++ b/vidgear/tests/network_tests/asyncio_tests/test_helper.py @@ -29,9 +29,9 @@ from vidgear.gears.asyncio.helper import ( reducer, create_blank_frame, - logger_handler, retrieve_best_interpolation, ) +from vidgear.gears.helper import logger_handler # define test logger logger = log.getLogger("Test_Asyncio_Helper") diff --git a/vidgear/tests/network_tests/asyncio_tests/test_netgear_async.py b/vidgear/tests/network_tests/asyncio_tests/test_netgear_async.py index 0a8b2c311..3d7a8c6a3 100644 --- a/vidgear/tests/network_tests/asyncio_tests/test_netgear_async.py +++ b/vidgear/tests/network_tests/asyncio_tests/test_netgear_async.py @@ -32,7 +32,7 @@ import tempfile from vidgear.gears.asyncio import NetGear_Async -from vidgear.gears.asyncio.helper import logger_handler +from vidgear.gears.helper import logger_handler # define test logger logger = log.getLogger("Test_NetGear_Async") diff --git a/vidgear/tests/streamer_tests/asyncio_tests/test_webgear.py b/vidgear/tests/streamer_tests/asyncio_tests/test_webgear.py index de46afbc4..e05d5d426 100644 --- a/vidgear/tests/streamer_tests/asyncio_tests/test_webgear.py +++ b/vidgear/tests/streamer_tests/asyncio_tests/test_webgear.py @@ -33,7 +33,7 @@ from starlette.testclient import TestClient from vidgear.gears.asyncio import WebGear -from vidgear.gears.asyncio.helper import logger_handler +from vidgear.gears.helper import logger_handler # define test logger logger = log.getLogger("Test_webgear") diff --git a/vidgear/tests/streamer_tests/asyncio_tests/test_webgear_rtc.py b/vidgear/tests/streamer_tests/asyncio_tests/test_webgear_rtc.py index a1a188810..cec35673f 100644 --- a/vidgear/tests/streamer_tests/asyncio_tests/test_webgear_rtc.py +++ b/vidgear/tests/streamer_tests/asyncio_tests/test_webgear_rtc.py @@ -44,7 +44,7 @@ ) from av import VideoFrame from vidgear.gears.asyncio import WebGear_RTC -from vidgear.gears.asyncio.helper import logger_handler +from vidgear.gears.helper import logger_handler # define test logger From 3d45f5d61c5ea7c27e172bd6510f9c70f1a75b3d Mon Sep 17 00:00:00 2001 From: abhiTronix Date: Tue, 31 Aug 2021 19:14:08 +0530 Subject: [PATCH 05/11] =?UTF-8?q?=F0=9F=93=9D=20Docs:=20Added=20docs=20and?= =?UTF-8?q?=20other=20related=20tweaks.?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - 📝 Added docs for installing vidgear with only selective dependencies - 💄 Added new `advance`/`experiment` admonition with new background color. - 🍱 Added new icons svg for `advance` and `warning` admonition. - ⚰️ Removed redundant data table tweaks from `custom.css`. - 🎨 Beautify `custom.css`. - ✏️ Fixed typos in URL links. - Setup.py: - ➖ Removed all redundant dependencies like `colorama`, `aiofiles`, `aiohttp`. - ➕ Added new `cython` and `msgpack` dependency. - 🎨 Added `msgpack` and `msgpack_numpy` to auto-install latest. - Helper: - ⚰️ Removed unused `aiohttp` dependency. - 🔊 Removed `asctime` from logging. 
--- docs/installation/pip_install.md | 72 +++++++++++++++--- docs/installation/source_install.md | 78 ++++++++++++++++++-- docs/overrides/assets/stylesheets/custom.css | 32 ++++---- setup.py | 7 +- vidgear/gears/asyncio/helper.py | 1 - vidgear/gears/asyncio/netgear_async.py | 2 +- vidgear/gears/helper.py | 2 +- vidgear/gears/netgear.py | 2 +- 8 files changed, 159 insertions(+), 37 deletions(-) diff --git a/docs/installation/pip_install.md b/docs/installation/pip_install.md index 1b9b66663..69106f7f5 100644 --- a/docs/installation/pip_install.md +++ b/docs/installation/pip_install.md @@ -29,9 +29,9 @@ limitations under the License. When installing VidGear with [pip](https://pip.pypa.io/en/stable/installing/), you need to check manually if following dependencies are installed: -??? alert "Latest `pip` Recommended" +!!! alert "Upgrade your `pip`" - It advised to install latest `pip` version before installing vidgear to avoid any undesired errors. Python comes with an [`ensurepip`](https://docs.python.org/3/library/ensurepip.html#module-ensurepip) module[^1], which can easily install `pip` in any Python environment. + It strongly advised to upgrade to latest `pip` before installing vidgear to avoid any undesired installation error(s). Python comes with an [`ensurepip`](https://docs.python.org/3/library/ensurepip.html#module-ensurepip) module[^1], which can easily install `pip` in any Python environment. === "Linux" @@ -81,7 +81,7 @@ When installing VidGear with [pip](https://pip.pypa.io/en/stable/installing/), y * #### FFmpeg - Require for the video compression and encoding compatibilities within [**StreamGear**](#streamgear) API and [**WriteGear API's Compression Mode**](../../gears/writegear/compression/overview/). + Require only for the video compression and encoding compatibility within [**StreamGear API**](../../gears/streamgear/overview/) API and [**WriteGear API's Compression Mode**](../../gears/writegear/compression/overview/). !!! tip "FFmpeg Installation" @@ -104,7 +104,7 @@ When installing VidGear with [pip](https://pip.pypa.io/en/stable/installing/), y ??? error "Microsoft Visual C++ 14.0 is required." - Installing `aiortc` on windows requires Microsoft Build Tools for Visual C++ libraries installed. You can easily fix this error by installing any **ONE** of these choices: + Installing `aiortc` on windows may sometimes require Microsoft Build Tools for Visual C++ libraries installed. You can easily fix this error by installing any **ONE** of these choices: !!! info "While the error is calling for VC++ 14.0 - but newer versions of Visual C++ libraries works as well." @@ -153,7 +153,7 @@ When installing VidGear with [pip](https://pip.pypa.io/en/stable/installing/), y python -m pip install vidgear[asyncio] ``` - If you don't have the privileges to the directory you're installing package. Then use `--user` flag, that makes pip install packages in your home directory instead: + And, If you don't have the privileges to the directory you're installing package. Then use `--user` flag, that makes pip install packages in your home directory instead: ``` sh python -m pip install --user vidgear @@ -162,12 +162,66 @@ When installing VidGear with [pip](https://pip.pypa.io/en/stable/installing/), y python -m pip install --user vidgear[asyncio] ``` + Or, If you're using `py` as alias for installed python, then: + + ``` sh + py -m pip install --user vidgear + + # or with asyncio support + py -m pip install --user vidgear[asyncio] + ``` + +??? 
experiment "Installing vidgear with only selective dependencies" + + Starting with version `v0.2.2`, you can now run any VidGear API by installing only just specific dependencies required by the API in use(except for some Core dependencies). + + This is useful when you want to manually review, select and install minimal API-specific dependencies on bare-minimum vidgear from scratch on your system: + + - To install bare-minimum vidgear without any dependencies, use [`--no-deps`](https://pip.pypa.io/en/stable/cli/pip_install/#cmdoption-no-deps) pip flag as follows: + + ```sh + # Install stable release without any dependencies + pip install --no-deps --upgrade vidgear + ``` + + - Then, you must install all **Core dependencies**: + + ```sh + # Install core dependencies + pip install cython, numpy, requests, tqdm, colorlog + + # Install opencv(only if not installed previously) + pip install opencv-python + ``` + + - Finally, manually install your **API-specific dependencies** as required by your API(in use): + + + | APIs | Dependencies | + |:---:|:---| + | CamGear | `pafy`, `youtube-dl`, `streamlink` | + | PiGear | `picamera` | + | VideoGear | - | + | ScreenGear | `mss`, `pyscreenshot`, `Pillow` | + | WriteGear | **FFmpeg:** See [this doc ➶](https://abhitronix.github.io/vidgear/v0.2.2-dev/gears/writegear/compression/advanced/ffmpeg_install/#ffmpeg-installation-instructions) | + | StreamGear | **FFmpeg:** See [this doc ➶](https://abhitronix.github.io/vidgear/v0.2.2-dev/gears/streamgear/ffmpeg_install/#ffmpeg-installation-instructions) | + | NetGear | `pyzmq`, `simplejpeg` | + | WebGear | `starlette`, `jinja2`, `uvicorn`, `simplejpeg` | + | WebGear_RTC | `aiortc`, `starlette`, `jinja2`, `uvicorn` | + | NetGear_Async | `pyzmq`, `msgpack`, `msgpack_numpy`, `uvloop` | + + ```sh + # Just copy-&-paste from above table + pip install + ``` + + ```sh -# Install stable release -pip install vidgear +# Install latest stable release +pip install -U vidgear -# Or Install stable release with Asyncio support -pip install vidgear[asyncio] +# Or Install latest stable release with Asyncio support +pip install -U vidgear[asyncio] ``` **And if you prefer to install VidGear directly from the repository:** diff --git a/docs/installation/source_install.md b/docs/installation/source_install.md index d8b299be4..243f497a6 100644 --- a/docs/installation/source_install.md +++ b/docs/installation/source_install.md @@ -33,9 +33,9 @@ When installing VidGear from source, FFmpeg and Aiortc are the only two API spec Any other python dependencies _(Core/API specific)_ will be automatically installed based on your OS specifications. -??? alert "Latest `pip` Recommended" +!!! alert "Upgrade your `pip`" - It advised to install latest `pip` version before installing vidgear to avoid any undesired errors. Python comes with an [`ensurepip`](https://docs.python.org/3/library/ensurepip.html#module-ensurepip) module[^1], which can easily install `pip` in any Python environment. + It strongly advised to upgrade to latest `pip` before installing vidgear to avoid any undesired installation error(s). Python comes with an [`ensurepip`](https://docs.python.org/3/library/ensurepip.html#module-ensurepip) module[^1], which can easily install `pip` in any Python environment. 
=== "Linux" @@ -63,7 +63,7 @@ When installing VidGear from source, FFmpeg and Aiortc are the only two API spec * #### FFmpeg - Require for the video compression and encoding compatibilities within [**StreamGear**](#streamgear) API and [**WriteGear API's Compression Mode**](../../gears/writegear/compression/overview/). + Require only for the video compression and encoding compatibility within [**StreamGear API**](../../gears/streamgear/overview/) API and [**WriteGear API's Compression Mode**](../../gears/writegear/compression/overview/). !!! tip "FFmpeg Installation" @@ -76,7 +76,7 @@ When installing VidGear from source, FFmpeg and Aiortc are the only two API spec ??? error "Microsoft Visual C++ 14.0 is required." - Installing `aiortc` on windows requires Microsoft Build Tools for Visual C++ libraries installed. You can easily fix this error by installing any **ONE** of these choices: + Installing `aiortc` on windows may sometimes requires Microsoft Build Tools for Visual C++ libraries installed. You can easily fix this error by installing any **ONE** of these choices: !!! info "While the error is calling for VC++ 14.0 - but newer versions of Visual C++ libraries works as well." @@ -111,7 +111,7 @@ When installing VidGear from source, FFmpeg and Aiortc are the only two API spec * Use following commands to clone and install VidGear: - ```sh + ```sh # clone the repository and get inside git clone https://github.com/abhiTronix/vidgear.git && cd vidgear @@ -123,7 +123,73 @@ When installing VidGear from source, FFmpeg and Aiortc are the only two API spec # OR install with asyncio support python - m pip install .[asyncio] - ``` + ``` + + * If you're using `py` as alias for installed python, then: + + ``` sh + # clone the repository and get inside + git clone https://github.com/abhiTronix/vidgear.git && cd vidgear + + # checkout the latest testing branch + git checkout testing + + # install normally + python -m pip install . + + # OR install with asyncio support + python - m pip install .[asyncio] + ``` + +??? experiment "Installing vidgear with only selective dependencies" + + Starting with version `v0.2.2`, you can now run any VidGear API by installing only just specific dependencies required by the API in use(except for some Core dependencies). + + This is useful when you want to manually review, select and install minimal API-specific dependencies on bare-minimum vidgear from scratch on your system: + + - To install bare-minimum vidgear without any dependencies, use [`--no-deps`](https://pip.pypa.io/en/stable/cli/pip_install/#cmdoption-no-deps) pip flag as follows: + + ```sh + # clone the repository and get inside + git clone https://github.com/abhiTronix/vidgear.git && cd vidgear + + # checkout the latest testing branch + git checkout testing + + # Install without any dependencies + pip install --no-deps . 
+ ``` + + - Then, you must install all **Core dependencies**: + + ```sh + # Install core dependencies + pip install cython, numpy, requests, tqdm, colorlog + + # Install opencv(only if not installed previously) + pip install opencv-python + ``` + + - Finally, manually install your **API-specific dependencies** as required by your API(in use): + + + | APIs | Dependencies | + |:---:|:---| + | CamGear | `pafy`, `youtube-dl`, `streamlink` | + | PiGear | `picamera` | + | VideoGear | - | + | ScreenGear | `mss`, `pyscreenshot`, `Pillow` | + | WriteGear | **FFmpeg:** See [this doc ➶](https://abhitronix.github.io/vidgear/v0.2.2-dev/gears/writegear/compression/advanced/ffmpeg_install/#ffmpeg-installation-instructions) | + | StreamGear | **FFmpeg:** See [this doc ➶](https://abhitronix.github.io/vidgear/v0.2.2-dev/gears/streamgear/ffmpeg_install/#ffmpeg-installation-instructions) | + | NetGear | `pyzmq`, `simplejpeg` | + | WebGear | `starlette`, `jinja2`, `uvicorn`, `simplejpeg` | + | WebGear_RTC | `aiortc`, `starlette`, `jinja2`, `uvicorn` | + | NetGear_Async | `pyzmq`, `msgpack`, `msgpack_numpy`, `uvloop` | + + ```sh + # Just copy-&-paste from above table + pip install + ``` ```sh # clone the repository and get inside diff --git a/docs/overrides/assets/stylesheets/custom.css b/docs/overrides/assets/stylesheets/custom.css index af2beccd0..411c08edd 100755 --- a/docs/overrides/assets/stylesheets/custom.css +++ b/docs/overrides/assets/stylesheets/custom.css @@ -22,7 +22,7 @@ limitations under the License. --md-admonition-icon--new: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M13 2V3H12V9H11V10H9V11H8V12H7V13H5V12H4V11H3V9H2V15H3V16H4V17H5V18H6V22H8V21H7V20H8V19H9V18H10V19H11V22H13V21H12V17H13V16H14V15H15V12H16V13H17V11H15V9H20V8H17V7H22V3H21V2M14 3H15V4H14Z' /%3E%3C/svg%3E"); --md-admonition-icon--alert: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M6,6.9L3.87,4.78L5.28,3.37L7.4,5.5L6,6.9M13,1V4H11V1H13M20.13,4.78L18,6.9L16.6,5.5L18.72,3.37L20.13,4.78M4.5,10.5V12.5H1.5V10.5H4.5M19.5,10.5H22.5V12.5H19.5V10.5M6,20H18A2,2 0 0,1 20,22H4A2,2 0 0,1 6,20M12,5A6,6 0 0,1 18,11V19H6V11A6,6 0 0,1 12,5Z' /%3E%3C/svg%3E"); --md-admonition-icon--xquote: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M20 2H4C2.9 2 2 2.9 2 4V16C2 17.1 2.9 18 4 18H8V21C8 21.6 8.4 22 9 22H9.5C9.7 22 10 21.9 10.2 21.7L13.9 18H20C21.1 18 22 17.1 22 16V4C22 2.9 21.1 2 20 2M11 13H7V8.8L8.3 6H10.3L8.9 9H11V13M17 13H13V8.8L14.3 6H16.3L14.9 9H17V13Z' /%3E%3C/svg%3E"); - --md-admonition-icon--xwarning: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 
'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M13 13H11V7H13M11 15H13V17H11M15.73 3H8.27L3 8.27V15.73L8.27 21H15.73L21 15.73V8.27L15.73 3Z' /%3E%3C/svg%3E"); + --md-admonition-icon--xwarning: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath d='M23,12L20.56,9.22L20.9,5.54L17.29,4.72L15.4,1.54L12,3L8.6,1.54L6.71,4.72L3.1,5.53L3.44,9.21L1,12L3.44,14.78L3.1,18.47L6.71,19.29L8.6,22.47L12,21L15.4,22.46L17.29,19.28L20.9,18.46L20.56,14.78L23,12M13,17H11V15H13V17M13,13H11V7H13V13Z' /%3E%3C/svg%3E"); --md-admonition-icon--xdanger: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M12,2A9,9 0 0,0 3,11C3,14.03 4.53,16.82 7,18.47V22H9V19H11V22H13V19H15V22H17V18.46C19.47,16.81 21,14 21,11A9,9 0 0,0 12,2M8,11A2,2 0 0,1 10,13A2,2 0 0,1 8,15A2,2 0 0,1 6,13A2,2 0 0,1 8,11M16,11A2,2 0 0,1 18,13A2,2 0 0,1 16,15A2,2 0 0,1 14,13A2,2 0 0,1 16,11M12,14L13.5,17H10.5L12,14Z' /%3E%3C/svg%3E"); --md-admonition-icon--xtip: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M12,6A6,6 0 0,1 18,12C18,14.22 16.79,16.16 15,17.2V19A1,1 0 0,1 14,20H10A1,1 0 0,1 9,19V17.2C7.21,16.16 6,14.22 6,12A6,6 0 0,1 12,6M14,21V22A1,1 0 0,1 13,23H11A1,1 0 0,1 10,22V21H14M20,11H23V13H20V11M1,11H4V13H1V11M13,1V4H11V1H13M4.92,3.5L7.05,5.64L5.63,7.05L3.5,4.93L4.92,3.5M16.95,5.63L19.07,3.5L20.5,4.93L18.37,7.05L16.95,5.63Z' /%3E%3C/svg%3E"); --md-admonition-icon--xfail: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M8.27,3L3,8.27V15.73L8.27,21H15.73L21,15.73V8.27L15.73,3M8.41,7L12,10.59L15.59,7L17,8.41L13.41,12L17,15.59L15.59,17L12,13.41L8.41,17L7,15.59L10.59,12L7,8.41' /%3E%3C/svg%3E"); @@ -33,6 +33,11 @@ limitations under the License. 
--md-admonition-icon--xabstract: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M3,3H21V5H3V3M3,7H15V9H3V7M3,11H21V13H3V11M3,15H15V17H3V15M3,19H21V21H3V19Z' /%3E%3C/svg%3E"); --md-admonition-icon--xnote: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M20.71,7.04C20.37,7.38 20.04,7.71 20.03,8.04C20,8.36 20.34,8.69 20.66,9C21.14,9.5 21.61,9.95 21.59,10.44C21.57,10.93 21.06,11.44 20.55,11.94L16.42,16.08L15,14.66L19.25,10.42L18.29,9.46L16.87,10.87L13.12,7.12L16.96,3.29C17.35,2.9 18,2.9 18.37,3.29L20.71,5.63C21.1,6 21.1,6.65 20.71,7.04M3,17.25L12.56,7.68L16.31,11.43L6.75,21H3V17.25Z' /%3E%3C/svg%3E"); --md-admonition-icon--xinfo: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M18 2H12V9L9.5 7.5L7 9V2H6C4.9 2 4 2.9 4 4V20C4 21.1 4.9 22 6 22H18C19.1 22 20 21.1 20 20V4C20 2.89 19.1 2 18 2M17.68 18.41C17.57 18.5 16.47 19.25 16.05 19.5C15.63 19.79 14 20.72 14.26 18.92C14.89 15.28 16.11 13.12 14.65 14.06C14.27 14.29 14.05 14.43 13.91 14.5C13.78 14.61 13.79 14.6 13.68 14.41S13.53 14.23 13.67 14.13C13.67 14.13 15.9 12.34 16.72 12.28C17.5 12.21 17.31 13.17 17.24 13.61C16.78 15.46 15.94 18.15 16.07 18.54C16.18 18.93 17 18.31 17.44 18C17.44 18 17.5 17.93 17.61 18.05C17.72 18.22 17.83 18.3 17.68 18.41M16.97 11.06C16.4 11.06 15.94 10.6 15.94 10.03C15.94 9.46 16.4 9 16.97 9C17.54 9 18 9.46 18 10.03C18 10.6 17.54 11.06 16.97 11.06Z' /%3E%3C/svg%3E"); + --md-admonition-icon--xadvance: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath d='M7,2V4H8V18A4,4 0 0,0 12,22A4,4 0 0,0 16,18V4H17V2H7M11,16C10.4,16 10,15.6 10,15C10,14.4 10.4,14 11,14C11.6,14 12,14.4 12,15C12,15.6 11.6,16 11,16M13,12C12.4,12 12,11.6 12,11C12,10.4 12.4,10 13,10C13.6,10 14,10.4 14,11C14,11.6 13.6,12 13,12M14,7H10V4H14V7Z' /%3E%3C/svg%3E"); +} +.md-typeset .admonition.advance, +.md-typeset details.advance { + border-color: rgb(27,77,62); } .md-typeset .admonition.new, .md-typeset details.new { @@ -45,7 +50,7 @@ limitations under the License. } .md-typeset .new > .admonition-title::before, .md-typeset .new > summary::before { - background-color: rgb(43, 155, 70); + background-color: rgb(228,24,30); -webkit-mask-image: var(--md-admonition-icon--new); mask-image: var(--md-admonition-icon--new); } @@ -55,12 +60,12 @@ limitations under the License. 
} .md-typeset .alert > .admonition-title, .md-typeset .alert > summary { - background-color: rgba(255, 0, 255), 0.1); - border-color: rgb(255, 0, 255)); + background-color: rgba(255, 0, 255, 0.1); + border-color: rgb(255, 0, 255); } .md-typeset .alert > .admonition-title::before, .md-typeset .alert > summary::before { - background-color: rgb(255, 0, 255)); + background-color: rgb(255, 0, 255); -webkit-mask-image: var(--md-admonition-icon--alert); mask-image: var(--md-admonition-icon--alert); } @@ -154,16 +159,15 @@ limitations under the License. -webkit-mask-image: var(--md-admonition-icon--xquote); mask-image: var(--md-admonition-icon--xquote); } - - -td { - vertical-align: middle !important; - text-align: center !important; -} -th { - font-weight: bold !important; - text-align: center !important; +.md-typeset .advance>.admonition-title::before, +.md-typeset .advance>summary::before, +.md-typeset .experiment>.admonition-title::before, +.md-typeset .experiment>summary::before { + background-color: rgb(0,57,166); + -webkit-mask-image: var(--md-admonition-icon--xadvance); + mask-image: var(--md-admonition-icon--xadvance); } + .md-nav__item--active > .md-nav__link { font-weight: bold; } diff --git a/setup.py b/setup.py index af187226f..f0202c5fe 100644 --- a/setup.py +++ b/setup.py @@ -91,6 +91,7 @@ def latest_version(package_name): install_requires=[ "pafy{}".format(latest_version("pafy")), "mss{}".format(latest_version("mss")), + "cython", "numpy", "youtube-dl{}".format(latest_version("youtube-dl")), "streamlink", @@ -98,7 +99,6 @@ def latest_version(package_name): "pyzmq{}".format(latest_version("pyzmq")), "simplejpeg{}".format(latest_version("simplejpeg")), "colorlog", - "colorama", "tqdm", "Pillow", "pyscreenshot{}".format(latest_version("pyscreenshot")), @@ -112,11 +112,10 @@ def latest_version(package_name): extras_require={ "asyncio": [ "starlette{}".format(latest_version("starlette")), - "aiofiles", "jinja2", - "aiohttp", "uvicorn{}".format(latest_version("uvicorn")), - "msgpack_numpy", + "msgpack{}".format(latest_version("msgpack")), + "msgpack_numpy{}".format(latest_version("msgpack_numpy")), "aiortc{}".format(latest_version("aiortc")), ] + ( diff --git a/vidgear/gears/asyncio/helper.py b/vidgear/gears/asyncio/helper.py index ab6abcad6..ebe3ce722 100755 --- a/vidgear/gears/asyncio/helper.py +++ b/vidgear/gears/asyncio/helper.py @@ -26,7 +26,6 @@ import sys import errno import numpy as np -import aiohttp import asyncio import logging as log import platform diff --git a/vidgear/gears/asyncio/netgear_async.py b/vidgear/gears/asyncio/netgear_async.py index ddb10b993..d4564e03a 100755 --- a/vidgear/gears/asyncio/netgear_async.py +++ b/vidgear/gears/asyncio/netgear_async.py @@ -36,7 +36,7 @@ from ..videogear import VideoGear # safe import critical Class modules -zmq = import_dependency_safe("zmq", error="silent", min_version="4.0") +zmq = import_dependency_safe("zmq", pkg_name="pyzmq", error="silent", min_version="4.0") if not (zmq is None): import zmq.asyncio msgpack = import_dependency_safe("msgpack", error="silent") diff --git a/vidgear/gears/helper.py b/vidgear/gears/helper.py index f57bb905b..afd6c1c62 100755 --- a/vidgear/gears/helper.py +++ b/vidgear/gears/helper.py @@ -53,7 +53,7 @@ def logger_handler(): """ # logging formatter formatter = ColoredFormatter( - "%(bold_cyan)s%(asctime)s :: %(bold_blue)s%(name)s%(reset)s :: %(log_color)s%(levelname)s%(reset)s :: %(message)s", + "%(bold_blue)s%(name)s%(reset)s :: %(log_color)s%(levelname)s%(reset)s :: %(message)s", 
datefmt="%H:%M:%S", reset=True, log_colors={ diff --git a/vidgear/gears/netgear.py b/vidgear/gears/netgear.py index cba317bac..53be57f89 100644 --- a/vidgear/gears/netgear.py +++ b/vidgear/gears/netgear.py @@ -39,7 +39,7 @@ ) # safe import critical Class modules -zmq = import_dependency_safe("zmq", error="silent", min_version="4.0") +zmq = import_dependency_safe("zmq", pkg_name="pyzmq", error="silent", min_version="4.0") if not (zmq is None): from zmq import ssh from zmq import auth From 114969e999d4577dec8e8bff7f0500878de45637 Mon Sep 17 00:00:00 2001 From: abhiTronix Date: Wed, 1 Sep 2021 11:27:24 +0530 Subject: [PATCH 06/11] =?UTF-8?q?=F0=9F=9A=B8=20Docs:=20Added=20bonus=20ex?= =?UTF-8?q?amples=20to=20help=20section.?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - 🚸 Implemented a curated list of more advanced examples with unusual configuration for each API. - 💄 Updated admonitions colors and beautified `custom.css`. - ⚡️ Replaced VideoGear & CamGear with OpenCV in CPU intensive examples. - 📝 Updated `mkdocs.yml` with new changes and URLs. - 🚚 Moved FAQ examples to bonus. - 📝 Added several new contents and updated context. - ✏️ Fixed typos and updated links. --- docs/gears/camgear/usage.md | 6 + .../netgear/advanced/bidirectional_mode.md | 22 +- docs/gears/netgear/advanced/compression.md | 22 +- docs/gears/netgear/usage.md | 8 +- .../advanced/bidirectional_mode.md | 144 ------- docs/gears/netgear_async/usage.md | 24 +- docs/gears/pigear/usage.md | 8 +- docs/gears/screengear/usage.md | 8 +- docs/gears/stabilizer/usage.md | 16 +- docs/gears/streamgear/introduction.md | 8 +- docs/gears/streamgear/rtfm/usage.md | 2 +- docs/gears/streamgear/ssm/usage.md | 2 +- docs/gears/videogear/usage.md | 6 + docs/gears/webgear/advanced.md | 73 +--- docs/gears/webgear_rtc/advanced.md | 67 +--- docs/gears/writegear/compression/usage.md | 4 +- docs/gears/writegear/introduction.md | 8 +- docs/help/camgear_ex.md | 243 ++++++++++++ docs/help/camgear_faqs.md | 93 +---- docs/help/get_help.md | 22 +- docs/help/netgear_async_ex.md | 169 ++++++++ docs/help/netgear_ex.md | 368 ++++++++++++++++++ docs/help/pigear_ex.md | 75 ++++ docs/help/pigear_faqs.md | 49 +-- docs/help/screengear_ex.md | 149 +++++++ docs/help/stabilizer_ex.md | 236 +++++++++++ docs/help/streamgear_ex.md | 161 ++++++++ docs/help/videogear_ex.md | 220 +++++++++++ docs/help/webgear_ex.md | 233 +++++++++++ docs/help/webgear_faqs.md | 2 +- docs/help/webgear_rtc_ex.md | 213 ++++++++++ docs/help/writegear_ex.md | 306 +++++++++++++++ docs/help/writegear_faqs.md | 192 +-------- docs/overrides/assets/stylesheets/custom.css | 220 +++++++---- mkdocs.yml | 12 + 35 files changed, 2661 insertions(+), 730 deletions(-) create mode 100644 docs/help/camgear_ex.md create mode 100644 docs/help/netgear_async_ex.md create mode 100644 docs/help/netgear_ex.md create mode 100644 docs/help/pigear_ex.md create mode 100644 docs/help/screengear_ex.md create mode 100644 docs/help/stabilizer_ex.md create mode 100644 docs/help/streamgear_ex.md create mode 100644 docs/help/videogear_ex.md create mode 100644 docs/help/webgear_ex.md create mode 100644 docs/help/webgear_rtc_ex.md create mode 100644 docs/help/writegear_ex.md diff --git a/docs/gears/camgear/usage.md b/docs/gears/camgear/usage.md index 0583fe944..67f8e9b05 100644 --- a/docs/gears/camgear/usage.md +++ b/docs/gears/camgear/usage.md @@ -301,4 +301,10 @@ cv2.destroyAllWindows() stream.stop() ``` +  + +## Bonus Examples + +!!! 
example "Checkout more advanced CamGear examples with unusual configuration [here ➶](../../../help/camgear_ex/)" +   \ No newline at end of file diff --git a/docs/gears/netgear/advanced/bidirectional_mode.md b/docs/gears/netgear/advanced/bidirectional_mode.md index 3f8ac47ac..cfea17b39 100644 --- a/docs/gears/netgear/advanced/bidirectional_mode.md +++ b/docs/gears/netgear/advanced/bidirectional_mode.md @@ -378,14 +378,13 @@ Open your favorite terminal and execute the following python code: ```python # import required libraries -from vidgear.gears import VideoGear from vidgear.gears import NetGear from vidgear.gears.helper import reducer import numpy as np import cv2 # open any valid video stream(for e.g `test.mp4` file) -stream = VideoGear(source="test.mp4").start() +stream = cv2.VideoCapture("test.mp4") # activate Bidirectional mode options = {"bidirectional_mode": True} @@ -398,10 +397,10 @@ while True: try: # read frames from stream - frame = stream.read() + (grabbed, frame) = stream.read() - # check for frame if Nonetype - if frame is None: + # check for frame if not grabbed + if not grabbed: break # reducer frames size if you want more performance, otherwise comment this line @@ -428,7 +427,7 @@ while True: break # safely close video stream -stream.stop() +stream.release() # safely close server server.close() @@ -445,7 +444,6 @@ Then open another terminal on the same system and execute the following python c ```python # import required libraries from vidgear.gears import NetGear -from vidgear.gears import VideoGear from vidgear.gears.helper import reducer import cv2 @@ -453,7 +451,7 @@ import cv2 options = {"bidirectional_mode": True} # again open the same video stream -stream = VideoGear(source="test.mp4").start() +stream = cv2.VideoCapture("test.mp4") # define NetGear Client with `receive_mode = True` and defined parameter client = NetGear(receive_mode=True, pattern=1, logging=True, **options) @@ -462,10 +460,10 @@ client = NetGear(receive_mode=True, pattern=1, logging=True, **options) while True: # read frames from stream - frame = stream.read() + (grabbed, frame) = stream.read() - # check for frame if Nonetype - if frame is None: + # check for frame if not grabbed + if not grabbed: break # reducer frames size if you want more performance, otherwise comment this line @@ -503,7 +501,7 @@ while True: cv2.destroyAllWindows() # safely close video stream -stream.stop() +stream.release() # safely close client client.close() diff --git a/docs/gears/netgear/advanced/compression.md b/docs/gears/netgear/advanced/compression.md index 21e6c00c8..6187ddef9 100644 --- a/docs/gears/netgear/advanced/compression.md +++ b/docs/gears/netgear/advanced/compression.md @@ -475,14 +475,13 @@ Open your favorite terminal and execute the following python code: ```python # import required libraries -from vidgear.gears import VideoGear from vidgear.gears import NetGear from vidgear.gears.helper import reducer import numpy as np import cv2 # open any valid video stream(for e.g `test.mp4` file) -stream = VideoGear(source="test.mp4").start() +stream = cv2.VideoCapture("test.mp4") # activate Bidirectional mode and Frame Compression options = { @@ -501,10 +500,10 @@ while True: try: # read frames from stream - frame = stream.read() + (grabbed, frame) = stream.read() - # check for frame if Nonetype - if frame is None: + # check for frame if not grabbed + if not grabbed: break # reducer frames size if you want even more performance, otherwise comment this line @@ -531,7 +530,7 @@ while True: break # safely close 
video stream -stream.stop() +stream.release() # safely close server server.close() @@ -548,7 +547,6 @@ Then open another terminal on the same system and execute the following python c ```python # import required libraries from vidgear.gears import NetGear -from vidgear.gears import VideoGear from vidgear.gears.helper import reducer import cv2 @@ -562,7 +560,7 @@ options = { } # again open the same video stream -stream = VideoGear(source="test.mp4").start() +stream = cv2.VideoCapture("test.mp4") # define NetGear Client with `receive_mode = True` and defined parameter client = NetGear(receive_mode=True, pattern=1, logging=True, **options) @@ -571,10 +569,10 @@ client = NetGear(receive_mode=True, pattern=1, logging=True, **options) while True: # read frames from stream - frame = stream.read() + (grabbed, frame) = stream.read() - # check for frame if Nonetype - if frame is None: + # check for frame if not grabbed + if not grabbed: break # reducer frames size if you want even more performance, otherwise comment this line @@ -612,7 +610,7 @@ while True: cv2.destroyAllWindows() # safely close video stream -stream.stop() +stream.release() # safely close client client.close() diff --git a/docs/gears/netgear/usage.md b/docs/gears/netgear/usage.md index 95d310bd0..e2aaa2d14 100644 --- a/docs/gears/netgear/usage.md +++ b/docs/gears/netgear/usage.md @@ -471,4 +471,10 @@ stream.stop() server.close() ``` -  \ No newline at end of file +  + +## Bonus Examples + +!!! example "Checkout more advanced NetGear examples with unusual configuration [here ➶](../../../help/netgear_ex/)" + +  \ No newline at end of file diff --git a/docs/gears/netgear_async/advanced/bidirectional_mode.md b/docs/gears/netgear_async/advanced/bidirectional_mode.md index 923372156..0341f372b 100644 --- a/docs/gears/netgear_async/advanced/bidirectional_mode.md +++ b/docs/gears/netgear_async/advanced/bidirectional_mode.md @@ -219,150 +219,6 @@ if __name__ == "__main__":   -### Bare-Minimum Usage with VideoGear - -Following is another comparatively faster Bidirectional Mode bare-minimum example over Custom Source Server built using multi-threaded [VideoGear](../../../videogear/overview/) _(instead of OpenCV)_ and NetGear_Async API: - -#### Server End - -Open your favorite terminal and execute the following python code: - -!!! tip "You can terminate both sides anytime by pressing ++ctrl+"C"++ on your keyboard!" - -```python -# import library -from vidgear.gears.asyncio import NetGear_Async -from vidgear.gears import VideoGear -import cv2, asyncio - -# activate Bidirectional mode -options = {"bidirectional_mode": True} - -# initialize Server without any source -server = NetGear_Async(source=None, logging=True, **options) - -# Create a async frame generator as custom source -async def my_frame_generator(): - - # !!! define your own video source here !!! - # Open any valid video stream(for e.g `foo.mp4` file) - stream = VideoGear(source="foo.mp4").start() - - # loop over stream until its terminated - while True: - # read frames - frame = stream.read() - - # check for frame if Nonetype - if frame is None: - break - - # {do something with the frame to be sent here} - - # prepare data to be sent(a simple text in our case) - target_data = "Hello, I am a Server." 
- - # receive data from Client - recv_data = await server.transceive_data() - - # print data just received from Client - if not (recv_data is None): - print(recv_data) - - # send our frame & data - yield (target_data, frame) - - # sleep for sometime - await asyncio.sleep(0) - - # safely close video stream - stream.stop() - - -if __name__ == "__main__": - # set event loop - asyncio.set_event_loop(server.loop) - # Add your custom source generator to Server configuration - server.config["generator"] = my_frame_generator() - # Launch the Server - server.launch() - try: - # run your main function task until it is complete - server.loop.run_until_complete(server.task) - except (KeyboardInterrupt, SystemExit): - # wait for interrupts - pass - finally: - # finally close the server - server.close() -``` - -#### Client End - -Then open another terminal on the same system and execute the following python code and see the output: - -!!! tip "You can terminate client anytime by pressing ++ctrl+"C"++ on your keyboard!" - -```python -# import libraries -from vidgear.gears.asyncio import NetGear_Async -import cv2, asyncio - -# activate Bidirectional mode -options = {"bidirectional_mode": True} - -# define and launch Client with `receive_mode=True` -client = NetGear_Async(receive_mode=True, logging=True, **options).launch() - -# Create a async function where you want to show/manipulate your received frames -async def main(): - # loop over Client's Asynchronous Frame Generator - async for (data, frame) in client.recv_generator(): - - # do something with receive data from server - if not (data is None): - # let's print it - print(data) - - # {do something with received frames here} - - # Show output window(comment these lines if not required) - cv2.imshow("Output Frame", frame) - cv2.waitKey(1) & 0xFF - - # prepare data to be sent - target_data = "Hi, I am a Client here." - - # send our data to server - await client.transceive_data(data=target_data) - - # await before continuing - await asyncio.sleep(0) - - -if __name__ == "__main__": - # Set event loop to client's - asyncio.set_event_loop(client.loop) - try: - # run your main function task until it is complete - client.loop.run_until_complete(main()) - except (KeyboardInterrupt, SystemExit): - # wait for interrupts - pass - - # close all output window - cv2.destroyAllWindows() - - # safely close client - client.close() -``` - -  - -  - - - ### Using Bidirectional Mode with Variable Parameters diff --git a/docs/gears/netgear_async/usage.md b/docs/gears/netgear_async/usage.md index 0220d1252..63323b049 100644 --- a/docs/gears/netgear_async/usage.md +++ b/docs/gears/netgear_async/usage.md @@ -241,14 +241,14 @@ import cv2, asyncio # initialize Server without any source server = NetGear_Async(source=None, logging=True) +# !!! define your own video source here !!! +# Open any video stream such as live webcam +# video stream on first index(i.e. 0) device +stream = cv2.VideoCapture(0) + # Create a async frame generator as custom source async def my_frame_generator(): - # !!! define your own video source here !!! - # Open any video stream such as live webcam - # video stream on first index(i.e. 
0) device - stream = cv2.VideoCapture(0) - # loop over stream until its terminated while True: @@ -265,9 +265,6 @@ async def my_frame_generator(): yield frame # sleep for sometime await asyncio.sleep(0) - - # close stream - stream.release() if __name__ == "__main__": @@ -284,6 +281,8 @@ if __name__ == "__main__": # wait for interrupts pass finally: + # close stream + stream.release() # finally close the server server.close() ``` @@ -375,6 +374,7 @@ if __name__ == "__main__": ``` ### Client's End + Then open another terminal on the same system and execute the following python code and see the output: !!! warning "Client will throw TimeoutError if it fails to connect to the Server in given [`timeout`](../params/#timeout) value!" @@ -429,4 +429,10 @@ if __name__ == "__main__": writer.close() ``` -  \ No newline at end of file +  + +## Bonus Examples + +!!! example "Checkout more advanced NetGear_Async examples with unusual configuration [here ➶](../../../help/netgear_async_ex/)" + +  \ No newline at end of file diff --git a/docs/gears/pigear/usage.md b/docs/gears/pigear/usage.md index 7b9827685..78ec04348 100644 --- a/docs/gears/pigear/usage.md +++ b/docs/gears/pigear/usage.md @@ -270,4 +270,10 @@ stream.stop() writer.close() ``` -  \ No newline at end of file +  + +## Bonus Examples + +!!! example "Checkout more advanced PiGear examples with unusual configuration [here ➶](../../../help/pigear_ex/)" + +  \ No newline at end of file diff --git a/docs/gears/screengear/usage.md b/docs/gears/screengear/usage.md index dea324021..9dd7c6ce3 100644 --- a/docs/gears/screengear/usage.md +++ b/docs/gears/screengear/usage.md @@ -321,4 +321,10 @@ stream.stop() writer.close() ``` -  \ No newline at end of file +  + +## Bonus Examples + +!!! example "Checkout more advanced NetGear examples with unusual configuration [here ➶](../../../help/screengear_ex/)" + +  \ No newline at end of file diff --git a/docs/gears/stabilizer/usage.md b/docs/gears/stabilizer/usage.md index fd423ba41..acd7ca2ae 100644 --- a/docs/gears/stabilizer/usage.md +++ b/docs/gears/stabilizer/usage.md @@ -67,7 +67,7 @@ while True: if stabilized_frame is None: continue - # {do something with the stabilized_frame frame here} + # {do something with the stabilized frame here} # Show output window cv2.imshow("Output Stabilized Frame", stabilized_frame) @@ -121,7 +121,7 @@ while True: if stabilized_frame is None: continue - # {do something with the frame here} + # {do something with the stabilized frame here} # Show output window cv2.imshow("Stabilized Frame", stabilized_frame) @@ -176,7 +176,7 @@ while True: if stabilized_frame is None: continue - # {do something with the stabilized_frame frame here} + # {do something with the stabilized frame here} # Show output window cv2.imshow("Output Stabilized Frame", stabilized_frame) @@ -203,6 +203,8 @@ stream.stop() VideoGear's stabilizer can be used in conjunction with WriteGear API directly without any compatibility issues. The complete usage example is as follows: +!!! tip "You can also add live audio input to WriteGear pipeline. See this [bonus example](../../../help)" + ```python # import required libraries from vidgear.gears.stabilizer import Stabilizer @@ -236,7 +238,7 @@ while True: if stabilized_frame is None: continue - # {do something with the frame here} + # {do something with the stabilized frame here} # write stabilized frame to writer writer.write(stabilized_frame) @@ -271,4 +273,10 @@ writer.close() !!! 
example "The complete usage example can be found [here ➶](../../videogear/usage/#using-videogear-with-video-stabilizer-backend)" +  + +## Bonus Examples + +!!! example "Checkout more advanced Stabilizer examples with unusual configuration [here ➶](../../../help/stabilizer_ex/)" +   \ No newline at end of file diff --git a/docs/gears/streamgear/introduction.md b/docs/gears/streamgear/introduction.md index 027b1a0d3..b460a41ee 100644 --- a/docs/gears/streamgear/introduction.md +++ b/docs/gears/streamgear/introduction.md @@ -170,4 +170,10 @@ from vidgear.gears import StreamGear See here 🚀 -  \ No newline at end of file +  + +## Bonus Examples + +!!! example "Checkout more advanced StreamGear examples with unusual configuration [here ➶](../../../help/streamgear_ex/)" + +  \ No newline at end of file diff --git a/docs/gears/streamgear/rtfm/usage.md b/docs/gears/streamgear/rtfm/usage.md index 8b0d34a3f..6008a65a7 100644 --- a/docs/gears/streamgear/rtfm/usage.md +++ b/docs/gears/streamgear/rtfm/usage.md @@ -155,7 +155,7 @@ You can easily activate ==Low-latency Livestreaming in Real-time Frames Mode==, !!! tip "Use `-window_size` & `-extra_window_size` FFmpeg parameters for controlling number of frames to be kept in Chunks. Less these value, less will be latency." -!!! warning "All Chunks will be overwritten in this mode after every few Chunks _(equal to the sum of `-window_size` & `-extra_window_size` values)_, Hence Newer Chunks and Manifest contains NO information of any older video-frames." +!!! alert "After every few chunks _(equal to the sum of `-window_size` & `-extra_window_size` values)_, all chunks will be overwritten in Live-Streaming. Thereby, since newer chunks in manifest/playlist will contain NO information of any older ones, and therefore resultant DASH/HLS stream will play only the most recent frames." !!! note "In this mode, StreamGear **DOES NOT** automatically maps video-source audio to generated streams. You need to manually assign separate audio-source through [`-audio`](../../params/#a-exclusive-parameters) attribute of `stream_params` dictionary parameter." diff --git a/docs/gears/streamgear/ssm/usage.md b/docs/gears/streamgear/ssm/usage.md index 4756f41c4..1db663992 100644 --- a/docs/gears/streamgear/ssm/usage.md +++ b/docs/gears/streamgear/ssm/usage.md @@ -82,7 +82,7 @@ You can easily activate ==Low-latency Livestreaming in Single-Source Mode==, whe !!! tip "Use `-window_size` & `-extra_window_size` FFmpeg parameters for controlling number of frames to be kept in Chunks. Less these value, less will be latency." -!!! warning "All Chunks will be overwritten in this mode after every few Chunks _(equal to the sum of `-window_size` & `-extra_window_size` values)_, Hence Newer Chunks and Manifest contains NO information of any older video-frames." +!!! alert "After every few chunks _(equal to the sum of `-window_size` & `-extra_window_size` values)_, all chunks will be overwritten in Live-Streaming. Thereby, since newer chunks in manifest/playlist will contain NO information of any older ones, and therefore resultant DASH/HLS stream will play only the most recent frames." !!! note "If input video-source _(i.e. `-video_source`)_ contains any audio stream/channel, then it automatically gets mapped to all generated streams without any extra efforts." 
diff --git a/docs/gears/videogear/usage.md b/docs/gears/videogear/usage.md index b02541e34..3f41d1ab8 100644 --- a/docs/gears/videogear/usage.md +++ b/docs/gears/videogear/usage.md @@ -274,4 +274,10 @@ cv2.destroyAllWindows() stream.stop() ``` +  + +## Bonus Examples + +!!! example "Checkout more advanced VideoGear examples with unusual configuration [here ➶](../../../help/videogear_ex/)" +   \ No newline at end of file diff --git a/docs/gears/webgear/advanced.md b/docs/gears/webgear/advanced.md index 507f42fc5..c0a735366 100644 --- a/docs/gears/webgear/advanced.md +++ b/docs/gears/webgear/advanced.md @@ -108,7 +108,7 @@ async def my_frame_producer(): # do something with your OpenCV frame here # reducer frames size if you want more performance otherwise comment this line - frame = await reducer(frame, percentage=30, interpolation=cv2.INTER_LINEAR) # reduce frame by 30% + frame = await reducer(frame, percentage=30, interpolation=cv2.INTER_AREA) # reduce frame by 30% # handle JPEG encoding encodedImage = cv2.imencode(".jpg", frame)[1].tobytes() # yield frame in byte format @@ -314,75 +314,8 @@ WebGear gives us complete freedom of altering data files generated in [**Auto-Ge   -## Bonus Usage Examples +## Bonus Examples -Because of WebGear API's flexible internal wapper around [VideoGear](../../videogear/overview/), it can easily access any parameter of [CamGear](#camgear) and [PiGear](#pigear) videocapture APIs. - -!!! info "Following usage examples are just an idea of what can be done with WebGear API, you can try various [VideoGear](../../videogear/params/), [CamGear](../../camgear/params/) and [PiGear](../../pigear/params/) parameters directly in WebGear API in the similar manner." - -### Using WebGear with Pi Camera Module - -Here's a bare-minimum example of using WebGear API with the Raspberry Pi camera module while tweaking its various properties in just one-liner: - -```python -# import libs -import uvicorn -from vidgear.gears.asyncio import WebGear - -# various webgear performance and Raspberry Pi camera tweaks -options = { - "frame_size_reduction": 40, - "jpeg_compression_quality": 80, - "jpeg_compression_fastdct": True, - "jpeg_compression_fastupsample": False, - "hflip": True, - "exposure_mode": "auto", - "iso": 800, - "exposure_compensation": 15, - "awb_mode": "horizon", - "sensor_mode": 0, -} - -# initialize WebGear app -web = WebGear( - enablePiCamera=True, resolution=(640, 480), framerate=60, logging=True, **options -) - -# run this app on Uvicorn server at address http://localhost:8000/ -uvicorn.run(web(), host="localhost", port=8000) - -# close app safely -web.shutdown() -``` - -  - -### Using WebGear with real-time Video Stabilization enabled - -Here's an example of using WebGear API with real-time Video Stabilization enabled: - -```python -# import libs -import uvicorn -from vidgear.gears.asyncio import WebGear - -# various webgear performance tweaks -options = { - "frame_size_reduction": 40, - "jpeg_compression_quality": 80, - "jpeg_compression_fastdct": True, - "jpeg_compression_fastupsample": False, -} - -# initialize WebGear app with a raw source and enable video stabilization(`stabilize=True`) -web = WebGear(source="foo.mp4", stabilize=True, logging=True, **options) - -# run this app on Uvicorn server at address http://localhost:8000/ -uvicorn.run(web(), host="localhost", port=8000) - -# close app safely -web.shutdown() -``` +!!! 
example "Checkout more advanced WebGear examples with unusual configuration [here ➶](../../../help/webgear_ex/)"   - \ No newline at end of file diff --git a/docs/gears/webgear_rtc/advanced.md b/docs/gears/webgear_rtc/advanced.md index da0887954..1f4646d7a 100644 --- a/docs/gears/webgear_rtc/advanced.md +++ b/docs/gears/webgear_rtc/advanced.md @@ -326,69 +326,8 @@ WebGear_RTC gives us complete freedom of altering data files generated in [**Aut   -## Bonus Usage Examples +## Bonus Examples -Because of WebGear_RTC API's flexible internal wapper around [VideoGear](../../videogear/overview/), it can easily access any parameter of [CamGear](#camgear) and [PiGear](#pigear) videocapture APIs. +!!! example "Checkout more advanced WebGear_RTC examples with unusual configuration [here ➶](../../../help/webgear_rtc_ex/)" -!!! info "Following usage examples are just an idea of what can be done with WebGear_RTC API, you can try various [VideoGear](../../videogear/params/), [CamGear](../../camgear/params/) and [PiGear](../../pigear/params/) parameters directly in WebGear_RTC API in the similar manner." - -### Using WebGear_RTC with Pi Camera Module - -Here's a bare-minimum example of using WebGear_RTC API with the Raspberry Pi camera module while tweaking its various properties in just one-liner: - -```python -# import libs -import uvicorn -from vidgear.gears.asyncio import WebGear_RTC - -# various webgear_rtc performance and Raspberry Pi camera tweaks -options = { - "frame_size_reduction": 25, - "hflip": True, - "exposure_mode": "auto", - "iso": 800, - "exposure_compensation": 15, - "awb_mode": "horizon", - "sensor_mode": 0, -} - -# initialize WebGear_RTC app -web = WebGear_RTC( - enablePiCamera=True, resolution=(640, 480), framerate=60, logging=True, **options -) - -# run this app on Uvicorn server at address http://localhost:8000/ -uvicorn.run(web(), host="localhost", port=8000) - -# close app safely -web.shutdown() -``` - -  - -### Using WebGear_RTC with real-time Video Stabilization enabled - -Here's an example of using WebGear_RTC API with real-time Video Stabilization enabled: - -```python -# import libs -import uvicorn -from vidgear.gears.asyncio import WebGear_RTC - -# various webgear_rtc performance tweaks -options = { - "frame_size_reduction": 25, -} - -# initialize WebGear_RTC app with a raw source and enable video stabilization(`stabilize=True`) -web = WebGear_RTC(source="foo.mp4", stabilize=True, logging=True, **options) - -# run this app on Uvicorn server at address http://localhost:8000/ -uvicorn.run(web(), host="localhost", port=8000) - -# close app safely -web.shutdown() -``` - -  - \ No newline at end of file +  \ No newline at end of file diff --git a/docs/gears/writegear/compression/usage.md b/docs/gears/writegear/compression/usage.md index 97d1bd259..bc3b0c0ea 100644 --- a/docs/gears/writegear/compression/usage.md +++ b/docs/gears/writegear/compression/usage.md @@ -221,7 +221,7 @@ In Compression Mode, WriteGear also allows URL strings _(as output)_ for network In this example, we will stream live camera feed directly to Twitch: -!!! info "YouTube-Live Streaming example code also available in [WriteGear FAQs ➶](../../../../help/writegear_faqs/#is-youtube-live-streaming-possibe-with-writegear)" +!!! info "YouTube-Live Streaming example code also available in [WriteGear FAQs ➶](../../../../help/writegear_ex/#using-writegears-compression-mode-for-youtube-live-streaming)" !!! 
warning "This example assume you already have a [**Twitch Account**](https://www.twitch.tv/) for publishing video." @@ -576,7 +576,7 @@ In this example code, we will merging the audio from a Audio Device _(for e.g. W !!! fail "If audio still doesn't work then reach us out on [Gitter ➶](https://gitter.im/vidgear/community) Community channel" -!!! danger "Make sure this `-audio` audio-source it compatible with provided video-source, otherwise you encounter multiple errors or no output at all." +!!! danger "Make sure this `-i` audio-source it compatible with provided video-source, otherwise you encounter multiple errors or no output at all." !!! warning "You **MUST** use [`-input_framerate`](../../params/#a-exclusive-parameters) attribute to set exact value of input framerate when using external audio in Real-time Frames mode, otherwise audio delay will occur in output streams." diff --git a/docs/gears/writegear/introduction.md b/docs/gears/writegear/introduction.md index 2e5d7e609..0149e6376 100644 --- a/docs/gears/writegear/introduction.md +++ b/docs/gears/writegear/introduction.md @@ -75,4 +75,10 @@ from vidgear.gears import WriteGear See here 🚀 -  \ No newline at end of file +  + +## Bonus Examples + +!!! example "Checkout more advanced WriteGear examples with unusual configuration [here ➶](../../../help/writegear_ex/)" + +  \ No newline at end of file diff --git a/docs/help/camgear_ex.md b/docs/help/camgear_ex.md new file mode 100644 index 000000000..5a0522d3a --- /dev/null +++ b/docs/help/camgear_ex.md @@ -0,0 +1,243 @@ + + +# CamGear Examples + +  + +## Synchronizing Two Sources in CamGear + +In this example both streams and corresponding frames will be processed synchronously i.e. with no delay: + +!!! danger "Using same source with more than one instances of CamGear can lead to [Global Interpreter Lock (GIL)](https://wiki.python.org/moin/GlobalInterpreterLock#:~:text=In%20CPython%2C%20the%20global%20interpreter,conditions%20and%20ensures%20thread%20safety.&text=The%20GIL%20can%20degrade%20performance%20even%20when%20it%20is%20not%20a%20bottleneck.) that degrades performance even when it is not a bottleneck." 
+ +```python +# import required libraries +from vidgear.gears import CamGear +import cv2 +import time + +# define and start the stream on first source ( For e.g #0 index device) +stream1 = CamGear(source=0, logging=True).start() + +# define and start the stream on second source ( For e.g #1 index device) +stream2 = CamGear(source=1, logging=True).start() + +# infinite loop +while True: + + frameA = stream1.read() + # read frames from stream1 + + frameB = stream2.read() + # read frames from stream2 + + # check if any of two frame is None + if frameA is None or frameB is None: + #if True break the infinite loop + break + + # do something with both frameA and frameB here + cv2.imshow("Output Frame1", frameA) + cv2.imshow("Output Frame2", frameB) + # Show output window of stream1 and stream 2 separately + + key = cv2.waitKey(1) & 0xFF + # check for 'q' key-press + if key == ord("q"): + #if 'q' key-pressed break out + break + + if key == ord("w"): + #if 'w' key-pressed save both frameA and frameB at same time + cv2.imwrite("Image-1.jpg", frameA) + cv2.imwrite("Image-2.jpg", frameB) + #break #uncomment this line to break out after taking images + +cv2.destroyAllWindows() +# close output window + +# safely close both video streams +stream1.stop() +stream2.stop() +``` + +  + +## Using variable Youtube-DL parameters in CamGear + +CamGear provides exclusive attributes `STREAM_RESOLUTION` _(for specifying stream resolution)_ & `STREAM_PARAMS` _(for specifying underlying API(e.g. `youtube-dl`) parameters)_ with its [`options`](../../gears/camgear/params/#options) dictionary parameter. + +The complete usage example is as follows: + +!!! tip "More information on `STREAM_RESOLUTION` & `STREAM_PARAMS` attributes can be found [here ➶](../../gears/camgear/advanced/source_params/#exclusive-camgear-parameters)" + +```python +# import required libraries +from vidgear.gears import CamGear +import cv2 + +# specify attributes +options = {"STREAM_RESOLUTION": "720p", "STREAM_PARAMS": {"nocheckcertificate": True}} + +# Add YouTube Video URL as input source (for e.g https://youtu.be/bvetuLwJIkA) +# and enable Stream Mode (`stream_mode = True`) +stream = CamGear( + source="https://youtu.be/bvetuLwJIkA", stream_mode=True, logging=True, **options +).start() + +# loop over +while True: + + # read frames from stream + frame = stream.read() + + # check for frame if Nonetype + if frame is None: + break + + # {do something with the frame here} + + # Show output window + cv2.imshow("Output", frame) + + # check for 'q' key if pressed + key = cv2.waitKey(1) & 0xFF + if key == ord("q"): + break + +# close output window +cv2.destroyAllWindows() + +# safely close video stream +stream.stop() +``` + + +  + + +## Using CamGear for capturing RSTP/RTMP URLs + +You can open any network stream _(such as RTSP/RTMP)_ just by providing its URL directly to CamGear's [`source`](../../gears/camgear/params/#source) parameter. + +Here's a high-level wrapper code around CamGear API to enable auto-reconnection during capturing: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +??? tip "Enforcing UDP stream" + + You can easily enforce UDP for RSTP streams inplace of default TCP, by putting following lines of code on the top of your existing code: + + ```python + # import required libraries + import os + + # enforce UDP + os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;udp" + ``` + + Finally, use [`backend`](../../gears/camgear/params/#backend) parameter value as `backend=cv2.CAP_FFMPEG` in CamGear. 
+ + +```python +from vidgear.gears import CamGear +import cv2 +import datetime +import time + + +class Reconnecting_CamGear: + def __init__(self, cam_address, reset_attempts=50, reset_delay=5): + self.cam_address = cam_address + self.reset_attempts = reset_attempts + self.reset_delay = reset_delay + self.source = CamGear(source=self.cam_address).start() + self.running = True + + def read(self): + if self.source is None: + return None + if self.running and self.reset_attempts > 0: + frame = self.source.read() + if frame is None: + self.source.stop() + self.reset_attempts -= 1 + print( + "Re-connection Attempt-{} occured at time:{}".format( + str(self.reset_attempts), + datetime.datetime.now().strftime("%m-%d-%Y %I:%M:%S%p"), + ) + ) + time.sleep(self.reset_delay) + self.source = CamGear(source=self.cam_address).start() + # return previous frame + return self.frame + else: + self.frame = frame + return frame + else: + return None + + def stop(self): + self.running = False + self.reset_attempts = 0 + self.frame = None + if not self.source is None: + self.source.stop() + + +if __name__ == "__main__": + # open any valid video stream + stream = Reconnecting_CamGear( + cam_address="rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov", + reset_attempts=20, + reset_delay=5, + ) + + # loop over + while True: + + # read frames from stream + frame = stream.read() + + # check for frame if None-type + if frame is None: + break + + # {do something with the frame here} + + # Show output window + cv2.imshow("Output", frame) + + # check for 'q' key if pressed + key = cv2.waitKey(1) & 0xFF + if key == ord("q"): + break + + # close output window + cv2.destroyAllWindows() + + # safely close video stream + stream.stop() +``` + +  \ No newline at end of file diff --git a/docs/help/camgear_faqs.md b/docs/help/camgear_faqs.md index a55cdc848..a2b394105 100644 --- a/docs/help/camgear_faqs.md +++ b/docs/help/camgear_faqs.md @@ -74,48 +74,7 @@ limitations under the License. CamGear provides exclusive attributes `STREAM_RESOLUTION` _(for specifying stream resolution)_ & `STREAM_PARAMS` _(for specifying underlying API(e.g. `youtube-dl`) parameters)_ with its [`options`](../../gears/camgear/params/#options) dictionary parameter. The complete usage example is as follows: -!!! tip "More information on `STREAM_RESOLUTION` & `STREAM_PARAMS` attributes can be found [here ➶](../../gears/camgear/advanced/source_params/#exclusive-camgear-parameters)" - -```python -# import required libraries -from vidgear.gears import CamGear -import cv2 - -# specify attributes -options = {"STREAM_RESOLUTION": "720p", "STREAM_PARAMS": {"nocheckcertificate": True}} - -# Add YouTube Video URL as input source (for e.g https://youtu.be/bvetuLwJIkA) -# and enable Stream Mode (`stream_mode = True`) -stream = CamGear( - source="https://youtu.be/bvetuLwJIkA", stream_mode=True, logging=True, **options -).start() - -# loop over -while True: - - # read frames from stream - frame = stream.read() - - # check for frame if Nonetype - if frame is None: - break - - # {do something with the frame here} - - # Show output window - cv2.imshow("Output", frame) - - # check for 'q' key if pressed - key = cv2.waitKey(1) & 0xFF - if key == ord("q"): - break - -# close output window -cv2.destroyAllWindows() - -# safely close video stream -stream.stop() -``` +**Answer:** See [this bonus example ➶](../camgear_ex/#using-variable-youtube-dl-parameters-in-camgear).   
@@ -125,55 +84,7 @@ stream.stop() You can open any local network stream _(such as RTSP)_ just by providing its URL directly to CamGear's [`source`](../../gears/camgear/params/#source) parameter. The complete usage example is as follows: -??? tip "Enforcing UDP stream" - - You can easily enforce UDP for RSTP streams inplace of default TCP, by putting following lines of code on the top of your existing code: - - ```python - # import required libraries - import os - - # enforce UDP - os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;udp" - ``` - - Finally, use [`backend`](../../gears/camgear/params/#backend) parameter value as `backend=cv2.CAP_FFMPEG` in CamGear. - - -```python -# import required libraries -from vidgear.gears import CamGear -import cv2 - -# open valid network video-stream -stream = CamGear(source="rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov").start() - -# loop over -while True: - - # read frames from stream - frame = stream.read() - - # check for frame if Nonetype - if frame is None: - break - - # {do something with the frame here} - - # Show output window - cv2.imshow("Output", frame) - - # check for 'q' key if pressed - key = cv2.waitKey(1) & 0xFF - if key == ord("q"): - break - -# close output window -cv2.destroyAllWindows() - -# safely close video stream -stream.stop() -``` +**Answer:** See [this bonus example ➶](../camgear_ex/#using-camgear-for-capturing-rstprtmp-urls).   diff --git a/docs/help/get_help.md b/docs/help/get_help.md index 619e2d56a..b01b34706 100644 --- a/docs/help/get_help.md +++ b/docs/help/get_help.md @@ -37,7 +37,7 @@ There are several ways to get help with VidGear: > Got a question related to VidGear Working? -Checkout our Frequently Asked Questions, a curated list of all the questions with adequate answer that we commonly receive, for quickly troubleshooting your problems: +Checkout the Frequently Asked Questions, a curated list of all the questions with adequate answer that we commonly receive, for quickly troubleshooting your problems: - [General FAQs ➶](general_faqs.md) - [CamGear FAQs ➶](camgear_faqs.md) @@ -56,6 +56,26 @@ Checkout our Frequently Asked Questions, a curated list of all the questions wit   +## Bonus Examples + +> How we do this with that API? + +Checkout the Bonus Examples, a curated list of all advanced examples with unusual configuration, which isn't available in Vidgear API's usage examples: + +- [CamGear FAQs ➶](camgear_ex.md) +- [PiGear FAQs ➶](pigear_ex.md) +- [ScreenGear FAQs ➶](screengear_ex.md) +- [StreamGear FAQs ➶](streamgear_ex.md) +- [WriteGear FAQs ➶](writegear_ex.md) +- [NetGear FAQs ➶](netgear_ex.md) +- [WebGear FAQs ➶](webgear_ex.md) +- [WebGear_RTC FAQs ➶](webgear_rtc_ex.md) +- [VideoGear FAQs ➶](videogear_ex.md) +- [Stabilizer Class FAQs ➶](stabilizer_ex.md) +- [NetGear_Async FAQs ➶](netgear_async_ex.md) + +  + ## Join our Gitter Community channel > Have you come up with some new idea 💡 or looking for the fastest way troubleshoot your problems diff --git a/docs/help/netgear_async_ex.md b/docs/help/netgear_async_ex.md new file mode 100644 index 000000000..a46c17c7e --- /dev/null +++ b/docs/help/netgear_async_ex.md @@ -0,0 +1,169 @@ + + +# NetGear_Async Examples + +  + +## Using NetGear_Async with WebGear + +The complete usage example is as follows: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. 
+ +### Client + WebGear Server + +Open a terminal on Client System where you want to display the input frames _(and setup WebGear server)_ received from the Server and execute the following python code: + +!!! danger "After running this code, Make sure to open Browser immediately otherwise NetGear_Async will soon exit with `TimeoutError`. You can also try setting [`timeout`](../../gears/netgear_async/params/#timeout) parameter to a higher value to extend this timeout." + +!!! warning "Make sure you use different `port` value for NetGear_Async and WebGear API." + +!!! alert "High CPU utilization may occur on Client's end. User discretion is advised." + +!!! note "Note down the IP-address of this system _(required at Server's end)_ by executing the `hostname -I` command and also replace it in the following code."" + +```python +# import libraries +from vidgear.gears.asyncio import NetGear_Async +from vidgear.gears.asyncio import WebGear +from vidgear.gears.asyncio.helper import reducer +import uvicorn, asyncio, cv2 + +# Define NetGear_Async Client at given IP address and define parameters +# !!! change following IP address '192.168.x.xxx' with yours !!! +client = NetGear_Async( + receive_mode=True, + pattern=1, + logging=True, +).launch() + +# create your own custom frame producer +async def my_frame_producer(): + + # loop over Client's Asynchronous Frame Generator + async for frame in client.recv_generator(): + + # {do something with received frames here} + + # reducer frames size if you want more performance otherwise comment this line + frame = await reducer( + frame, percentage=30, interpolation=cv2.INTER_AREA + ) # reduce frame by 30% + + # handle JPEG encoding + encodedImage = cv2.imencode(".jpg", frame)[1].tobytes() + # yield frame in byte format + yield (b"--frame\r\nContent-Type:image/jpeg\r\n\r\n" + encodedImage + b"\r\n") + await asyncio.sleep(0) + + +if __name__ == "__main__": + # Set event loop to client's + asyncio.set_event_loop(client.loop) + + # initialize WebGear app without any source + web = WebGear(logging=True) + + # add your custom frame producer to config with adequate IP address + web.config["generator"] = my_frame_producer + + # run this app on Uvicorn server at address http://localhost:8000/ + uvicorn.run(web(), host="localhost", port=8000) + + # safely close client + client.close() + + # close app safely + web.shutdown() +``` + +!!! success "On successfully running this code, the output stream will be displayed at address http://localhost:8000/ in your Client's Browser." + +### Server + +Now, Open the terminal on another Server System _(with a webcam connected to it at index 0)_, and execute the following python code: + +!!! note "Replace the IP address in the following code with Client's IP address you noted earlier." + +```python +# import library +from vidgear.gears.asyncio import NetGear_Async +import cv2, asyncio + +# initialize Server without any source +server = NetGear_Async( + source=None, + address="192.168.x.xxx", + port="5454", + protocol="tcp", + pattern=1, + logging=True, +) + +# Create a async frame generator as custom source +async def my_frame_generator(): + + # !!! define your own video source here !!! + # Open any video stream such as live webcam + # video stream on first index(i.e. 
0) device + stream = cv2.VideoCapture(0) + + # loop over stream until its terminated + while True: + + # read frames + (grabbed, frame) = stream.read() + + # check if frame empty + if not grabbed: + break + + # do something with the frame to be sent here + + # yield frame + yield frame + # sleep for sometime + await asyncio.sleep(0) + + # close stream + stream.release() + + +if __name__ == "__main__": + # set event loop + asyncio.set_event_loop(server.loop) + # Add your custom source generator to Server configuration + server.config["generator"] = my_frame_generator() + # Launch the Server + server.launch() + try: + # run your main function task until it is complete + server.loop.run_until_complete(server.task) + except (KeyboardInterrupt, SystemExit): + # wait for interrupts + pass + finally: + # finally close the server + server.close() +``` + +  diff --git a/docs/help/netgear_ex.md b/docs/help/netgear_ex.md new file mode 100644 index 000000000..ef43baaa8 --- /dev/null +++ b/docs/help/netgear_ex.md @@ -0,0 +1,368 @@ + + +# NetGear Examples + +  + +## Using NetGear with WebGear + +The complete usage example is as follows: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +### Client + WebGear Server + +Open a terminal on Client System where you want to display the input frames _(and setup WebGear server)_ received from the Server and execute the following python code: + +!!! danger "After running this code, Make sure to open Browser immediately otherwise NetGear will soon exit with `RuntimeError`. You can also try setting [`max_retries`](../../gears/netgear/params/#options) and [`request_timeout`](../../gears/netgear/params/#options) like attributes to a higher value to avoid this." + +!!! warning "Make sure you use different `port` value for NetGear and WebGear API." + +!!! alert "High CPU utilization may occur on Client's end. User discretion is advised." + +!!! note "Note down the IP-address of this system _(required at Server's end)_ by executing the `hostname -I` command and also replace it in the following code."" + +```python +# import necessary libs +import uvicorn, asyncio, cv2 +from vidgear.gears.asyncio import WebGear +from vidgear.gears.asyncio.helper import reducer + +# initialize WebGear app without any source +web = WebGear(logging=True) + + +# activate jpeg encoding and specify other related parameters +options = { + "jpeg_compression": True, + "jpeg_compression_quality": 90, + "jpeg_compression_fastdct": True, + "jpeg_compression_fastupsample": True, +} + +# create your own custom frame producer +async def my_frame_producer(): + # initialize global params + # Define NetGear Client at given IP address and define parameters + # !!! change following IP address '192.168.x.xxx' with yours !!! 
+ client = NetGear( + receive_mode=True, + address="192.168.x.xxx", + port="5454", + protocol="tcp", + pattern=1, + logging=True, + **options, + ) + + # loop over frames + while True: + # receive frames from network + frame = self.client.recv() + + # if NoneType + if frame is None: + return None + + # do something with your OpenCV frame here + + # reducer frames size if you want more performance otherwise comment this line + frame = await reducer( + frame, percentage=30, interpolation=cv2.INTER_AREA + ) # reduce frame by 30% + + # handle JPEG encoding + encodedImage = cv2.imencode(".jpg", frame)[1].tobytes() + # yield frame in byte format + yield (b"--frame\r\nContent-Type:image/jpeg\r\n\r\n" + encodedImage + b"\r\n") + await asyncio.sleep(0) + # close stream + client.close() + + +# add your custom frame producer to config with adequate IP address +web.config["generator"] = my_frame_producer + +# run this app on Uvicorn server at address http://localhost:8000/ +uvicorn.run(web(), host="localhost", port=8000) + +# close app safely +web.shutdown() +``` + +!!! success "On successfully running this code, the output stream will be displayed at address http://localhost:8000/ in your Client's Browser." + + +### Server + +Now, Open the terminal on another Server System _(with a webcam connected to it at index 0)_, and execute the following python code: + +!!! note "Replace the IP address in the following code with Client's IP address you noted earlier." + +```python +# import required libraries +from vidgear.gears import VideoGear +from vidgear.gears import NetGear +import cv2 + +# activate jpeg encoding and specify other related parameters +options = { + "jpeg_compression": True, + "jpeg_compression_quality": 90, + "jpeg_compression_fastdct": True, + "jpeg_compression_fastupsample": True, +} + +# Open live video stream on webcam at first index(i.e. 0) device +stream = VideoGear(source=0).start() + +# Define NetGear server at given IP address and define parameters +# !!! change following IP address '192.168.x.xxx' with client's IP address !!! +server = NetGear( + address="192.168.x.xxx", + port="5454", + protocol="tcp", + pattern=1, + logging=True, + **options +) + +# loop over until KeyBoard Interrupted +while True: + + try: + # read frames from stream + frame = stream.read() + + # check for frame if None-type + if frame is None: + break + + # {do something with the frame here} + + # send frame to server + server.send(frame) + + except KeyboardInterrupt: + break + +# safely close video stream +stream.stop() + +# safely close server +server.close() +``` + +  + +## Using NetGear with WebGear_RTC + +The complete usage example is as follows: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +### Client + WebGear_RTC Server + +Open a terminal on Client System where you want to display the input frames _(and setup WebGear_RTC server)_ received from the Server and execute the following python code: + +!!! danger "After running this code, Make sure to open Browser immediately otherwise NetGear will soon exit with `RuntimeError`. You can also try setting [`max_retries`](../../gears/netgear/params/#options) and [`request_timeout`](../../gears/netgear/params/#options) like attributes to a higher value to avoid this." + +!!! warning "Make sure you use different `port` value for NetGear and WebGear_RTC API." + +!!! alert "High CPU utilization may occur on Client's end. User discretion is advised." + +!!! 
note "Note down the IP-address of this system _required at Server's end)_ by executing the `hostname -I` command and also replace it in the following code."" + +```python +# import required libraries +import uvicorn, asyncio, cv2 +from av import VideoFrame +from aiortc import VideoStreamTrack +from aiortc.mediastreams import MediaStreamError +from vidgear.gears import NetGear +from vidgear.gears.asyncio import WebGear_RTC +from vidgear.gears.asyncio.helper import reducer + +# initialize WebGear_RTC app without any source +web = WebGear_RTC(logging=True) + +# activate jpeg encoding and specify other related parameters +options = { + "jpeg_compression": True, + "jpeg_compression_quality": 90, + "jpeg_compression_fastdct": True, + "jpeg_compression_fastupsample": True, +} + + +# create your own Bare-Minimum Custom Media Server +class Custom_RTCServer(VideoStreamTrack): + """ + Custom Media Server using OpenCV, an inherit-class + to aiortc's VideoStreamTrack. + """ + + def __init__( + self, + address=None, + port="5454", + protocol="tcp", + pattern=1, + logging=True, + options={}, + ): + # don't forget this line! + super().__init__() + + # initialize global params + # Define NetGear Client at given IP address and define parameters + self.client = NetGear( + receive_mode=True, + address=address, + port=protocol, + pattern=pattern, + receive_mode=True, + logging=logging, + **options + ) + + async def recv(self): + """ + A coroutine function that yields `av.frame.Frame`. + """ + # don't forget this function!!! + + # get next timestamp + pts, time_base = await self.next_timestamp() + + # receive frames from network + frame = self.client.recv() + + # if NoneType + if frame is None: + raise MediaStreamError + + # reducer frames size if you want more performance otherwise comment this line + frame = await reducer(frame, percentage=30) # reduce frame by 30% + + # contruct `av.frame.Frame` from `numpy.nd.array` + av_frame = VideoFrame.from_ndarray(frame, format="bgr24") + av_frame.pts = pts + av_frame.time_base = time_base + + # return `av.frame.Frame` + return av_frame + + def terminate(self): + """ + Gracefully terminates VideoGear stream + """ + # don't forget this function!!! + + # terminate + if not (self.client is None): + self.client.close() + self.client = None + + +# assign your custom media server to config with adequate IP address +# !!! change following IP address '192.168.x.xxx' with yours !!! +web.config["server"] = Custom_RTCServer( + address="192.168.x.xxx", + port="5454", + protocol="tcp", + pattern=1, + logging=True, + **options +) + +# run this app on Uvicorn server at address http://localhost:8000/ +uvicorn.run(web(), host="localhost", port=8000) + +# close app safely +web.shutdown() +``` + +!!! success "On successfully running this code, the output stream will be displayed at address http://localhost:8000/ in your Client's Browser." + +### Server + +Now, Open the terminal on another Server System _(with a webcam connected to it at index 0)_, and execute the following python code: + +!!! note "Replace the IP address in the following code with Client's IP address you noted earlier." + +```python +# import required libraries +from vidgear.gears import VideoGear +from vidgear.gears import NetGear +import cv2 + +# activate jpeg encoding and specify other related parameters +options = { + "jpeg_compression": True, + "jpeg_compression_quality": 90, + "jpeg_compression_fastdct": True, + "jpeg_compression_fastupsample": True, +} + +# Open live video stream on webcam at first index(i.e. 
0) device +stream = VideoGear(source=0).start() + +# Define NetGear server at given IP address and define parameters +# !!! change following IP address '192.168.x.xxx' with client's IP address !!! +server = NetGear( + address="192.168.x.xxx", + port="5454", + protocol="tcp", + pattern=1, + logging=True, + **options +) + +# loop over until KeyBoard Interrupted +while True: + + try: + # read frames from stream + frame = stream.read() + + # check for frame if Nonetype + if frame is None: + break + + # {do something with the frame here} + + # send frame to server + server.send(frame) + + except KeyboardInterrupt: + break + +# safely close video stream +stream.stop() + +# safely close server +server.close() +``` + +  \ No newline at end of file diff --git a/docs/help/pigear_ex.md b/docs/help/pigear_ex.md new file mode 100644 index 000000000..03d86f63e --- /dev/null +++ b/docs/help/pigear_ex.md @@ -0,0 +1,75 @@ + + +# PiGear Examples + +  + +## Setting variable `picamera` parameters for Camera Module at runtime + +You can use `stream` global parameter in PiGear to feed any [`picamera`](https://picamera.readthedocs.io/en/release-1.10/api_camera.html) parameters at runtime. + +In this example we will set initial Camera Module's `brightness` value `80`, and will change it `50` when **`z` key** is pressed at runtime: + +```python +# import required libraries +from vidgear.gears import PiGear +import cv2 + +# initial parameters +options = {"brightness": 80} # set brightness to 80 + +# open pi video stream with default parameters +stream = PiGear(logging=True, **options).start() + +# loop over +while True: + + # read frames from stream + frame = stream.read() + + # check for frame if Nonetype + if frame is None: + break + + + # {do something with the frame here} + + + # Show output window + cv2.imshow("Output Frame", frame) + + # check for 'q' key if pressed + key = cv2.waitKey(1) & 0xFF + if key == ord("q"): + break + # check for 'z' key if pressed + if key == ord("z"): + # change brightness to 50 + stream.stream.brightness = 50 + +# close output window +cv2.destroyAllWindows() + +# safely close video stream +stream.stop() +``` + +  \ No newline at end of file diff --git a/docs/help/pigear_faqs.md b/docs/help/pigear_faqs.md index 41725348e..30661a518 100644 --- a/docs/help/pigear_faqs.md +++ b/docs/help/pigear_faqs.md @@ -67,53 +67,6 @@ limitations under the License. ## How to change `picamera` settings for Camera Module at runtime? -**Answer:** You can use `stream` global parameter in PiGear to feed any `picamera` setting at runtime. See following sample usage example: - -!!! info "" - In this example we will set initial Camera Module's `brightness` value `80`, and will change it `50` when **`z` key** is pressed at runtime. 
- -```python -# import required libraries -from vidgear.gears import PiGear -import cv2 - -# initial parameters -options = {"brightness": 80} # set brightness to 80 - -# open pi video stream with default parameters -stream = PiGear(logging=True, **options).start() - -# loop over -while True: - - # read frames from stream - frame = stream.read() - - # check for frame if Nonetype - if frame is None: - break - - - # {do something with the frame here} - - - # Show output window - cv2.imshow("Output Frame", frame) - - # check for 'q' key if pressed - key = cv2.waitKey(1) & 0xFF - if key == ord("q"): - break - # check for 'z' key if pressed - if key == ord("z"): - # change brightness to 50 - stream.stream.brightness = 50 - -# close output window -cv2.destroyAllWindows() - -# safely close video stream -stream.stop() -``` +**Answer:** You can use `stream` global parameter in PiGear to feed any `picamera` setting at runtime. See [this bonus example ➶](../pigear_ex/#setting-variable-picamera-parameters-for-camera-module-at-runtime).   \ No newline at end of file diff --git a/docs/help/screengear_ex.md b/docs/help/screengear_ex.md new file mode 100644 index 000000000..80463ee11 --- /dev/null +++ b/docs/help/screengear_ex.md @@ -0,0 +1,149 @@ + + +# ScreenGear Examples + +  + +## Using ScreenGear with NetGear and WriteGear + +The complete usage example is as follows: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +### Client + WriteGear + +Open a terminal on Client System _(where you want to save the input frames received from the Server)_ and execute the following python code: + +!!! info "Note down the IP-address of this system(required at Server's end) by executing the command: `hostname -I` and also replace it in the following code." + +!!! tip "You can terminate client anytime by pressing ++ctrl+"C"++ on your keyboard!" + +```python +# import required libraries +from vidgear.gears import NetGear +from vidgear.gears import WriteGear +import cv2 + +# define various tweak flags +options = {"flag": 0, "copy": False, "track": False} + +# Define Netgear Client at given IP address and define parameters +# !!! change following IP address '192.168.x.xxx' with yours !!! +client = NetGear( + address="192.168.x.xxx", + port="5454", + protocol="tcp", + pattern=1, + receive_mode=True, + logging=True, + **options +) + +# Define writer with default parameters and suitable output filename for e.g. `Output.mp4` +writer = WriteGear(output_filename="Output.mp4") + +# loop over +while True: + + # receive frames from network + frame = client.recv() + + # check for received frame if Nonetype + if frame is None: + break + + # {do something with the frame here} + + # write frame to writer + writer.write(frame) + +# close output window +cv2.destroyAllWindows() + +# safely close client +client.close() + +# safely close writer +writer.close() +``` + +### Server + ScreenGear + +Now, Open the terminal on another Server System _(with a montior/display attached to it)_, and execute the following python code: + +!!! info "Replace the IP address in the following code with Client's IP address you noted earlier." + +!!! tip "You can terminate stream on both side anytime by pressing ++ctrl+"C"++ on your keyboard!" 
+ +```python +# import required libraries +from vidgear.gears import VideoGear +from vidgear.gears import NetGear + +# define dimensions of screen w.r.t to given monitor to be captured +options = {"top": 40, "left": 0, "width": 100, "height": 100} + +# open stream with defined parameters +stream = ScreenGear(logging=True, **options).start() + +# define various netgear tweak flags +options = {"flag": 0, "copy": False, "track": False} + +# Define Netgear server at given IP address and define parameters +# !!! change following IP address '192.168.x.xxx' with client's IP address !!! +server = NetGear( + address="192.168.x.xxx", + port="5454", + protocol="tcp", + pattern=1, + logging=True, + **options +) + +# loop over until KeyBoard Interrupted +while True: + + try: + # read frames from stream + frame = stream.read() + + # check for frame if Nonetype + if frame is None: + break + + # {do something with the frame here} + + # send frame to server + server.send(frame) + + except KeyboardInterrupt: + break + +# safely close video stream +stream.stop() + +# safely close server +server.close() +``` + +  + diff --git a/docs/help/stabilizer_ex.md b/docs/help/stabilizer_ex.md new file mode 100644 index 000000000..8b8636265 --- /dev/null +++ b/docs/help/stabilizer_ex.md @@ -0,0 +1,236 @@ + + +# Stabilizer Class Examples + +  + +## Saving Stabilizer Class output with Live Audio Input + +In this example code, we will merging the audio from a Audio Device _(for e.g. Webcam inbuilt mic input)_ with Stablized frames incoming from the Stabilizer Class _(which is also using same Webcam video input through OpenCV)_, and save the final output as a compressed video file, all in real time: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +!!! alert "Example Assumptions" + + * You're running are Linux machine. + * You already have appropriate audio driver and software installed on your machine. + + +??? tip "Identifying and Specifying sound card on different OS platforms" + + === "On Windows" + + Windows OS users can use the [dshow](https://trac.ffmpeg.org/wiki/DirectShow) (DirectShow) to list audio input device which is the preferred option for Windows users. You can refer following steps to identify and specify your sound card: + + - [x] **[OPTIONAL] Enable sound card(if disabled):** First enable your Stereo Mix by opening the "Sound" window and select the "Recording" tab, then right click on the window and select "Show Disabled Devices" to toggle the Stereo Mix device visibility. **Follow this [post ➶](https://forums.tomshardware.com/threads/no-sound-through-stereo-mix-realtek-hd-audio.1716182/) for more details.** + + - [x] **Identify Sound Card:** Then, You can locate your soundcard using `dshow` as follows: + + ```sh + c:\> ffmpeg -list_devices true -f dshow -i dummy + ffmpeg version N-45279-g6b86dd5... --enable-runtime-cpudetect + libavutil 51. 74.100 / 51. 74.100 + libavcodec 54. 65.100 / 54. 65.100 + libavformat 54. 31.100 / 54. 31.100 + libavdevice 54. 3.100 / 54. 3.100 + libavfilter 3. 19.102 / 3. 19.102 + libswscale 2. 1.101 / 2. 1.101 + libswresample 0. 16.100 / 0. 
16.100 + [dshow @ 03ACF580] DirectShow video devices + [dshow @ 03ACF580] "Integrated Camera" + [dshow @ 03ACF580] "USB2.0 Camera" + [dshow @ 03ACF580] DirectShow audio devices + [dshow @ 03ACF580] "Microphone (Realtek High Definition Audio)" + [dshow @ 03ACF580] "Microphone (USB2.0 Camera)" + dummy: Immediate exit requested + ``` + + + - [x] **Specify Sound Card:** Then, you can specify your located soundcard in StreamGear as follows: + + ```python + # assign appropriate input audio-source + output_params = { + "-i":"audio=Microphone (USB2.0 Camera)", + "-thread_queue_size": "512", + "-f": "dshow", + "-ac": "2", + "-acodec": "aac", + "-ar": "44100", + } + ``` + + !!! fail "If audio still doesn't work then [checkout this troubleshooting guide ➶](https://www.maketecheasier.com/fix-microphone-not-working-windows10/) or reach us out on [Gitter ➶](https://gitter.im/vidgear/community) Community channel" + + + === "On Linux" + + Linux OS users can use the [alsa](https://ffmpeg.org/ffmpeg-all.html#alsa) to list input device to capture live audio input such as from a webcam. You can refer following steps to identify and specify your sound card: + + - [x] **Identify Sound Card:** To get the list of all installed cards on your machine, you can type `arecord -l` or `arecord -L` _(longer output)_. + + ```sh + arecord -l + + **** List of CAPTURE Hardware Devices **** + card 0: ICH5 [Intel ICH5], device 0: Intel ICH [Intel ICH5] + Subdevices: 1/1 + Subdevice #0: subdevice #0 + card 0: ICH5 [Intel ICH5], device 1: Intel ICH - MIC ADC [Intel ICH5 - MIC ADC] + Subdevices: 1/1 + Subdevice #0: subdevice #0 + card 0: ICH5 [Intel ICH5], device 2: Intel ICH - MIC2 ADC [Intel ICH5 - MIC2 ADC] + Subdevices: 1/1 + Subdevice #0: subdevice #0 + card 0: ICH5 [Intel ICH5], device 3: Intel ICH - ADC2 [Intel ICH5 - ADC2] + Subdevices: 1/1 + Subdevice #0: subdevice #0 + card 1: U0x46d0x809 [USB Device 0x46d:0x809], device 0: USB Audio [USB Audio] + Subdevices: 1/1 + Subdevice #0: subdevice #0 + ``` + + + - [x] **Specify Sound Card:** Then, you can specify your located soundcard in WriteGear as follows: + + !!! info "The easiest thing to do is to reference sound card directly, namely "card 0" (Intel ICH5) and "card 1" (Microphone on the USB web cam), as `hw:0` or `hw:1`" + + ```python + # assign appropriate input audio-source + output_params = { + "-i": "hw:1", + "-thread_queue_size": "512", + "-f": "alsa", + "-ac": "2", + "-acodec": "aac", + "-ar": "44100", + } + ``` + + !!! fail "If audio still doesn't work then reach us out on [Gitter ➶](https://gitter.im/vidgear/community) Community channel" + + + === "On MacOS" + + MAC OS users can use the [avfoundation](https://ffmpeg.org/ffmpeg-devices.html#avfoundation) to list input devices for grabbing audio from integrated iSight cameras as well as cameras connected via USB or FireWire. You can refer following steps to identify and specify your sound card on MacOS/OSX machines: + + + - [x] **Identify Sound Card:** Then, You can locate your soundcard using `avfoundation` as follows: + + ```sh + ffmpeg -f qtkit -list_devices true -i "" + ffmpeg version N-45279-g6b86dd5... --enable-runtime-cpudetect + libavutil 51. 74.100 / 51. 74.100 + libavcodec 54. 65.100 / 54. 65.100 + libavformat 54. 31.100 / 54. 31.100 + libavdevice 54. 3.100 / 54. 3.100 + libavfilter 3. 19.102 / 3. 19.102 + libswscale 2. 1.101 / 2. 1.101 + libswresample 0. 16.100 / 0. 
16.100 + [AVFoundation input device @ 0x7f8e2540ef20] AVFoundation video devices: + [AVFoundation input device @ 0x7f8e2540ef20] [0] FaceTime HD camera (built-in) + [AVFoundation input device @ 0x7f8e2540ef20] [1] Capture screen 0 + [AVFoundation input device @ 0x7f8e2540ef20] AVFoundation audio devices: + [AVFoundation input device @ 0x7f8e2540ef20] [0] Blackmagic Audio + [AVFoundation input device @ 0x7f8e2540ef20] [1] Built-in Microphone + ``` + + + - [x] **Specify Sound Card:** Then, you can specify your located soundcard in StreamGear as follows: + + ```python + # assign appropriate input audio-source + output_params = { + "-audio_device_index": "0", + "-thread_queue_size": "512", + "-f": "avfoundation", + "-ac": "2", + "-acodec": "aac", + "-ar": "44100", + } + ``` + + !!! fail "If audio still doesn't work then reach us out on [Gitter ➶](https://gitter.im/vidgear/community) Community channel" + + +!!! danger "Make sure this `-i` audio-source it compatible with provided video-source, otherwise you encounter multiple errors or no output at all." + +!!! warning "You **MUST** use [`-input_framerate`](../../gears/writegear/compression/params/#a-exclusive-parameters) attribute to set exact value of input framerate when using external audio in Real-time Frames mode, otherwise audio delay will occur in output streams." + +```python +# import required libraries +from vidgear.gears import WriteGear +from vidgear.gears.stabilizer import Stabilizer +import cv2 + +# Open suitable video stream, such as webcam on first index(i.e. 0) +stream = cv2.VideoCapture(0) + +# initiate stabilizer object with defined parameters +stab = Stabilizer(smoothing_radius=30, crop_n_zoom=True, border_size=5, logging=True) + +# change with your webcam soundcard, plus add additional required FFmpeg parameters for your writer +output_params = { + "-thread_queue_size": "512", + "-f": "alsa", + "-ac": "1", + "-ar": "48000", + "-i": "plughw:CARD=CAMERA,DEV=0", +} + +# Define writer with defined parameters and suitable output filename for e.g. `Output.mp4 +writer = WriteGear(output_filename="Output.mp4", logging=True, **output_params) + +# loop over +while True: + + # read frames from stream + (grabbed, frame) = stream.read() + + # check for frame if not grabbed + if not grabbed: + break + + # send current frame to stabilizer for processing + stabilized_frame = stab.stabilize(frame) + + # wait for stabilizer which still be initializing + if stabilized_frame is None: + continue + + # {do something with the stabilized frame here} + + # write stabilized frame to writer + writer.write(stabilized_frame) + + +# clear stabilizer resources +stab.clean() + +# safely close video stream +stream.release() + +# safely close writer +writer.close() +``` + +  \ No newline at end of file diff --git a/docs/help/streamgear_ex.md b/docs/help/streamgear_ex.md new file mode 100644 index 000000000..d8a83db14 --- /dev/null +++ b/docs/help/streamgear_ex.md @@ -0,0 +1,161 @@ + + +# StreamGear Examples + +  + +## StreamGear Live-Streaming Usage with PiGear + +In this example, we will be Live-Streaming video-frames from Raspberry Pi _(with Camera Module connected)_ using PiGear API and StreamGear API's Real-time Frames Mode: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +!!! tip "Use `-window_size` & `-extra_window_size` FFmpeg parameters for controlling number of frames to be kept in Chunks. Less these value, less will be latency." + +!!! 
alert "After every few chunks _(equal to the sum of `-window_size` & `-extra_window_size` values)_, all chunks will be overwritten in Live-Streaming. Thereby, since newer chunks in manifest/playlist will contain NO information of any older ones, and therefore resultant DASH/HLS stream will play only the most recent frames." + +!!! note "In this mode, StreamGear **DOES NOT** automatically maps video-source audio to generated streams. You need to manually assign separate audio-source through [`-audio`](../../gears/streamgear/params/#a-exclusive-parameters) attribute of `stream_params` dictionary parameter." + +=== "DASH" + + ```python + # import required libraries + from vidgear.gears import PiGear + from vidgear.gears import StreamGear + import cv2 + + # add various Picamera tweak parameters to dictionary + options = { + "hflip": True, + "exposure_mode": "auto", + "iso": 800, + "exposure_compensation": 15, + "awb_mode": "horizon", + "sensor_mode": 0, + } + + # open pi video stream with defined parameters + stream = PiGear(resolution=(640, 480), framerate=60, logging=True, **options).start() + + # enable livestreaming and retrieve framerate from CamGear Stream and + # pass it as `-input_framerate` parameter for controlled framerate + stream_params = {"-input_framerate": stream.framerate, "-livestream": True} + + # describe a suitable manifest-file location/name + streamer = StreamGear(output="dash_out.mpd", **stream_params) + + # loop over + while True: + + # read frames from stream + frame = stream.read() + + # check for frame if Nonetype + if frame is None: + break + + # {do something with the frame here} + + # send frame to streamer + streamer.stream(frame) + + # Show output window + cv2.imshow("Output Frame", frame) + + # check for 'q' key if pressed + key = cv2.waitKey(1) & 0xFF + if key == ord("q"): + break + + # close output window + cv2.destroyAllWindows() + + # safely close video stream + stream.stop() + + # safely close streamer + streamer.terminate() + ``` + +=== "HLS" + + ```python + # import required libraries + from vidgear.gears import PiGear + from vidgear.gears import StreamGear + import cv2 + + # add various Picamera tweak parameters to dictionary + options = { + "hflip": True, + "exposure_mode": "auto", + "iso": 800, + "exposure_compensation": 15, + "awb_mode": "horizon", + "sensor_mode": 0, + } + + # open pi video stream with defined parameters + stream = PiGear(resolution=(640, 480), framerate=60, logging=True, **options).start() + + # enable livestreaming and retrieve framerate from CamGear Stream and + # pass it as `-input_framerate` parameter for controlled framerate + stream_params = {"-input_framerate": stream.framerate, "-livestream": True} + + # describe a suitable manifest-file location/name + streamer = StreamGear(output="hls_out.m3u8", format = "hls", **stream_params) + + # loop over + while True: + + # read frames from stream + frame = stream.read() + + # check for frame if Nonetype + if frame is None: + break + + # {do something with the frame here} + + # send frame to streamer + streamer.stream(frame) + + # Show output window + cv2.imshow("Output Frame", frame) + + # check for 'q' key if pressed + key = cv2.waitKey(1) & 0xFF + if key == ord("q"): + break + + # close output window + cv2.destroyAllWindows() + + # safely close video stream + stream.stop() + + # safely close streamer + streamer.terminate() + ``` + + +  \ No newline at end of file diff --git a/docs/help/videogear_ex.md b/docs/help/videogear_ex.md new file mode 100644 index 000000000..de8a92053 
--- /dev/null +++ b/docs/help/videogear_ex.md @@ -0,0 +1,220 @@ + + +# VideoGear Examples + +  + +## Using VideoGear with ROS(Robot Operating System) + +We will be using [`cv_bridge`](http://wiki.ros.org/cv_bridge/Tutorials/ConvertingBetweenROSImagesAndOpenCVImagesPython) to convert OpenCV frames to ROS image messages and vice-versa. + +In this example, we'll create a node that convert OpenCV frames into ROS image messages, and then publishes them over ROS. + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +!!! note "This example is vidgear implementation of this [wiki example](http://wiki.ros.org/cv_bridge/Tutorials/ConvertingBetweenROSImagesAndOpenCVImagesPython)." + +```python +# import roslib +import roslib + +roslib.load_manifest("my_package") + +# import other required libraries +import sys +import rospy +import cv2 +from std_msgs.msg import String +from sensor_msgs.msg import Image +from cv_bridge import CvBridge, CvBridgeError +from vidgear.gears import VideoGear + +# custom publisher class +class image_publisher: + def __init__(self, source=0, logging=False): + # create CV bridge + self.bridge = CvBridge() + # define publisher topic + self.image_pub = rospy.Publisher("image_topic_pub", Image) + # open stream with given parameters + self.stream_stab = VideoGear(source=source, logging=logging).start() + # define publisher topic + rospy.Subscriber("image_topic_sub", Image, self.callback) + + def callback(self, data): + + # {do something with received ROS node data here} + + # read stabilized frames + frame = self.stream.read() + # check for stabilized frame if None-type + if not (frame is None): + + # {do something with the frame here} + + # publish our frame + try: + self.image_pub.publish(self.bridge.cv2_to_imgmsg(frame, "bgr8")) + except CvBridgeError as e: + # catch any errors + print(e) + + def close(self): + # stop stream + self.stream_stab.stop() + + +def main(args): + # !!! define your own video source here !!! + # Open any video stream such as live webcam + # video stream on first index(i.e. 0) device + + # define publisher + ic = image_publisher(source=0, logging=True) + # initiate ROS node on publisher + rospy.init_node("image_publisher", anonymous=True) + try: + # run node + rospy.spin() + except KeyboardInterrupt: + print("Shutting down") + finally: + # close publisher + ic.close() + + +if __name__ == "__main__": + main(sys.argv) +``` + +  + +## Using VideoGear for capturing RSTP/RTMP URLs + +Here's a high-level wrapper code around VideoGear API to enable auto-reconnection during capturing, plus stabilization is enabled _(`stabilize=True`)_ in order to stabilize captured frames on-the-go: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +??? tip "Enforcing UDP stream" + + You can easily enforce UDP for RSTP streams inplace of default TCP, by putting following lines of code on the top of your existing code: + + ```python + # import required libraries + import os + + # enforce UDP + os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;udp" + ``` + + Finally, use [`backend`](../../gears/videogear/params/#backend) parameter value as `backend=cv2.CAP_FFMPEG` in VideoGear. 
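+
+    As a minimal illustration _(the RTSP URL below is only a placeholder)_, the `backend` parameter might be passed like this:
+
+    ```python
+    # import required libraries
+    from vidgear.gears import VideoGear
+    import cv2
+
+    # enforce FFmpeg backend so the UDP transport option above takes effect
+    stream = VideoGear(source="rtsp://example.com:554/stream", backend=cv2.CAP_FFMPEG).start()
+
+    # ...read frames as usual, then safely stop the stream
+    stream.stop()
+    ```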
+ + +```python +from vidgear.gears import VideoGear +import cv2 +import datetime +import time + + +class Reconnecting_VideoGear: + def __init__(self, cam_address, stabilize=False, reset_attempts=50, reset_delay=5): + self.cam_address = cam_address + self.stabilize = stabilize + self.reset_attempts = reset_attempts + self.reset_delay = reset_delay + self.source = VideoGear( + source=self.cam_address, stabilize=self.stabilize + ).start() + self.running = True + + def read(self): + if self.source is None: + return None + if self.running and self.reset_attempts > 0: + frame = self.source.read() + if frame is None: + self.source.stop() + self.reset_attempts -= 1 + print( + "Re-connection Attempt-{} occured at time:{}".format( + str(self.reset_attempts), + datetime.datetime.now().strftime("%m-%d-%Y %I:%M:%S%p"), + ) + ) + time.sleep(self.reset_delay) + self.source = VideoGear( + source=self.cam_address, stabilize=self.stabilize + ).start() + # return previous frame + return self.frame + else: + self.frame = frame + return frame + else: + return None + + def stop(self): + self.running = False + self.reset_attempts = 0 + self.frame = None + if not self.source is None: + self.source.stop() + + +if __name__ == "__main__": + # open any valid video stream + stream = Reconnecting_VideoGear( + cam_address="rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov", + reset_attempts=20, + reset_delay=5, + ) + + # loop over + while True: + + # read frames from stream + frame = stream.read() + + # check for frame if None-type + if frame is None: + break + + # {do something with the frame here} + + # Show output window + cv2.imshow("Output", frame) + + # check for 'q' key if pressed + key = cv2.waitKey(1) & 0xFF + if key == ord("q"): + break + + # close output window + cv2.destroyAllWindows() + + # safely close video stream + stream.stop() +``` + +  \ No newline at end of file diff --git a/docs/help/webgear_ex.md b/docs/help/webgear_ex.md new file mode 100644 index 000000000..05b1dc628 --- /dev/null +++ b/docs/help/webgear_ex.md @@ -0,0 +1,233 @@ + + +# WebGear Examples + +  + +## Using WebGear with RaspberryPi Camera Module + +Because of WebGear API's flexible internal wapper around VideoGear, it can easily access any parameter of CamGear and PiGear videocapture APIs. + +!!! info "Following usage examples are just an idea of what can be done with WebGear API, you can try various [VideoGear](../../gears/videogear/params/), [CamGear](../../gears/camgear/params/) and [PiGear](../../gears/pigear/params/) parameters directly in WebGear API in the similar manner." 
+ +Here's a bare-minimum example of using WebGear API with the Raspberry Pi camera module while tweaking its various properties in just one-liner: + +```python +# import libs +import uvicorn +from vidgear.gears.asyncio import WebGear + +# various webgear performance and Raspberry Pi camera tweaks +options = { + "frame_size_reduction": 40, + "jpeg_compression_quality": 80, + "jpeg_compression_fastdct": True, + "jpeg_compression_fastupsample": False, + "hflip": True, + "exposure_mode": "auto", + "iso": 800, + "exposure_compensation": 15, + "awb_mode": "horizon", + "sensor_mode": 0, +} + +# initialize WebGear app +web = WebGear( + enablePiCamera=True, resolution=(640, 480), framerate=60, logging=True, **options +) + +# run this app on Uvicorn server at address http://localhost:8000/ +uvicorn.run(web(), host="localhost", port=8000) + +# close app safely +web.shutdown() +``` + +  + +## Using WebGear with real-time Video Stabilization enabled + +Here's an example of using WebGear API with real-time Video Stabilization enabled: + +```python +# import libs +import uvicorn +from vidgear.gears.asyncio import WebGear + +# various webgear performance tweaks +options = { + "frame_size_reduction": 40, + "jpeg_compression_quality": 80, + "jpeg_compression_fastdct": True, + "jpeg_compression_fastupsample": False, +} + +# initialize WebGear app with a raw source and enable video stabilization(`stabilize=True`) +web = WebGear(source="foo.mp4", stabilize=True, logging=True, **options) + +# run this app on Uvicorn server at address http://localhost:8000/ +uvicorn.run(web(), host="localhost", port=8000) + +# close app safely +web.shutdown() +``` + +  + + +## Display Two Sources Simultaneously in WebGear + +In this example, we'll be displaying two video feeds side-by-side simultaneously on browser using WebGear API by defining two separate frame generators: + +!!! new "New in v0.2.2" + This example was added in `v0.2.2`. + +**Step-1 (Trigger Auto-Generation Process):** Firstly, run this bare-minimum code to trigger the [**Auto-generation**](../../gears/webgear/#auto-generation-process) process, this will create `.vidgear` directory at current location _(directory where you'll run this code)_: + +```python +# import required libraries +import uvicorn +from vidgear.gears.asyncio import WebGear + +# provide current directory to save data files +options = {"custom_data_location": "./"} + +# initialize WebGear app +web = WebGear(source=0, logging=True, **options) + +# close app safely +web.shutdown() +``` + +**Step-2 (Replace HTML file):** Now, go inside `.vidgear` :arrow_right: `webgear` :arrow_right: `templates` directory at current location of your machine, and there replace content of `index.html` file with following: + +```html +{% extends "base.html" %} +{% block content %} +

+<h1>WebGear Video Feed</h1>
+<div>
+    <img src="/video" alt="Feed" />
+    <img src="/video2" alt="Feed" />
+</div>
+{% endblock %} +``` + +**Step-3 (Build your own Frame Producers):** Now, create a python script code with OpenCV source, as follows: + +```python +# import necessary libs +import uvicorn, asyncio, cv2 +from vidgear.gears.asyncio import WebGear +from vidgear.gears.asyncio.helper import reducer +from starlette.responses import StreamingResponse +from starlette.routing import Route + +# provide current directory to load data files +options = {"custom_data_location": "./"} + +# initialize WebGear app without any source +web = WebGear(logging=True, **options) + +# create your own custom frame producer +async def my_frame_producer1(): + + # !!! define your first video source here !!! + # Open any video stream such as "foo1.mp4" + stream = cv2.VideoCapture("foo1.mp4") + # loop over frames + while True: + # read frame from provided source + (grabbed, frame) = stream.read() + # break if NoneType + if not grabbed: + break + + # do something with your OpenCV frame here + + # reducer frames size if you want more performance otherwise comment this line + frame = await reducer(frame, percentage=30) # reduce frame by 30% + # handle JPEG encoding + encodedImage = cv2.imencode(".jpg", frame)[1].tobytes() + # yield frame in byte format + yield (b"--frame\r\nContent-Type:video/jpeg2000\r\n\r\n" + encodedImage + b"\r\n") + await asyncio.sleep(0.00001) + # close stream + stream.release() + + +# create your own custom frame producer +async def my_frame_producer2(): + + # !!! define your second video source here !!! + # Open any video stream such as "foo2.mp4" + stream = cv2.VideoCapture("foo2.mp4") + # loop over frames + while True: + # read frame from provided source + (grabbed, frame) = stream.read() + # break if NoneType + if not grabbed: + break + + # do something with your OpenCV frame here + + # reducer frames size if you want more performance otherwise comment this line + frame = await reducer(frame, percentage=30) # reduce frame by 30% + # handle JPEG encoding + encodedImage = cv2.imencode(".jpg", frame)[1].tobytes() + # yield frame in byte format + yield (b"--frame\r\nContent-Type:video/jpeg2000\r\n\r\n" + encodedImage + b"\r\n") + await asyncio.sleep(0.00001) + # close stream + stream.release() + + +async def custom_video_response(scope): + """ + Return a async video streaming response for `my_frame_producer2` generator + """ + assert scope["type"] in ["http", "https"] + await asyncio.sleep(0.00001) + return StreamingResponse( + my_frame_producer2(), + media_type="multipart/x-mixed-replace; boundary=frame", + ) + + +# add your custom frame producer to config +web.config["generator"] = my_frame_producer1 + +# append new route i.e. new custom route with custom response +web.routes.append( + Route("/video2", endpoint=custom_video_response) + ) + +# run this app on Uvicorn server at address http://localhost:8000/ +uvicorn.run(web(), host="localhost", port=8000) + +# close app safely +web.shutdown() +``` + +!!! success "On successfully running this code, the output stream will be displayed at address http://localhost:8000/ in Browser." + + +  \ No newline at end of file diff --git a/docs/help/webgear_faqs.md b/docs/help/webgear_faqs.md index ca7e1b42d..e39194337 100644 --- a/docs/help/webgear_faqs.md +++ b/docs/help/webgear_faqs.md @@ -48,7 +48,7 @@ limitations under the License. ## Is it possible to stream on a different device on the network with WebGear? -!!! 
note "If you set `"0.0.0.0"` as host value instead of `"localhost"` on Host Machine, then you must still use http://localhost:8000/ to access stream on your host machine browser."
+!!! alert "If you set `"0.0.0.0"` as host value instead of `"localhost"` on Host Machine, then you must still use http://localhost:8000/ to access the stream in that same host machine's browser."
 
 For accessing WebGear on different Client Devices on the network, use `"0.0.0.0"` as host value instead of `"localhost"` on Host Machine. Then type the IP-address of source machine followed by the defined `port` value in your desired Client Device's browser (for e.g. http://192.27.0.101:8000) to access the stream.
 
diff --git a/docs/help/webgear_rtc_ex.md b/docs/help/webgear_rtc_ex.md
new file mode 100644
index 000000000..894599957
--- /dev/null
+++ b/docs/help/webgear_rtc_ex.md
@@ -0,0 +1,213 @@
+
+
+# WebGear_RTC Examples
+
+&nbsp;
+
+## Using WebGear_RTC with RaspberryPi Camera Module
+
+Because of WebGear_RTC API's flexible internal wrapper around VideoGear, it can easily access any parameter of CamGear and PiGear videocapture APIs.
+
+!!! info "Following usage examples are just an idea of what can be done with WebGear_RTC API, you can try various [VideoGear](../../gears/videogear/params/), [CamGear](../../gears/camgear/params/) and [PiGear](../../gears/pigear/params/) parameters directly in WebGear_RTC API in a similar manner."
+
+Here's a bare-minimum example of using WebGear_RTC API with the Raspberry Pi camera module while tweaking its various properties in just one-liner:
+
+```python
+# import libs
+import uvicorn
+from vidgear.gears.asyncio import WebGear_RTC
+
+# various webgear_rtc performance and Raspberry Pi camera tweaks
+options = {
+    "frame_size_reduction": 25,
+    "hflip": True,
+    "exposure_mode": "auto",
+    "iso": 800,
+    "exposure_compensation": 15,
+    "awb_mode": "horizon",
+    "sensor_mode": 0,
+}
+
+# initialize WebGear_RTC app
+web = WebGear_RTC(
+    enablePiCamera=True, resolution=(640, 480), framerate=60, logging=True, **options
+)
+
+# run this app on Uvicorn server at address http://localhost:8000/
+uvicorn.run(web(), host="localhost", port=8000)
+
+# close app safely
+web.shutdown()
+```
+
+&nbsp;
+
+## Using WebGear_RTC with real-time Video Stabilization enabled
+
+Here's an example of using WebGear_RTC API with real-time Video Stabilization enabled:
+
+```python
+# import libs
+import uvicorn
+from vidgear.gears.asyncio import WebGear_RTC
+
+# various webgear_rtc performance tweaks
+options = {
+    "frame_size_reduction": 25,
+}
+
+# initialize WebGear_RTC app with a raw source and enable video stabilization(`stabilize=True`)
+web = WebGear_RTC(source="foo.mp4", stabilize=True, logging=True, **options)
+
+# run this app on Uvicorn server at address http://localhost:8000/
+uvicorn.run(web(), host="localhost", port=8000)
+
+# close app safely
+web.shutdown()
+```
+
+&nbsp;
+
+## Display Two Sources Simultaneously in WebGear_RTC
+
+In this example, we'll be displaying two video feeds side-by-side simultaneously on browser using WebGear_RTC API by simply concatenating frames in real-time:
+
+!!! new "New in v0.2.2"
+    This example was added in `v0.2.2`.
+ +```python +# import necessary libs +import uvicorn, asyncio, cv2 +import numpy as np +from av import VideoFrame +from aiortc import VideoStreamTrack +from vidgear.gears.asyncio import WebGear_RTC +from vidgear.gears.asyncio.helper import reducer + +# initialize WebGear_RTC app without any source +web = WebGear_RTC(logging=True) + +# frame concatenator +def get_conc_frame(frame1, frame2): + h1, w1 = frame1.shape[:2] + h2, w2 = frame2.shape[:2] + + # create empty matrix + vis = np.zeros((max(h1, h2), w1 + w2, 3), np.uint8) + + # combine 2 frames + vis[:h1, :w1, :3] = frame1 + vis[:h2, w1 : w1 + w2, :3] = frame2 + + return vis + + +# create your own Bare-Minimum Custom Media Server +class Custom_RTCServer(VideoStreamTrack): + """ + Custom Media Server using OpenCV, an inherit-class + to aiortc's VideoStreamTrack. + """ + + def __init__(self, source1=None, source2=None): + + # don't forget this line! + super().__init__() + + # check is source are provided + if source1 is None or source2 is None: + raise ValueError("Provide both source") + + # initialize global params + # define both source here + self.stream1 = cv2.VideoCapture(source1) + self.stream2 = cv2.VideoCapture(source2) + + async def recv(self): + """ + A coroutine function that yields `av.frame.Frame`. + """ + # don't forget this function!!! + + # get next timestamp + pts, time_base = await self.next_timestamp() + + # read video frame + (grabbed1, frame1) = self.stream1.read() + (grabbed2, frame2) = self.stream2.read() + + # if NoneType + if not grabbed1 or not grabbed2: + return None + else: + print("Got frames") + + print(frame1.shape) + print(frame2.shape) + + # concatenate frame + frame = get_conc_frame(frame1, frame2) + + print(frame.shape) + + # reducer frames size if you want more performance otherwise comment this line + # frame = await reducer(frame, percentage=30) # reduce frame by 30% + + # contruct `av.frame.Frame` from `numpy.nd.array` + av_frame = VideoFrame.from_ndarray(frame, format="bgr24") + av_frame.pts = pts + av_frame.time_base = time_base + + # return `av.frame.Frame` + return av_frame + + def terminate(self): + """ + Gracefully terminates VideoGear stream + """ + # don't forget this function!!! + + # terminate + if not (self.stream1 is None): + self.stream1.release() + self.stream1 = None + + if not (self.stream2 is None): + self.stream2.release() + self.stream2 = None + + +# assign your custom media server to config with both adequate sources (for e.g. foo1.mp4 and foo2.mp4) +web.config["server"] = Custom_RTCServer( + source1="dance_videos/foo1.mp4", source2="dance_videos/foo2.mp4" +) + +# run this app on Uvicorn server at address http://localhost:8000/ +uvicorn.run(web(), host="localhost", port=8000) + +# close app safely +web.shutdown() +``` + +!!! success "On successfully running this code, the output stream will be displayed at address http://localhost:8000/ in Browser." + + +  \ No newline at end of file diff --git a/docs/help/writegear_ex.md b/docs/help/writegear_ex.md new file mode 100644 index 000000000..c505a55cb --- /dev/null +++ b/docs/help/writegear_ex.md @@ -0,0 +1,306 @@ + + + +# WriteGear Examples + +  + +## Using WriteGear's Compression Mode for YouTube-Live Streaming + +!!! new "New in v0.2.1" + This example was added in `v0.2.1`. + +!!! alert "This example assume you already have a [**YouTube Account with Live-Streaming enabled**](https://support.google.com/youtube/answer/2474026#enable) for publishing video." + +!!! 
danger "Make sure to change [_YouTube-Live Stream Key_](https://support.google.com/youtube/answer/2907883#zippy=%2Cstart-live-streaming-now) with yours in following code before running!" + +```python +# import required libraries +from vidgear.gears import CamGear +from vidgear.gears import WriteGear +import cv2 + +# define video source +VIDEO_SOURCE = "/home/foo/foo.mp4" + +# Open stream +stream = CamGear(source=VIDEO_SOURCE, logging=True).start() + +# define required FFmpeg optimizing parameters for your writer +# [NOTE]: Added VIDEO_SOURCE as audio-source, since YouTube rejects audioless streams! +output_params = { + "-i": VIDEO_SOURCE, + "-acodec": "aac", + "-ar": 44100, + "-b:a": 712000, + "-vcodec": "libx264", + "-preset": "medium", + "-b:v": "4500k", + "-bufsize": "512k", + "-pix_fmt": "yuv420p", + "-f": "flv", +} + +# [WARNING] Change your YouTube-Live Stream Key here: +YOUTUBE_STREAM_KEY = "xxxx-xxxx-xxxx-xxxx-xxxx" + +# Define writer with defined parameters and +writer = WriteGear( + output_filename="rtmp://a.rtmp.youtube.com/live2/{}".format(YOUTUBE_STREAM_KEY), + logging=True, + **output_params +) + +# loop over +while True: + + # read frames from stream + frame = stream.read() + + # check for frame if Nonetype + if frame is None: + break + + # {do something with the frame here} + + # write frame to writer + writer.write(frame) + +# safely close video stream +stream.stop() + +# safely close writer +writer.close() +``` + +  + + +## Using WriteGear's Compression Mode creating MP4 segments from a video stream + +!!! new "New in v0.2.1" + This example was added in `v0.2.1`. + +```python +# import required libraries +from vidgear.gears import VideoGear +from vidgear.gears import WriteGear +import cv2 + +# Open any video source `foo.mp4` +stream = VideoGear( + source="foo.mp4", logging=True +).start() + +# define required FFmpeg optimizing parameters for your writer +output_params = { + "-c:v": "libx264", + "-crf": 22, + "-map": 0, + "-segment_time": 9, + "-g": 9, + "-sc_threshold": 0, + "-force_key_frames": "expr:gte(t,n_forced*9)", + "-clones": ["-f", "segment"], +} + +# Define writer with defined parameters +writer = WriteGear(output_filename="output%03d.mp4", logging=True, **output_params) + +# loop over +while True: + + # read frames from stream + frame = stream.read() + + # check for frame if Nonetype + if frame is None: + break + + # {do something with the frame here} + + # write frame to writer + writer.write(frame) + + # Show output window + cv2.imshow("Output Frame", frame) + + # check for 'q' key if pressed + key = cv2.waitKey(1) & 0xFF + if key == ord("q"): + break + +# close output window +cv2.destroyAllWindows() + +# safely close video stream +stream.stop() + +# safely close writer +writer.close() +``` + +  + + +## Using WriteGear's Compression Mode to add external audio file input to video frames + +!!! new "New in v0.2.1" + This example was added in `v0.2.1`. + +!!! failure "Make sure this `-i` audio-source it compatible with provided video-source, otherwise you encounter multiple errors or no output at all." + +```python +# import required libraries +from vidgear.gears import CamGear +from vidgear.gears import WriteGear +import cv2 + +# open any valid video stream(for e.g `foo_video.mp4` file) +stream = CamGear(source="foo_video.mp4").start() + +# add various parameters, along with custom audio +stream_params = { + "-input_framerate": stream.framerate, # controlled framerate for audio-video sync !!! don't forget this line !!! 
+    "-i": "foo_audio.aac",  # assigns input audio-source: "foo_audio.aac"
+}
+
+# Define writer with defined parameters
+writer = WriteGear(output_filename="Output.mp4", logging=True, **stream_params)
+
+# loop over
+while True:
+
+    # read frames from stream
+    frame = stream.read()
+
+    # check for frame if Nonetype
+    if frame is None:
+        break
+
+    # {do something with the frame here}
+
+    # write frame to writer
+    writer.write(frame)
+
+    # Show output window
+    cv2.imshow("Output Frame", frame)
+
+    # check for 'q' key if pressed
+    key = cv2.waitKey(1) & 0xFF
+    if key == ord("q"):
+        break
+
+# close output window
+cv2.destroyAllWindows()
+
+# safely close video stream
+stream.stop()
+
+# safely close writer
+writer.close()
+```
+
+&nbsp;
+
+
+## Using WriteGear with ROS (Robot Operating System)
+
+We will be using [`cv_bridge`](http://wiki.ros.org/cv_bridge/Tutorials/ConvertingBetweenROSImagesAndOpenCVImagesPython) to convert OpenCV frames to ROS image messages and vice-versa.
+
+In this example, we'll create a node that listens to a ROS image message topic, converts the received image messages into OpenCV frames, draws a circle on them, and then processes these frames into a lossless compressed file format in real-time.
+
+!!! new "New in v0.2.2"
+    This example was added in `v0.2.2`.
+
+!!! note "This example is a vidgear implementation of this [wiki example](http://wiki.ros.org/cv_bridge/Tutorials/ConvertingBetweenROSImagesAndOpenCVImagesPython)."
+
+```python
+# import roslib
+import roslib
+
+roslib.load_manifest("my_package")
+
+# import other required libraries
+import sys
+import rospy
+import cv2
+from std_msgs.msg import String
+from sensor_msgs.msg import Image
+from cv_bridge import CvBridge, CvBridgeError
+from vidgear.gears import WriteGear
+
+# custom subscriber class
+class image_subscriber:
+    def __init__(self, output_filename="Output.mp4"):
+        # create CV bridge
+        self.bridge = CvBridge()
+        # define subscriber topic
+        self.image_sub = rospy.Subscriber("image_topic_sub", Image, self.callback)
+        # Define writer with default parameters
+        self.writer = WriteGear(output_filename=output_filename)
+
+    def callback(self, data):
+        # convert received data to frame
+        try:
+            cv_image = self.bridge.imgmsg_to_cv2(data, "bgr8")
+        except CvBridgeError as e:
+            print(e)
+            return
+
+        # check if frame is valid
+        if cv_image is not None:
+
+            # {do something with the frame here}
+
+            # add circle
+            (rows, cols, channels) = cv_image.shape
+            if cols > 60 and rows > 60:
+                cv2.circle(cv_image, (50, 50), 10, 255)
+
+            # write frame to writer
+            self.writer.write(cv_image)
+
+    def close(self):
+        # safely close writer
+        self.writer.close()
+
+
+def main(args):
+    # define subscriber with suitable output filename
+    # such as `Output.mp4` for saving output
+    ic = image_subscriber(output_filename="Output.mp4")
+    # initiate ROS node on subscriber
+    rospy.init_node("image_subscriber", anonymous=True)
+    try:
+        # run node
+        rospy.spin()
+    except KeyboardInterrupt:
+        print("Shutting down")
+    finally:
+        # close subscriber
+        ic.close()
+
+
+if __name__ == "__main__":
+    main(sys.argv)
+```
+
+&nbsp;
\ No newline at end of file
diff --git a/docs/help/writegear_faqs.md b/docs/help/writegear_faqs.md
index 53fe2950c..bb2764b2c 100644
--- a/docs/help/writegear_faqs.md
+++ b/docs/help/writegear_faqs.md
@@ -39,10 +39,8 @@ limitations under the License.
 
 **Answer:** WriteGear will exit with `ValueError` if you feed frames of different dimensions or channels.
 
-&nbsp;
-
 ## How to install and configure FFmpeg correctly for WriteGear on my machine?
**Answer:** Follow these [Installation Instructions ➶](../../gears/writegear/compression/advanced/ffmpeg_install/) for its installation. @@ -109,205 +107,21 @@ limitations under the License. ## Is YouTube-Live Streaming possibe with WriteGear? -**Answer:** Yes, See example below: - -!!! new "New in v0.2.1" - This example was added in `v0.2.1`. - -!!! alert "This example assume you already have a [**YouTube Account with Live-Streaming enabled**](https://support.google.com/youtube/answer/2474026#enable) for publishing video." - -!!! danger "Make sure to change [_YouTube-Live Stream Key_](https://support.google.com/youtube/answer/2907883#zippy=%2Cstart-live-streaming-now) with yours in following code before running!" - -```python -# import required libraries -from vidgear.gears import CamGear -from vidgear.gears import WriteGear -import cv2 - -# define video source -VIDEO_SOURCE = "/home/foo/foo.mp4" - -# Open stream -stream = CamGear(source=VIDEO_SOURCE, logging=True).start() - -# define required FFmpeg optimizing parameters for your writer -# [NOTE]: Added VIDEO_SOURCE as audio-source, since YouTube rejects audioless streams! -output_params = { - "-i": VIDEO_SOURCE, - "-acodec": "aac", - "-ar": 44100, - "-b:a": 712000, - "-vcodec": "libx264", - "-preset": "medium", - "-b:v": "4500k", - "-bufsize": "512k", - "-pix_fmt": "yuv420p", - "-f": "flv", -} - -# [WARNING] Change your YouTube-Live Stream Key here: -YOUTUBE_STREAM_KEY = "xxxx-xxxx-xxxx-xxxx-xxxx" - -# Define writer with defined parameters and -writer = WriteGear( - output_filename="rtmp://a.rtmp.youtube.com/live2/{}".format(YOUTUBE_STREAM_KEY), - logging=True, - **output_params -) - -# loop over -while True: - - # read frames from stream - frame = stream.read() - - # check for frame if Nonetype - if frame is None: - break - - # {do something with the frame here} - - # write frame to writer - writer.write(frame) - -# safely close video stream -stream.stop() - -# safely close writer -writer.close() -``` +**Answer:** Yes, See [this bonus example ➶](../writegear_ex/#using-writegears-compression-mode-for-youtube-live-streaming).   ## How to create MP4 segments from a video stream with WriteGear? -**Answer:** See example below: - -!!! new "New in v0.2.1" - This example was added in `v0.2.1`. 
- -```python -# import required libraries -from vidgear.gears import VideoGear -from vidgear.gears import WriteGear -import cv2 - -# Open any video source `foo.mp4` -stream = VideoGear( - source="foo.mp4", logging=True -).start() - -# define required FFmpeg optimizing parameters for your writer -output_params = { - "-c:v": "libx264", - "-crf": 22, - "-map": 0, - "-segment_time": 9, - "-g": 9, - "-sc_threshold": 0, - "-force_key_frames": "expr:gte(t,n_forced*9)", - "-clones": ["-f", "segment"], -} - -# Define writer with defined parameters -writer = WriteGear(output_filename="output%03d.mp4", logging=True, **output_params) - -# loop over -while True: - - # read frames from stream - frame = stream.read() - - # check for frame if Nonetype - if frame is None: - break - - # {do something with the frame here} - - # write frame to writer - writer.write(frame) - - # Show output window - cv2.imshow("Output Frame", frame) - - # check for 'q' key if pressed - key = cv2.waitKey(1) & 0xFF - if key == ord("q"): - break - -# close output window -cv2.destroyAllWindows() - -# safely close video stream -stream.stop() - -# safely close writer -writer.close() -``` +**Answer:** See [this bonus example ➶](../writegear_ex/#using-writegears-compression-mode-creating-mp4-segments-from-a-video-stream).   ## How add external audio file input to video frames? -**Answer:** See example below: - -!!! new "New in v0.2.1" - This example was added in `v0.2.1`. - -!!! failure "Make sure this `-i` audio-source it compatible with provided video-source, otherwise you encounter multiple errors or no output at all." - -```python -# import required libraries -from vidgear.gears import CamGear -from vidgear.gears import WriteGear -import cv2 - -# open any valid video stream(for e.g `foo_video.mp4` file) -stream = CamGear(source="foo_video.mp4").start() - -# add various parameters, along with custom audio -stream_params = { - "-input_framerate": stream.framerate, # controlled framerate for audio-video sync !!! don't forget this line !!! - "-i": "foo_audio.aac", # assigns input audio-source: "foo_audio.aac" -} - -# Define writer with defined parameters -writer = WriteGear(output_filename="Output.mp4", logging=True, **stream_params) - -# loop over -while True: - - # read frames from stream - frame = stream.read() - - # check for frame if Nonetype - if frame is None: - break - - # {do something with the frame here} - - # write frame to writer - writer.write(frame) - - # Show output window - cv2.imshow("Output Frame", frame) - - # check for 'q' key if pressed - key = cv2.waitKey(1) & 0xFF - if key == ord("q"): - break - -# close output window -cv2.destroyAllWindows() - -# safely close video stream -stream.stop() - -# safely close writer -writer.close() -``` +**Answer:** See [this bonus example ➶](../writegear_ex/#using-writegears-compression-mode-to-add-external-audio-file-input-to-video-frames).   diff --git a/docs/overrides/assets/stylesheets/custom.css b/docs/overrides/assets/stylesheets/custom.css index 411c08edd..32f12528d 100755 --- a/docs/overrides/assets/stylesheets/custom.css +++ b/docs/overrides/assets/stylesheets/custom.css @@ -35,176 +35,213 @@ limitations under the License. 
--md-admonition-icon--xinfo: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath fill='%23000000' d='M18 2H12V9L9.5 7.5L7 9V2H6C4.9 2 4 2.9 4 4V20C4 21.1 4.9 22 6 22H18C19.1 22 20 21.1 20 20V4C20 2.89 19.1 2 18 2M17.68 18.41C17.57 18.5 16.47 19.25 16.05 19.5C15.63 19.79 14 20.72 14.26 18.92C14.89 15.28 16.11 13.12 14.65 14.06C14.27 14.29 14.05 14.43 13.91 14.5C13.78 14.61 13.79 14.6 13.68 14.41S13.53 14.23 13.67 14.13C13.67 14.13 15.9 12.34 16.72 12.28C17.5 12.21 17.31 13.17 17.24 13.61C16.78 15.46 15.94 18.15 16.07 18.54C16.18 18.93 17 18.31 17.44 18C17.44 18 17.5 17.93 17.61 18.05C17.72 18.22 17.83 18.3 17.68 18.41M16.97 11.06C16.4 11.06 15.94 10.6 15.94 10.03C15.94 9.46 16.4 9 16.97 9C17.54 9 18 9.46 18 10.03C18 10.6 17.54 11.06 16.97 11.06Z' /%3E%3C/svg%3E"); --md-admonition-icon--xadvance: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3C!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'%3E%3Csvg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' width='24' height='24' viewBox='0 0 24 24'%3E%3Cpath d='M7,2V4H8V18A4,4 0 0,0 12,22A4,4 0 0,0 16,18V4H17V2H7M11,16C10.4,16 10,15.6 10,15C10,14.4 10.4,14 11,14C11.6,14 12,14.4 12,15C12,15.6 11.6,16 11,16M13,12C12.4,12 12,11.6 12,11C12,10.4 12.4,10 13,10C13.6,10 14,10.4 14,11C14,11.6 13.6,12 13,12M14,7H10V4H14V7Z' /%3E%3C/svg%3E"); } + .md-typeset .admonition.advance, .md-typeset details.advance { - border-color: rgb(27,77,62); + border-color: rgb(27, 77, 62); } + .md-typeset .admonition.new, .md-typeset details.new { - border-color: rgb(43, 155, 70); + border-color: rgb(57,255,20); +} + +.md-typeset .admonition.alert, +.md-typeset details.alert { + border-color: rgb(255, 0, 255); } + .md-typeset .new > .admonition-title, .md-typeset .new > summary { - background-color: rgba(43, 155, 70, 0.1); - border-color: rgb(43, 155, 70); + background-color: rgb(57,255,20,0.1); + border-color: rgb(57,255,20); } + .md-typeset .new > .admonition-title::before, .md-typeset .new > summary::before { - background-color: rgb(228,24,30); + background-color: rgb(57,255,20); -webkit-mask-image: var(--md-admonition-icon--new); mask-image: var(--md-admonition-icon--new); } -.md-typeset .admonition.alert, -.md-typeset details.alert { - border-color: rgb(255, 0, 255); -} + .md-typeset .alert > .admonition-title, .md-typeset .alert > summary { background-color: rgba(255, 0, 255, 0.1); border-color: rgb(255, 0, 255); } + .md-typeset .alert > .admonition-title::before, .md-typeset .alert > summary::before { background-color: rgb(255, 0, 255); -webkit-mask-image: var(--md-admonition-icon--alert); mask-image: var(--md-admonition-icon--alert); } -.md-typeset .attention>.admonition-title::before, -.md-typeset .attention>summary::before, -.md-typeset .caution>.admonition-title::before, -.md-typeset .caution>summary::before, -.md-typeset .warning>.admonition-title::before, -.md-typeset .warning>summary::before { + +.md-typeset .advance > .admonition-title, +.md-typeset .advance > summary, +.md-typeset .experiment > .admonition-title, +.md-typeset .experiment > summary { + background-color: rgba(0, 57, 166, 0.1); + border-color: rgb(0, 57, 166); +} + +.md-typeset .advance > .admonition-title::before, 
+.md-typeset .advance > summary::before, +.md-typeset .experiment > .admonition-title::before, +.md-typeset .experiment > summary::before { + background-color: rgb(0, 57, 166); + -webkit-mask-image: var(--md-admonition-icon--xadvance); + mask-image: var(--md-admonition-icon--xadvance); +} + +.md-typeset .attention > .admonition-title::before, +.md-typeset .attention > summary::before, +.md-typeset .caution > .admonition-title::before, +.md-typeset .caution > summary::before, +.md-typeset .warning > .admonition-title::before, +.md-typeset .warning > summary::before { -webkit-mask-image: var(--md-admonition-icon--xwarning); mask-image: var(--md-admonition-icon--xwarning); } -.md-typeset .hint>.admonition-title::before, -.md-typeset .hint>summary::before, -.md-typeset .important>.admonition-title::before, -.md-typeset .important>summary::before, -.md-typeset .tip>.admonition-title::before, -.md-typeset .tip>summary::before { + +.md-typeset .hint > .admonition-title::before, +.md-typeset .hint > summary::before, +.md-typeset .important > .admonition-title::before, +.md-typeset .important > summary::before, +.md-typeset .tip > .admonition-title::before, +.md-typeset .tip > summary::before { -webkit-mask-image: var(--md-admonition-icon--xtip) !important; mask-image: var(--md-admonition-icon--xtip) !important; } -.md-typeset .info>.admonition-title::before, -.md-typeset .info>summary::before, -.md-typeset .todo>.admonition-title::before, -.md-typeset .todo>summary::before { + +.md-typeset .info > .admonition-title::before, +.md-typeset .info > summary::before, +.md-typeset .todo > .admonition-title::before, +.md-typeset .todo > summary::before { -webkit-mask-image: var(--md-admonition-icon--xinfo); mask-image: var(--md-admonition-icon--xinfo); } -.md-typeset .danger>.admonition-title::before, -.md-typeset .danger>summary::before, -.md-typeset .error>.admonition-title::before, -.md-typeset .error>summary::before { + +.md-typeset .danger > .admonition-title::before, +.md-typeset .danger > summary::before, +.md-typeset .error > .admonition-title::before, +.md-typeset .error > summary::before { -webkit-mask-image: var(--md-admonition-icon--xdanger); mask-image: var(--md-admonition-icon--xdanger); } -.md-typeset .note>.admonition-title::before, -.md-typeset .note>summary::before { + +.md-typeset .note > .admonition-title::before, +.md-typeset .note > summary::before { -webkit-mask-image: var(--md-admonition-icon--xnote); mask-image: var(--md-admonition-icon--xnote); } -.md-typeset .abstract>.admonition-title::before, -.md-typeset .abstract>summary::before, -.md-typeset .summary>.admonition-title::before, -.md-typeset .summary>summary::before, -.md-typeset .tldr>.admonition-title::before, -.md-typeset .tldr>summary::before { + +.md-typeset .abstract > .admonition-title::before, +.md-typeset .abstract > summary::before, +.md-typeset .summary > .admonition-title::before, +.md-typeset .summary > summary::before, +.md-typeset .tldr > .admonition-title::before, +.md-typeset .tldr > summary::before { -webkit-mask-image: var(--md-admonition-icon--xabstract); mask-image: var(--md-admonition-icon--xabstract); } -.md-typeset .faq>.admonition-title::before, -.md-typeset .faq>summary::before, -.md-typeset .help>.admonition-title::before, -.md-typeset .help>summary::before, -.md-typeset .question>.admonition-title::before, -.md-typeset .question>summary::before { + +.md-typeset .faq > .admonition-title::before, +.md-typeset .faq > summary::before, +.md-typeset .help > .admonition-title::before, +.md-typeset .help > 
summary::before, +.md-typeset .question > .admonition-title::before, +.md-typeset .question > summary::before { -webkit-mask-image: var(--md-admonition-icon--xquestion); mask-image: var(--md-admonition-icon--xquestion); } -.md-typeset .check>.admonition-title::before, -.md-typeset .check>summary::before, -.md-typeset .done>.admonition-title::before, -.md-typeset .done>summary::before, -.md-typeset .success>.admonition-title::before, -.md-typeset .success>summary::before { + +.md-typeset .check > .admonition-title::before, +.md-typeset .check > summary::before, +.md-typeset .done > .admonition-title::before, +.md-typeset .done > summary::before, +.md-typeset .success > .admonition-title::before, +.md-typeset .success > summary::before { -webkit-mask-image: var(--md-admonition-icon--xsuccess); mask-image: var(--md-admonition-icon--xsuccess); } -.md-typeset .fail>.admonition-title::before, -.md-typeset .fail>summary::before, -.md-typeset .failure>.admonition-title::before, -.md-typeset .failure>summary::before, -.md-typeset .missing>.admonition-title::before, -.md-typeset .missing>summary::before { + +.md-typeset .fail > .admonition-title::before, +.md-typeset .fail > summary::before, +.md-typeset .failure > .admonition-title::before, +.md-typeset .failure > summary::before, +.md-typeset .missing > .admonition-title::before, +.md-typeset .missing > summary::before { -webkit-mask-image: var(--md-admonition-icon--xfail); mask-image: var(--md-admonition-icon--xfail); } -.md-typeset .bug>.admonition-title::before, -.md-typeset .bug>summary::before { + +.md-typeset .bug > .admonition-title::before, +.md-typeset .bug > summary::before { -webkit-mask-image: var(--md-admonition-icon--xbug); mask-image: var(--md-admonition-icon--xbug); } -.md-typeset .example>.admonition-title::before, -.md-typeset .example>summary::before { + +.md-typeset .example > .admonition-title::before, +.md-typeset .example > summary::before { -webkit-mask-image: var(--md-admonition-icon--xexample); mask-image: var(--md-admonition-icon--xexample); } -.md-typeset .cite>.admonition-title::before, -.md-typeset .cite>summary::before, -.md-typeset .quote>.admonition-title::before, -.md-typeset .quote>summary::before { + +.md-typeset .cite > .admonition-title::before, +.md-typeset .cite > summary::before, +.md-typeset .quote > .admonition-title::before, +.md-typeset .quote > summary::before { -webkit-mask-image: var(--md-admonition-icon--xquote); mask-image: var(--md-admonition-icon--xquote); } -.md-typeset .advance>.admonition-title::before, -.md-typeset .advance>summary::before, -.md-typeset .experiment>.admonition-title::before, -.md-typeset .experiment>summary::before { - background-color: rgb(0,57,166); - -webkit-mask-image: var(--md-admonition-icon--xadvance); - mask-image: var(--md-admonition-icon--xadvance); -} .md-nav__item--active > .md-nav__link { font-weight: bold; } + .center { display: block; margin-left: auto; margin-right: auto; width: 80%; } + .center-small { display: block; margin-left: auto; margin-right: auto; width: 90%; } + .md-tabs__link--active { font-weight: bold; } + .md-nav__title { font-size: 1rem !important; } + .md-version__link { overflow: hidden; } + .md-version__current { text-transform: uppercase; font-weight: bolder; } + .md-typeset .task-list-control .task-list-indicator::before { - background-color: #FF0000; - -webkit-mask-image: var(--md-admonition-icon--failure); - mask-image: var(--md-admonition-icon--failure); + background-color: #ff0000; + -webkit-mask-image: 
var(--md-admonition-icon--failure); + mask-image: var(--md-admonition-icon--failure); } + blockquote { padding: 0.5em 10px; quotes: "\201C""\201D""\2018""\2019"; } + blockquote:before { color: #ccc; content: open-quote; @@ -213,10 +250,12 @@ blockquote:before { margin-right: 0.25em; vertical-align: -0.4em; } + blockquote:after { visibility: hidden; content: close-quote; } + blockquote p { display: inline; } @@ -229,6 +268,7 @@ blockquote p { display: flex; justify-content: center; } + .embed-responsive { position: relative; display: block; @@ -236,10 +276,12 @@ blockquote p { padding: 0; overflow: hidden; } + .embed-responsive::before { display: block; content: ""; } + .embed-responsive .embed-responsive-item, .embed-responsive iframe, .embed-responsive embed, @@ -253,15 +295,19 @@ blockquote p { height: 100%; border: 0; } + .embed-responsive-21by9::before { padding-top: 42.857143%; } + .embed-responsive-16by9::before { padding-top: 56.25%; } + .embed-responsive-4by3::before { padding-top: 75%; } + .embed-responsive-1by1::before { padding-top: 100%; } @@ -270,6 +316,7 @@ blockquote p { footer.sponsorship { text-align: center; } + footer.sponsorship hr { display: inline-block; width: 2rem; @@ -277,15 +324,19 @@ footer.sponsorship hr { vertical-align: middle; border-bottom: 2px solid var(--md-default-fg-color--lighter); } + footer.sponsorship:hover hr { border-color: var(--md-accent-fg-color); } + footer.sponsorship:not(:hover) .twemoji.heart-throb-hover svg { color: var(--md-default-fg-color--lighter) !important; } + .doc-heading { padding-top: 50px; } + .btn { z-index: 1; overflow: hidden; @@ -300,10 +351,12 @@ footer.sponsorship:not(:hover) .twemoji.heart-throb-hover svg { font-weight: bold; margin: 5px 0px; } + .btn.bcolor { border: 4px solid var(--md-typeset-a-color); color: var(--blue); } + .btn.bcolor:before { content: ""; position: absolute; @@ -315,53 +368,68 @@ footer.sponsorship:not(:hover) .twemoji.heart-throb-hover svg { z-index: -1; transition: 0.2s ease; } + .btn.bcolor:hover { color: var(--white); background: var(--md-typeset-a-color); transition: 0.2s ease; } + .btn.bcolor:hover:before { width: 100%; } + main #g6219 { transform-origin: 85px 4px; - animation: an1 12s .5s infinite ease-out; + animation: an1 12s 0.5s infinite ease-out; } + @keyframes an1 { 0% { transform: rotate(0); } + 5% { transform: rotate(3deg); } + 15% { transform: rotate(-2.5deg); } + 25% { transform: rotate(2deg); } + 35% { transform: rotate(-1.5deg); } + 45% { transform: rotate(1deg); } + 55% { transform: rotate(-1.5deg); } + 65% { transform: rotate(2deg); } + 75% { transform: rotate(-2deg); } + 85% { transform: rotate(2.5deg); } + 95% { transform: rotate(-3deg); } + 100% { transform: rotate(0); } -} +} \ No newline at end of file diff --git a/mkdocs.yml b/mkdocs.yml index 61d467c2a..b819890b4 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -299,3 +299,15 @@ nav: - WebGear_RTC FAQs: help/webgear_rtc_faqs.md - NetGear_Async FAQs: help/netgear_async_faqs.md - Stabilizer Class FAQs: help/stabilizer_faqs.md + - Bonus Examples: + - CamGear Examples: help/camgear_ex.md + - PiGear Examples: help/pigear_ex.md + - VideoGear Examples: help/videogear_ex.md + - ScreenGear Examples: help/screengear_ex.md + - WriteGear Examples: help/writegear_ex.md + - StreamGear Examples: help/streamgear_ex.md + - NetGear Examples: help/netgear_ex.md + - WebGear Examples: help/webgear_ex.md + - WebGear_RTC Examples: help/webgear_rtc_ex.md + - NetGear_Async Examples: help/netgear_async_ex.md + - Stabilizer Class Examples: 
help/stabilizer_ex.md \ No newline at end of file From 2ea4e89ce0b5faec54d49d97eea90de1af25c0b4 Mon Sep 17 00:00:00 2001 From: abhiTronix Date: Wed, 1 Sep 2021 11:43:21 +0530 Subject: [PATCH 07/11] =?UTF-8?q?=F0=9F=90=9B=20WebGearRTC:=20Fixed=20Asse?= =?UTF-8?q?rtion=20error=20bug?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - 🚑️ Source must raise MediaStreamError when stream ends instead of returning None-type. - 👷 Updated CI tests. - 📝 Updated Docs Examples. --- docs/gears/webgear_rtc/advanced.md | 4 ++-- vidgear/gears/asyncio/webgear_rtc.py | 4 ++-- .../asyncio_tests/test_webgear_rtc.py | 21 +++++++++++-------- 3 files changed, 16 insertions(+), 13 deletions(-) diff --git a/docs/gears/webgear_rtc/advanced.md b/docs/gears/webgear_rtc/advanced.md index 1f4646d7a..2726cdc64 100644 --- a/docs/gears/webgear_rtc/advanced.md +++ b/docs/gears/webgear_rtc/advanced.md @@ -77,6 +77,7 @@ Let's implement a bare-minimum example with a Custom Source using WebGear_RTC AP import uvicorn, asyncio, cv2 from av import VideoFrame from aiortc import VideoStreamTrack +from aiortc.mediastreams import MediaStreamError from vidgear.gears.asyncio import WebGear_RTC from vidgear.gears.asyncio.helper import reducer @@ -112,7 +113,7 @@ class Custom_RTCServer(VideoStreamTrack): # if NoneType if not grabbed: - return None + return MediaStreamError # reducer frames size if you want more performance otherwise comment this line frame = await reducer(frame, percentage=30) # reduce frame by 30% @@ -145,7 +146,6 @@ uvicorn.run(web(), host="localhost", port=8000) # close app safely web.shutdown() - ``` **And that's all, Now you can see output at [`http://localhost:8000/`](http://localhost:8000/) address.** diff --git a/vidgear/gears/asyncio/webgear_rtc.py b/vidgear/gears/asyncio/webgear_rtc.py index 3ae39b931..86cfc2a37 100644 --- a/vidgear/gears/asyncio/webgear_rtc.py +++ b/vidgear/gears/asyncio/webgear_rtc.py @@ -223,14 +223,14 @@ async def recv(self): # read video frame f_stream = None if self.__stream is None: - return None + raise MediaStreamError else: f_stream = self.__stream.read() # display blank if NoneType if f_stream is None: if self.blank_frame is None or not self.is_running: - return None + raise MediaStreamError else: f_stream = self.blank_frame[:] if not self.__enable_inf and not self.__reset_enabled: diff --git a/vidgear/tests/streamer_tests/asyncio_tests/test_webgear_rtc.py b/vidgear/tests/streamer_tests/asyncio_tests/test_webgear_rtc.py index cec35673f..a150c20ff 100644 --- a/vidgear/tests/streamer_tests/asyncio_tests/test_webgear_rtc.py +++ b/vidgear/tests/streamer_tests/asyncio_tests/test_webgear_rtc.py @@ -43,6 +43,7 @@ RTCSessionDescription, ) from av import VideoFrame +from aiortc.mediastreams import MediaStreamError from vidgear.gears.asyncio import WebGear_RTC from vidgear.gears.helper import logger_handler @@ -142,7 +143,7 @@ async def recv(self): # if NoneType if not grabbed: - return None + raise MediaStreamError # contruct `av.frame.Frame` from `numpy.nd.array` av_frame = VideoFrame.from_ndarray(frame, format="bgr24") @@ -187,7 +188,7 @@ async def recv(self): # if NoneType if not grabbed: - return None + raise MediaStreamError # contruct `av.frame.Frame` from `numpy.nd.array` av_frame = VideoFrame.from_ndarray(frame, format="bgr24") @@ -252,7 +253,8 @@ async def test_webgear_rtc_class(source, stabilize, colorspace, time_delay): await offer_pc.close() web.shutdown() except Exception as e: - pytest.fail(str(e)) + if not isinstance(e, 
MediaStreamError): + pytest.fail(str(e)) test_data = [ @@ -314,7 +316,7 @@ async def test_webgear_rtc_options(options): await offer_pc.close() web.shutdown() except Exception as e: - if isinstance(e, AssertionError): + if isinstance(e, (AssertionError, MediaStreamError)): logger.exception(str(e)) elif isinstance(e, requests.exceptions.Timeout): logger.exceptions(str(e)) @@ -396,7 +398,7 @@ async def test_webpage_reload(options): # shutdown await offer_pc.close() except Exception as e: - if "enable_live_broadcast" in options and isinstance(e, AssertionError): + if "enable_live_broadcast" in options and isinstance(e, (AssertionError, MediaStreamError)): pytest.xfail("Test Passed") else: pytest.fail(str(e)) @@ -414,7 +416,7 @@ async def test_webpage_reload(options): @pytest.mark.asyncio -@pytest.mark.xfail(raises=ValueError) +@pytest.mark.xfail(raises=(ValueError, MediaStreamError)) @pytest.mark.parametrize("server, result", test_data_class) async def test_webgear_rtc_custom_server_generator(server, result): """ @@ -448,7 +450,7 @@ async def test_webgear_rtc_custom_middleware(middleware, result): assert response.status_code == 200 web.shutdown() except Exception as e: - if result: + if result and not isinstance(e, MediaStreamError): pytest.fail(str(e)) @@ -493,7 +495,8 @@ async def test_webgear_rtc_routes(): await offer_pc.close() web.shutdown() except Exception as e: - pytest.fail(str(e)) + if not isinstance(e, MediaStreamError): + pytest.fail(str(e)) @pytest.mark.asyncio @@ -515,7 +518,7 @@ async def test_webgear_rtc_routes_validity(): async with TestClient(web()) as client: pass except Exception as e: - if isinstance(e, RuntimeError): + if isinstance(e, (RuntimeError, MediaStreamError)): pytest.xfail(str(e)) else: pytest.fail(str(e)) From f57a889be1fecbaae837522ecfed3de48b66b9ff Mon Sep 17 00:00:00 2001 From: abhiTronix Date: Wed, 1 Sep 2021 12:36:01 +0530 Subject: [PATCH 08/11] =?UTF-8?q?=F0=9F=9A=B8=20Docs:=20Added=20Gitter=20s?= =?UTF-8?q?idecard=20embed=20widget.?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - 🍱 Imported gitter-sidecar script to `main.html`. - 💄 Updated `custom.js` to set global window option. - 💄 Updated Sidecard UI in `custom.css`. - ⚰️ Removed dead code from docs. - ✏️ Fixed more typos. --- docs/bonus/reference/helper.md | 4 ++++ docs/bonus/reference/helper_async.md | 8 -------- docs/help/camgear_faqs.md | 10 +++------- docs/help/pigear_faqs.md | 2 +- docs/overrides/assets/javascripts/extra.js | 11 ++++++++++- docs/overrides/assets/stylesheets/custom.css | 7 +++++++ docs/overrides/main.html | 1 + vidgear/gears/__init__.py | 4 ++-- vidgear/gears/helper.py | 2 +- 9 files changed, 29 insertions(+), 20 deletions(-) diff --git a/docs/bonus/reference/helper.md b/docs/bonus/reference/helper.md index 20c94b625..2214e37f6 100644 --- a/docs/bonus/reference/helper.md +++ b/docs/bonus/reference/helper.md @@ -98,6 +98,10 @@ limitations under the License.   +::: vidgear.gears.helper.import_dependency_safe + +  + ::: vidgear.gears.helper.get_video_bitrate   diff --git a/docs/bonus/reference/helper_async.md b/docs/bonus/reference/helper_async.md index cfc329656..8e3e56b87 100644 --- a/docs/bonus/reference/helper_async.md +++ b/docs/bonus/reference/helper_async.md @@ -18,14 +18,6 @@ limitations under the License. 
=============================================== --> -::: vidgear.gears.asyncio.helper.logger_handler - -  - -::: vidgear.gears.asyncio.helper.mkdir_safe - -  - ::: vidgear.gears.asyncio.helper.reducer   diff --git a/docs/help/camgear_faqs.md b/docs/help/camgear_faqs.md index a2b394105..4aea0d91b 100644 --- a/docs/help/camgear_faqs.md +++ b/docs/help/camgear_faqs.md @@ -72,9 +72,7 @@ limitations under the License. ## How to change quality and parameters of YouTube Streams with CamGear? -CamGear provides exclusive attributes `STREAM_RESOLUTION` _(for specifying stream resolution)_ & `STREAM_PARAMS` _(for specifying underlying API(e.g. `youtube-dl`) parameters)_ with its [`options`](../../gears/camgear/params/#options) dictionary parameter. The complete usage example is as follows: - -**Answer:** See [this bonus example ➶](../camgear_ex/#using-variable-youtube-dl-parameters-in-camgear). +**Answer:** CamGear provides exclusive attributes `STREAM_RESOLUTION` _(for specifying stream resolution)_ & `STREAM_PARAMS` _(for specifying underlying API(e.g. `youtube-dl`) parameters)_ with its [`options`](../../gears/camgear/params/#options) dictionary parameter. See [this bonus example ➶](../camgear_ex/#using-variable-youtube-dl-parameters-in-camgear).   @@ -82,9 +80,7 @@ CamGear provides exclusive attributes `STREAM_RESOLUTION` _(for specifying strea ## How to open RSTP network streams with CamGear? -You can open any local network stream _(such as RTSP)_ just by providing its URL directly to CamGear's [`source`](../../gears/camgear/params/#source) parameter. The complete usage example is as follows: - -**Answer:** See [this bonus example ➶](../camgear_ex/#using-camgear-for-capturing-rstprtmp-urls). +**Answer:** You can open any local network stream _(such as RTSP)_ just by providing its URL directly to CamGear's [`source`](../../gears/camgear/params/#source) parameter. See [this bonus example ➶](../camgear_ex/#using-camgear-for-capturing-rstprtmp-urls).   @@ -102,7 +98,7 @@ You can open any local network stream _(such as RTSP)_ just by providing its URL ## How to synchronize between two cameras? -**Answer:** See [this issue comment ➶](https://github.com/abhiTronix/vidgear/issues/1#issuecomment-473943037). +**Answer:** See [this bonus example ➶](../camgear_ex/#synchronizing-two-sources-in-camgear).   diff --git a/docs/help/pigear_faqs.md b/docs/help/pigear_faqs.md index 30661a518..3c24814da 100644 --- a/docs/help/pigear_faqs.md +++ b/docs/help/pigear_faqs.md @@ -67,6 +67,6 @@ limitations under the License. ## How to change `picamera` settings for Camera Module at runtime? -**Answer:** You can use `stream` global parameter in PiGear to feed any `picamera` setting at runtime. See [this bonus example ➶](../pigear_ex/#setting-variable-picamera-parameters-for-camera-module-at-runtime). +**Answer:** You can use `stream` global parameter in PiGear to feed any `picamera` setting at runtime. See [this bonus example ➶](../pigear_ex/#setting-variable-picamera-parameters-for-camera-module-at-runtime)   \ No newline at end of file diff --git a/docs/overrides/assets/javascripts/extra.js b/docs/overrides/assets/javascripts/extra.js index 65c96542c..a09882de3 100755 --- a/docs/overrides/assets/javascripts/extra.js +++ b/docs/overrides/assets/javascripts/extra.js @@ -17,6 +17,8 @@ See the License for the specific language governing permissions and limitations under the License. 
=============================================== */ + +// DASH StreamGear demo var player_dash = new Clappr.Player({ source: 'https://rawcdn.githack.com/abhiTronix/vidgear-docs-additionals/dca65250d95eeeb87d594686c2f2c2208a015486/streamgear_video_segments/DASH/streamgear_dash.mpd', plugins: [DashShakaPlayback, LevelSelector], @@ -46,6 +48,7 @@ var player_dash = new Clappr.Player({ preload: 'metadata', }); +// HLS StremGear demo var player_hls = new Clappr.Player({ source: 'https://rawcdn.githack.com/abhiTronix/vidgear-docs-additionals/abc0c193ab26e21f97fa30c9267de6beb8a72295/streamgear_video_segments/HLS/streamgear_hls.m3u8', plugins: [HlsjsPlayback, LevelSelector], @@ -81,6 +84,7 @@ var player_hls = new Clappr.Player({ preload: 'metadata', }); +// DASH Stabilizer demo var player_stab = new Clappr.Player({ source: 'https://rawcdn.githack.com/abhiTronix/vidgear-docs-additionals/fbcf0377b171b777db5e0b3b939138df35a90676/stabilizer_video_chunks/stabilizer_dash.mpd', plugins: [DashShakaPlayback], @@ -97,4 +101,9 @@ var player_stab = new Clappr.Player({ parentId: '#player_stab', poster: 'https://rawcdn.githack.com/abhiTronix/vidgear-docs-additionals/94bf767c28bf2fe61b9c327625af8e22745f9fdf/stabilizer_video_chunks/hd_thumbnail_2.png', preload: 'metadata', -}); \ No newline at end of file +}); + +// gitter sidecard +((window.gitter = {}).chat = {}).options = { + room: 'vidgear/community' +}; \ No newline at end of file diff --git a/docs/overrides/assets/stylesheets/custom.css b/docs/overrides/assets/stylesheets/custom.css index 32f12528d..a04895b69 100755 --- a/docs/overrides/assets/stylesheets/custom.css +++ b/docs/overrides/assets/stylesheets/custom.css @@ -207,6 +207,13 @@ limitations under the License. width: 80%; } +/* Handles Gitter Sidecard UI */ +.gitter-open-chat-button { + background-color: var(--md-primary-fg-color) !important; + font-family: inherit !important; + font-size: 12px; +} + .center-small { display: block; margin-left: auto; diff --git a/docs/overrides/main.html b/docs/overrides/main.html index 969e3c710..1c8d80468 100644 --- a/docs/overrides/main.html +++ b/docs/overrides/main.html @@ -27,6 +27,7 @@ + {% endblock %}