Mirror of https://github.com/pikvm/ustreamer.git
Update README.md

commit 5f43fb7d0a, parent 86fada184c
@@ -21,7 +21,7 @@
Footnotes:
* ```1``` Long before µStreamer, I made a [patch](https://github.com/jacksonliam/mjpg-streamer/pull/164) to add DV-timings support to mjpg-streamer and to keep it from hanging up on device disconnection. Alas, the patch is far from perfect and I can't guarantee it will work every time - mjpg-streamer's source code is very complicated and its structure is hard to understand. With this in mind, along with needing multithreading and JPEG hardware acceleration in the future, I decided to make my own stream server from scratch instead of supporting legacy code.
-* ```2``` This feature allows to cut down outgoing traffic several-fold when broadcasting HDMI, but it increases CPU usage a little bit. The idea is that HDMI is a fully digital interface and each captured frame can be identical to the previous one bite-wise. There's no need to broadcast the same image over the net several times a second. With the `--drop-same-frames=20` option enabled, µStreamer will drop all the matching frames (with a limit of 20 in a row). Each new frame is matched with the previous one first by length, then using ```memcmp()```.
+* ```2``` This feature allows to cut down outgoing traffic several-fold when broadcasting HDMI, but it increases CPU usage a little bit. The idea is that HDMI is a fully digital interface and each captured frame can be identical to the previous one byte-wise. There's no need to broadcast the same image over the net several times a second. With the `--drop-same-frames=20` option enabled, µStreamer will drop all the matching frames (with a limit of 20 in a row). Each new frame is matched with the previous one first by length, then using ```memcmp()```.
* ```3``` As µStreamer was made mainly to be used with screencast hardware, it supports the video formats such devices usually provide. MJPG input means that the screencast device can compress images to JPEG and feed them to the software, which cuts down CPU usage and avoids software image encoding. This video format is supported by most webcams, but I've never seen it supported by screencast hardware: neither [Auvidea B101](https://auvidea.com/b101-hdmi-to-csi-2-bridge-15-pin-fpc/) nor [EasyCap UTV 007](https://www.amazon.com/dp/B0126O0RDC) offers it. It's not hard to add support for hardware MJPG sources; it's just not done yet.
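To make the frame-dropping logic from footnote 2 concrete, here is a minimal sketch. It is not µStreamer's actual code: the `frame_t` type, its field names, and the helper function are all hypothetical, but the comparison order (length first, then ```memcmp()```) and the cap on consecutive drops follow the footnote.

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

// Hypothetical frame holder; the real structures in µStreamer differ.
typedef struct {
    const unsigned char *data;
    size_t used; // number of valid bytes in data
} frame_t;

// Decide whether the current frame may be skipped: it must be byte-identical
// to the previous one, and the run of consecutive drops must stay below the
// limit (e.g. 20 for --drop-same-frames=20).
static bool may_drop_frame(const frame_t *prev, const frame_t *cur,
                           unsigned *dropped_in_row, unsigned limit) {
    if (prev->used == cur->used                               // cheap length check first
            && memcmp(prev->data, cur->data, cur->used) == 0  // then full byte compare
            && *dropped_in_row < limit) {
        ++*dropped_in_row;
        return true;  // identical frame within the limit: skip sending it
    }
    *dropped_in_row = 0;  // frame differs or limit reached: send it, reset the run
    return false;
}
```

Comparing lengths first keeps the common case cheap: ```memcmp()``` only runs when the two buffers are the same size.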
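Footnote 3 hinges on whether the capture device can hand frames to the software already compressed as JPEG. One way to see what a given V4L2 device actually offers is to enumerate its formats; the sketch below is an independent example (the device path is an assumption), not part of µStreamer.

```c
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

// Enumerate the pixel formats of a V4L2 capture device and flag JPEG/MJPEG,
// i.e. formats that would let the hardware do the compression for us.
int main(void) {
    const char *dev = "/dev/video0";  // assumption: adjust to your device
    int fd = open(dev, O_RDWR);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    struct v4l2_fmtdesc desc;
    memset(&desc, 0, sizeof(desc));
    desc.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

    for (desc.index = 0; ioctl(fd, VIDIOC_ENUM_FMT, &desc) == 0; ++desc.index) {
        int jpeg = (desc.pixelformat == V4L2_PIX_FMT_MJPEG
                    || desc.pixelformat == V4L2_PIX_FMT_JPEG);
        printf("%s%s\n", (const char *)desc.description,
               jpeg ? "  <-- hardware JPEG source" : "");
    }

    close(fd);
    return 0;
}
```

If neither JPEG format shows up (as with the devices named in the footnote), the stream server has to encode frames in software, which is exactly the CPU cost the footnote describes.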
@@ -44,7 +44,7 @@ $ make
$ ./ustreamer --help
```
-AUR has a pre-built package for Arch Linux: https://aur.archlinux.org/packages/ustreamer
+AUR has a package for Arch Linux: https://aur.archlinux.org/packages/ustreamer
-----
# Usage