I think it would work, but I have only tried one camera feeding six HABPanels, and my Odroid C2 handled it with ease. It has a true gigabit network port, 2 GB of RAM, and more CPU grunt than a Pi 3. A single stream from one camera appears to increase CPU load by only 1% on an ARM processor. However, it is about more than just CPU load: as network traffic in and out of a Raspberry Pi's network port grows, it will at some stage become saturated. I am interested in hearing what people find, but for cameras that can produce an MJPEG stream and make it accessible via HTTP, the binding is working well.
Not currently, but that is a good idea for a future feature. Each IP needs to be entered in its own set of brackets.
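As a hedged sketch of what that looks like in a `.things` file: the thing type and parameter names below (`HTTPONLY`, `IPADDRESS`, `PASSWORD`) are from memory of the binding's readme at the time and may differ in your binding version, so check the readme first.

```
Thing ipcamera:HTTPONLY:camera1 [ IPADDRESS="192.168.1.50", PASSWORD="passwordForCamera1" ]
Thing ipcamera:HTTPONLY:camera2 [ IPADDRESS="192.168.1.51", PASSWORD="passwordForCamera2" ]
```

Each camera gets its own `Thing` line, with its IP inside its own set of brackets.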
Thanks for the info.
I have this working with the EZVIZ camera (Mini O).
Live video can be viewed with Basic UI on the PC but doesn’t work on the iOS app or Safari.
It DOES work with Chrome on iOS, but only when on the same network. We could do with a major openHAB iOS app overhaul…please…
I haven’t setup Habpanel yet to test it.
The password for the EZVIZ camera is printed on the bottom of the camera, and you have to disable encryption in the EZVIZ app.
Commands for the Pi:
Start the server with:
Then start the video conversion:
(Replace `password` with the camera password and `xxx` with your camera IP.)
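The actual commands are missing from this copy of the post. As a hedged stand-in (NOT the poster's original two-step server-plus-conversion commands), here is one way to do it with a single modern ffmpeg, which can act as its own HTTP server via `-listen 1` (ffserver was removed in FFmpeg 4.0). The RTSP path is a Hikvision/EZVIZ-style guess and may differ on your camera.

```shell
# Hedged example, not the original commands. Pull the camera's H.264 RTSP
# stream, convert it to MJPEG, and serve it over HTTP from ffmpeg itself.
# Replace password and xxx as described above.
ffmpeg -rtsp_transport tcp \
  -i "rtsp://admin:password@192.168.1.xxx:554/h264/ch1/main/av_stream" \
  -q:v 5 -r 10 -an \
  -f mpjpeg -listen 1 http://0.0.0.0:8081/stream.mjpg
```

Then point HABPanel or a browser at `http://<pi-ip>:8081/stream.mjpg`. Note that `-listen 1` serves a single client at a time.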
So you are asking why a $$$ dedicated x86 machine works and a $35 ARM board that is trying to do multiple roles is not as good?
You can enable hardware acceleration for FFmpeg, but it most likely means you need to compile ffmpeg from source. Sorry, I am not support; you will need to google how. https://trac.ffmpeg.org/wiki/HWAccelIntro
If you used a newer x86-based device for your openHAB server, one which uses Intel's QuickSync, you would have fewer issues and far better performance. Blue Iris also benefits from this, and you probably have it on your server. https://trac.ffmpeg.org/wiki/Hardware/QuickSync
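As a hedged sketch (not from the original post), a QuickSync-assisted conversion might look roughly like this; it assumes an ffmpeg build with libmfx/QSV support and a made-up camera URL.

```shell
# Hedged sketch: requires an ffmpeg build with QuickSync (libmfx) enabled.
# Check availability first with:  ffmpeg -decoders | grep qsv
# The expensive H.264 decode runs on the Intel iGPU; the MJPEG encode
# here stays in software.
ffmpeg -hwaccel qsv -c:v h264_qsv \
  -i "rtsp://admin:password@192.168.1.100:554/stream" \
  -q:v 5 -f mpjpeg -listen 1 http://0.0.0.0:8081/stream.mjpg
```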
Since you are using a camera that will work with the IpCamera binding, why don't you use MJPEG and stream through the binding? I am finding it only increases CPU load by 1% here and is easy to set up and use. I would only look at using this FFmpeg method if the camera only supports H.264 or does not provide an HTTP way of serving a stream.
@JackNJack and @sujitrp
The instructions are in the first post of this thread; however, I never bothered to write a guide on how to autostart it, as I was planning what I will recommend you do instead. The IpCamera binding can now do ffmpeg conversions and serve the files. This works on all platforms, not just Linux, and is easier to set up. See the readme file for the binding.
ffmpeg does use QuickSync as well as many other forms of de/encoding; see their website for info on this. Most of what the IpCamera binding does involves no encoding: it only repackages the camera's packets into another container, so it does not use much CPU. But if you are using the method in this thread to convert H.264 into MJPEG, then hardware de/encoding will help.
If you have a Dahua camera, then make sure you use the IpCamera binding (search this forum), as it gives you the smart alarms and more…
H.264 converts into HLS streams with low CPU, so try that format. It works great in Apple browsers and the app, and it can work in other browsers if you google how to install a plugin for your browser. For cameras that only give low-res MJPEG streams, this allows high res via the HLS format.
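As a hedged illustration of why HLS is cheap (this is not the binding's internal command, and the RTSP URL is made up): stream copy repackages the H.264 into HLS segments without re-encoding, so CPU use stays low.

```shell
# Hedged illustration only. -c copy repackages H.264 into HLS segments
# without re-encoding; delete_segments keeps disk use bounded.
ffmpeg -rtsp_transport tcp \
  -i "rtsp://admin:password@192.168.1.100:554/h264/ch1/main/av_stream" \
  -c copy -an \
  -f hls -hls_time 2 -hls_list_size 4 -hls_flags delete_segments \
  /tmp/ipcamera.m3u8
```

Safari and the iOS app can play the resulting `.m3u8` playlist natively, which is why this format suits Apple devices so well.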
Also, I will try a newer Netty version for your errors in the next build. The MJPEG stream does not use ffmpeg.