How to display RTSP streams from IP Cameras in Openhab and Habpanel (Linux only)

1: I think it would work, but I have only tried one camera feeding six HabPanels, and my Odroid C2 handled it with ease. It has a true 1 Gb network card, 2 GB of RAM, and more CPU grunt than a Pi 3. A single stream from one camera appears to increase CPU load by only 1% on an ARM processor; however, it is about more than CPU load, since growing traffic in and out of the network card on a Raspberry Pi will at some point saturate it. I am interested in hearing what people find, but for cameras that can produce an MJPEG stream and make it accessible via HTTP, the binding is working well.

  1. Not currently, but that is a good idea for a future feature. Each IP needs to be entered in its own set of brackets.

Thanks for the info.
I have this working with the EZVIZ camera (Mini O).
Live video can be viewed with Basic UI on the PC but doesn’t work on the iOS app or Safari.
It DOES work with Chrome on iOS, but only when on the same network. We could do with a major iOS openHAB app overhaul…please…

I haven’t set up HabPanel yet to test it.

The password for the EZVIZ camera is printed on the bottom of the camera and you have to disable encryption in the EZVIZ app.

Commands for the Pi
Start the server with

ffserver

Then start the video conversion
(replace password with the camera password and xxx with your camera IP)

ffmpeg -i 'rtsp://admin:password@192.168.0.xxx:554' http://127.0.0.1:8090/camera1.ffm
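
ffserver also needs a config file (usually /etc/ffserver.conf) that defines the feed ffmpeg pushes into and the MJPEG stream the sitemap pulls out. A minimal sketch matching the camera1.ffm and camera1.mjpg names used in this thread — the frame rate, size, and bitrate values are illustrative assumptions, so tune them for your camera:

```
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 10000

<Feed camera1.ffm>
File /tmp/camera1.ffm
FileMaxSize 5M
</Feed>

<Stream camera1.mjpg>
Feed camera1.ffm
Format mpjpeg
VideoFrameRate 5
VideoSize 640x480
VideoBitRate 2048
NoAudio
</Stream>
```

Lowering VideoFrameRate and VideoSize is the easiest way to trade quality for CPU load.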

Text for the sitemap

Image url="http://openhabianpi:8090/camera1.mjpg"

Keep an eye on the CPU temp, as I don’t have a fan and mine was up to 76 °C.
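
To avoid retyping both commands after every reboot, a small wrapper script can launch them in order. This is a dry-run sketch that only prints the commands it would run (the password and xxx placeholders are from the commands above); drop the echo prefixes and add a short sleep between the two to actually start them:

```shell
#!/bin/sh
# Dry-run sketch of a start-up wrapper for ffserver + ffmpeg.
# Replace password and xxx with your camera's details, then remove
# the echo prefixes to launch the real processes.
CAM_URL='rtsp://admin:password@192.168.0.xxx:554'
FEED_URL='http://127.0.0.1:8090/camera1.ffm'
echo "ffserver"
echo "ffmpeg -i $CAM_URL $FEED_URL"
```
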


Hi All
Got this working, but it craps out early. I tried using -rtsp_transport tcp or udp, but the CPU load is too high for one camera, around 20%.

It seems like lots of packets are being dropped, which kills the stream.

Any idea why? Blue Iris works fine with the same cameras; I never have an issue.

See this at the start of a stream:

[rtsp @ 0x562f06b758c0] max delay reached. need to consume packet
[rtsp @ 0x562f06b758c0] RTP: missed 41 packets

So you are asking why a $$$ dedicated x86 machine works well and a $35 ARM board that is trying to do multiple roles is not as good?

You can enable hardware acceleration for ffmpeg, but you will need to google how, and it most likely means compiling ffmpeg from source. Sorry, I am not support.
https://trac.ffmpeg.org/wiki/HWAccelIntro
If you used a newer x86-based device for your openHAB server, one which supports Intel’s Quick Sync, you would have fewer issues and far better performance. Blue Iris also benefits from this, and you probably have it on your server.
https://trac.ffmpeg.org/wiki/Hardware/QuickSync
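
If your ffmpeg build has Quick Sync compiled in, the conversion command from earlier in this thread can offload the H.264 decode to the hardware. A hedged sketch, printed dry-run style: -hwaccel qsv and -c:v h264_qsv are standard ffmpeg QSV options, but they only function on a build with libmfx support (see the wiki links above), and the camera address is the same placeholder as before.

```shell
#!/bin/sh
# Dry-run sketch: the same relay command, but decoding H.264 with
# Intel Quick Sync instead of the CPU. Requires an ffmpeg built with
# QSV (libmfx) support; replace password and xxx with your details.
CAM_URL='rtsp://admin:password@192.168.0.xxx:554'
echo "ffmpeg -hwaccel qsv -c:v h264_qsv -i $CAM_URL http://127.0.0.1:8090/camera1.ffm"
```
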

A low-cost, low-power system that has Quick Sync is the Odroid H2, but it has supply issues at the moment.
https://www.hardkernel.com/shop/odroid-h2/

Since you are using a camera that works with the IpCamera binding, why don’t you use MJPEG and stream via the binding? I am finding it only increases CPU load by 1% here, and it is easy to set up and use. I would only look at this ffmpeg method if the camera only supports H.264 or does not provide an HTTP method of serving a stream.
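
For reference, a things-file entry for the binding might look like the sketch below. Treat every name here as an assumption: the thing type and parameter names differ between binding versions, so verify them against the binding’s readme before using this.

```
// Hypothetical things-file entry; check names against the binding's readme.
Thing ipcamera:dahua:cam1 [ ipAddress="192.168.0.xxx", username="admin", password="password" ]
```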

Well, using a digest-authentication Python script also works, but it is heavy on CPU.

Does the binding allow for the removal of authentication, to present a plain HTTP URL for HabPanel? I assume yes?

Heavy CPU load will always be the case if you are converting H.264 into MJPEG, unless you use hardware acceleration. Video resolutions will probably increase at the same rate that CPUs get faster.

Yes the binding can work for you. See here


Lovely, I got one camera working.

I’ll do some more testing! Looks good, Matt!


Hi @matt1, got it all working for video! Nice work on the binding. The only issue is that with 9 cameras on a quad-core Xeon Cisco UCS blade, it hits 60–90% CPU when opening the camera page, hah!

I might try tuning down the quality settings.

Can you please help me with ffmpeg & ffserver installation instructions? I have tried various options but no luck.
I have a Raspberry Pi 3B+ with USB boot for this setup and a Pi 3B for openHAB.

Please provide a procedure or a link on how to do it.

@JackNJack and @sujitrp
The instructions are in the first post of this thread. However, I never bothered to write a guide on how to autostart it, as I was planning what I will recommend you do instead: the IpCamera binding can now do ffmpeg conversions and serve the files. This works on all platforms, not just Linux, and is easier to set up. See the readme file for the binding.


Thanks @matt1 for detailing this! I have 5 Dahua cams (6 MP each) to set up on my openHAB panel.


Hi Matt

Does this make use of Intel Quick Sync hardware decoding for H.264/H.265? Thx

@dastrix80
ffmpeg does use Quick Sync, as well as many other forms of de/encoding; see their website for info on this. Most of what the IpCamera binding does is not encoding: it only repackages the camera’s packets into another container, so it does not use much CPU. But if you are using the method in this thread to convert H.264 into MJPEG, then hardware de/encoding will help.

@OpenHabitat
If you have Dahua cameras, make sure you use the IpCamera binding (search this forum for it), as it gives you the smart alarms and more…

IpCamera: New IP Camera Binding

Hi Matt

Cool. I want to display a number of streams in a widget without destroying the CPU of the OH2 box. I’ll give it a go!

I’m using your binding already to display a number of streams, but it’s the older version without the ffmpeg support.

H.264 turns into HLS streams with low CPU load, so try that format. It works great in Apple browsers and the app, and it can work in other browsers if you google how to install a plugin for your browser. For cameras that only give low-res MJPEG streams, this allows high res via the HLS format.
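
A sitemap entry along these lines should display the HLS stream. This is a sketch only: the port and file name depend on the binding’s server settings, so check the binding’s readme for the exact URL; the encoding="hls" option is what tells openHAB’s Video element to treat the URL as HLS.

```
// Hypothetical sitemap entry; port and path depend on the binding's settings.
Video url="http://openhabianpi:54321/ipcamera.m3u8" encoding="hls"
```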

Also, I will try a newer Netty version for your errors in the next build. The MJPEG stream does not use ffmpeg.

I recommend using MPEG-DASH (the open standardised counterpart to HLS), since that will work in all browsers. Nginx also has full support for creating the DASH segments. I’m doing that here: https://github.com/imaginator/system/tree/master/salt/motion/files

That is what I am looking into at the moment. What browsers have you used it in, and do they need a plugin for support?

As long as they support Media Source Extensions (https://caniuse.com/#feat=mediasource), you have no need for a plugin. See: https://github.com/Dash-Industry-Forum/dash.js/

Hi Matt

I no longer have the error since moving to a QNAP virtual machine (nor any CPU issues that I’m aware of).
Cheers
