IpCamera: New IP Camera Binding

New Build 2020-02-11 has these changes:

  • If you wish to force ffmpeg to be used for mjpeg, you can use STREAM_URL_OVERRIDE="ffmpeg". This will also remove a warning that occurs if you leave it blank.
  • Reolink: changed how the snapshot is handled. @Axel_Kummerlowe, can you test this version if you get a chance, please? If it does not work, TRACE log output over a few snapshot requests would be helpful.

Thanks @matt1.

I have downloaded the latest version and used: STREAM_URL_OVERRIDE="ffmpeg"

I am only getting a new frame about every 3 seconds… how do I fix that?

Thing ipcamera:HTTPONLY:001 [
    IPADDRESS="192.168.x.x",
    USERNAME="admin",
    PASSWORD="password",
    POLL_CAMERA_MS=200,
    SERVER_PORT=54321,
    IP_WHITELIST="DISABLE",
    FFMPEG_INPUT="rtsp://admin:password@192.168.x.x:554",
    FFMPEG_OUTPUT="/etc/openhab2/html/camera1/",
    UPDATE_IMAGE=true,
    STREAM_URL_OVERRIDE="ffmpeg",
    FFMPEG_HLS_OUT_ARGUMENTS="-strict -2 -acodec copy -vcodec copy -hls_flags delete_segments -segment_list_flags live -flags -global_header -hls_time 1 -hls_allow_cache 0 -hls_list_size 3"
]

On my HABPanel I have also set it to refresh every 200 ms.

If I use the new option STREAM_URL_OVERRIDE="ffmpeg", how can I configure the parameters (e.g. resolution) for the mjpeg stream?

When I use the mjpeg stream the camera itself provides, I have to configure this in the camera, because the binding only passes it through. With this new option, can I configure it separately from the snapshot and HLS parameters?

@Marius_van_Belkum
You can't go below 1000; you will get an error in your logs telling you this is not a valid value.
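
For the Thing definition posted above, that means raising the poll setting to at least:

POLL_CAMERA_MS=1000,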

@halloween
That would be the plan if people find it useful: to give it its own config to override the default with. So far I have not had much feedback on the feature and no one has requested it, so I have been busy coding things that I need for my own setup.
If you have a camera that can create its own mjpeg, why do you want to use this? It should be a 6 fps stream at full resolution if your CPU can keep up. My guess is that lowering the resolution will use more CPU, not less; you would need to provide a stream that is already lower resolution for it to lower the CPU load, if that is what you are after.

Hi, my own camera has mjpeg, I don't need that here.

But a friend of mine didn't listen to my advice to buy a camera with mjpeg capability (it was about 20 euros more expensive…), so he bought the cheapest China cam and now he only has an rtsp stream (2 streams with different resolutions). No mjpeg, no snapshots. Snapshots work now with your fabulous binding, but he wants mjpeg for his openHAB sitemap too. The new snapshot stream only has about 1 fps; that's not enough for him. His openHAB server with ffmpeg installed is a Raspberry Pi 4 with 4 GB RAM.

That should already work if you ask the binding for ipcamera.mjpeg, but currently it will be fixed to the same resolution as the RTSP stream that is given to the binding, and also to 6 FPS.
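
For example, with SERVER_PORT=54321 as in the Thing config earlier in this thread, the stream would be requested from the openHAB machine like this (the IP address is a placeholder):

http://192.168.x.x:54321/ipcamera.mjpeg
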
IMHO there are probably only 2 things that make sense to tweak and play with, unless you want to crop or resize for various reasons (see the example flags after this list):

  1. FPS: increasing it above 6 fps will increase CPU load, so there is a trade-off.
  2. JPEG compression: lowering the network load by trading off picture quality against higher amounts of jpeg compression.
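
For reference, if the binding does grow its own mjpeg config, the standard ffmpeg flags behind those two knobs would look something like this. A sketch only; the rtsp input is a placeholder and the trailing - sends the multipart stream to stdout:

ffmpeg -i rtsp://192.168.x.x:554 -r 10 -q:v 5 -vf scale=640:-2 -an -f mpjpeg -

-r sets the output frame rate, -q:v sets the jpeg quality (2 is best, 31 is the heaviest compression), and -vf scale=640:-2 resizes to 640 pixels wide while keeping the aspect ratio.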

As for CPU load, I published some tests a while back on what to expect on an ARM based system here.

New build 2020-02-17 has these changes:

  • New channel externalMotion for allowing external sensors to trigger motion alarms. Cameras without built-in motion detection can use a Zwave PIR, for example, and trigger the binding as if the sensor were part of the camera (see the example rule below).
  • New experimental group thing type for viewing multiple cameras as a single camera.
  • All cameras now have the channel lastMotionType

See readme for info on any of these.
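
As a sketch of how externalMotion could be wired up in a DSL rules file, assuming the channel is linked to a Switch item called Camera1_ExternalMotion and the PIR to an item called ZwavePIR_Motion (both item names are made up):

rule "Forward Zwave PIR to camera motion alarm"
when
    Item ZwavePIR_Motion changed to ON
then
    Camera1_ExternalMotion.sendCommand(ON)
end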

Hi

First, great binding. :smile:

My HLS stream is about 8-10 seconds behind; is this normal? It is running on a Pi 3B+. Would it be more live if I ran it on a Pi 4 instead?

How do I change how much of the HLS stream is kept before it gets overwritten?

Glad you find it useful and like it.

No, upgrading won't change a thing, other than giving you more RAM, twice the CPU power, and a network interface that is not as limiting, which are worth it if you have multiple cameras.

To lower the delay (there are a few posts on what to use only around 10 posts above this one):

  1. Set all cameras to create a keyframe (i-frame) every second; refer to their manuals, support and Google for how.
  2. Change the HLS arguments to create segments that are 1 second in length.
  3. Limit it to 2-3 segments in the HLS playlist (see the example arguments after this list).
  4. Read up on the topic, as you will need to know how HLS works to understand how to tweak it. Some browsers will stop working with some settings, and others will cache ahead before they play, which adds a delay.
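
For steps 2 and 3, the FFMPEG_HLS_OUT_ARGUMENTS in the Thing config earlier in this thread already does exactly this; the relevant flags are:

-hls_time 1 -hls_list_size 3 -hls_flags delete_segments

-hls_time 1 requests 1 second segments (ffmpeg can only cut a segment on a keyframe, which is why step 1 matters), -hls_list_size 3 keeps only 3 segments in the playlist, and delete_segments removes old segment files from disk.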

Thanks

Could you give me a guide for steps 2 and 3?

Around 11 posts up…

So this weekend I moved my openHAB server from a Pi 3 to my proper server. I installed it into VirtualBox and it's been running happily there.

So I removed everything related to HLS and went back to good old mjpeg. I set the stream URL override to ffmpeg and let the binding generate the mjpeg stream. Sadly it performs even worse than HLS: it still has a 5 second delay, a worse framerate with very obvious skipping, and it crashes the web browser on my cheap Android tablet after a few minutes.

So I disabled the binding entirely and used my “old” way: setting up ffserver with this config (only the relevant part is here):

<Stream camera6.mjpg>
# read from the feed that the ffmpeg command below pushes into
Feed camera6.ffm
# serve it as multipart mjpeg, re-encoded to 8 fps at 725x450
Format mpjpeg
VideoFrameRate 8
VideoSize 725x450
VideoBitRate 2540
# mjpeg frames are all keyframes; no audio in the output
VideoIntraOnly
NoAudio
Strict -1
</Stream>

And then starting ffmpeg with the following parameters:

ffmpeg -hide_banner -loglevel panic -nostats -i 'rtsp://....' http://127.0.0.1:8090/camera6.ffm
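
(-hide_banner, -loglevel panic and -nostats just silence ffmpeg's console output; the http URL at the end pushes the stream into the camera6.ffm feed that ffserver reads.)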

And this works very nicely. The tablet doesn't crash, the stream delay is about 2 seconds, … Sadly, using ffserver is a bad idea since it has been removed from newer ffmpeg releases, so it's not a long term solution.

@roli
Thanks for reporting some feedback. Sadly I cannot reproduce that here; these are the results of testing lag behind realtime using 3 different cameras, all made in different factories.

HTTP based mjpeg from the camera, through the binding and to my iPhone over WiFi = 0.5 seconds. Not a misprint, it is around half a second of delay. Only API cameras can do this.

ffmpeg creating the mjpeg, routed through the binding and to my iPhone over WiFi = 1 second delay after it had been running for a while. It was possibly 2 to 2.5 seconds behind when it first started, as ffmpeg takes some time to get up to full speed, and yes, during that time I did see some minor stuttering glitches. Because the stream was not closing down, due to a bug (already fixed for the next build), I left and came back to the stream and it was perfect and only 1 second behind real time. Since you're running ffmpeg non-stop with that method, it is not a direct comparison to something that starts and stops the stream on demand whilst a browser is waiting. If the binding produces the stream ahead of time, it works great with only a 1 second delay.

Using a camera that can supply a snapshot, and then retesting using ffmpeg to create the snapshot from rtsp, I got the same results both ways:

ipcamera.jpg with a 1 second refresh was 1 second behind real time.

snapshots.mjpeg with a 1 second poll was 1 second behind in a PC's browser, but on iOS it seemed to buffer a frame and was an extra frame behind.
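
For anyone wanting to reproduce this in a sitemap, a minimal sketch (the IP address and port are placeholders matching the example config earlier in the thread):

Image url="http://192.168.x.x:54321/ipcamera.jpg" refresh=1000
Video url="http://192.168.x.x:54321/snapshots.mjpeg" encoding="mjpeg"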

So it is really only HLS that has any lag worth mentioning, and that comes purely from the design of HLS and how it works. I am yet to run a test to see how far I can push HLS, but my main cameras can create keyframes at any interval that I wish, so it will be an interesting experiment when I find the time to play.

My Android tablet has had no issues when I use it, and when it charges I can confirm it does the same as above.

Matt, do you know if the Google Nest Hub will accept anything other than HLS?

Wow… it seems I didn't visit this for a few weeks and so much has been added…
So I'm using HLS in the iOS app/sitemaps.
Keep up the great work.

@matt1 What's the recommended way now for HABPanel?

The binding can cast:

  • jpg
  • gif
  • HLS

However, at any time Google could change something in a firmware update.
Mjpeg appears to be possible; I just need a trace of a working stream to work out what is missing. It may be possible to pass the streams through nginx and get them working, I just lack the time to try, so if someone has it working and knows how to do a Wireshark or similar capture, it would be great to hear from you. HLS is the only way to do it with audio.

I don't use HABPanel; I cast most of my stuff or use automations to do things automatically. The snapshot based mjpeg streams are the most likely to stay open for extended periods of time without stopping. Cameras may stop the stream on you, but snapshots are not likely to be stopped. Give them all a test and see what works best for you; I would love to hear the results of any testing, as I have not done any.

So far snapshots seem the best option.
Mjpeg streams don't close the ffmpeg process when not in use, so I soon hit 100% CPU usage.
Still trying to get HLS working in there.

It would also be nice if the developer of the iOS app integrated webkit-playsinline so we can have inline HLS streams (using a web view). I know it was just rewritten in Swift, so I'll drop him a message when I get a chance to find the post.

That’s a bug that is fixed for the next build.

Also, someone posted a way to get HLS working by placing it in a HTML file with a few lines of code. Search this thread, maybe a month back. I did it a year ago and from memory that is how I did it too, so I probably have notes somewhere.

edit:

@matt1 thanks for that. Could you possibly add it to the readme, as I'm sure I can't be the only one.

You can also do it in sitemaps like this to get cams side by side; however, it only works through the dashboard, as the iOS app, as I suspected, doesn't integrate webkit-playsinline. This means all streams open a new fullscreen player in Safari (I have opened this on git).

<!DOCTYPE html>
<html>
	<body>
		<div style="width: 50%; float: left;">
			<video playsinline autoplay muted controls style="width: 100%;" src="http://192.168.6.4:50001/ipcamera.m3u8"></video>
		</div>
		<div style="width: 50%; float: left;">
			<video playsinline autoplay muted controls style="width: 100%;" src="http://192.168.6.4:50002/ipcamera.m3u8"></video>
		</div>
		<div style="width: 50%; float: left;">
			<video playsinline autoplay muted controls style="width: 100%;" src="http://192.168.6.4:50003/ipcamera.m3u8"></video>
		</div>
		<div style="width: 50%; float: left;">
			<video playsinline autoplay muted controls style="width: 100%;" src="http://192.168.6.4:50004/ipcamera.m3u8"></video>
		</div>
	</body>
</html>
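
To use this from a sitemap, one approach (a sketch; the file name is made up) is to save the HTML in the openHAB conf/html folder, which is served under /static/, and point a Webview at it:

Webview url="http://192.168.6.4:8080/static/cameras.html" height=12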

Hello everyone! First of all, thanks for this binding! Very nice job!
I'm trying to configure this binding but I'm stuck on some settings…
I don't understand if the problem is my webcam (a Dahua sub-brand named Imou) or some wrong settings I'm using in the binding.
All settings were made via PaperUI, so no typos were made on the command line.

Here are some screenshots: [screenshots were attached here]

And here is a screenshot of ONVIF Device Manager: [screenshot attached]

Summarizing: the rtsp URL is right, because I can see the realtime stream in VLC with that URL, but in openHAB I have no images and no control over any of the webcam's settings…

Thanks to whoever has 5 minutes to help a little noob :slight_smile: