IpCamera: New IP Camera Binding

Just tried again with the latest version from today.
Still get permission denied when trying to create a GIF.

2020-02-01 16:43:26.845 [ERROR] [ing.ipcamera.handler.IpCameraHandler] - FileNotFoundException {}
java.io.FileNotFoundException: /etc/openhab2/html/camera1/snapshot5.jpg (Permission denied)
	at java.io.FileOutputStream.open0(Native Method) ~[?:1.8.0_222]
	at java.io.FileOutputStream.open(FileOutputStream.java:270) ~[?:1.8.0_222]
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213) ~[?:1.8.0_222]
	at java.io.FileOutputStream.<init>(FileOutputStream.java:162) ~[?:1.8.0_222]
	at org.openhab.binding.ipcamera.handler.IpCameraHandler.storeSnapshots(IpCameraHandler.java:1103) [bundleFile:?]
	at org.openhab.binding.ipcamera.handler.IpCameraHandler.setupFfmpegFormat(IpCameraHandler.java:1182) [bundleFile:?]
	at org.openhab.binding.ipcamera.handler.IpCameraHandler$3.run(IpCameraHandler.java:1735) [bundleFile:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_222]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_222]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_222]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_222]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_222]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_222]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_222]

This is what happens when I start openhab:

2020-02-01 16:46:14.762 [hingStatusInfoChangedEvent] - 'ipcamera:HTTPONLY:Reolink' changed from INITIALIZING to ONLINE
2020-02-01 16:46:14.762 [INFO ] [ing.ipcamera.handler.IpCameraHandler] - IP Camera at 10.4.28.202 is now online.
2020-02-01 16:46:14.998 [INFO ] [ing.ipcamera.handler.IpCameraHandler] - IpCamera file server for camera 10.4.28.202 has started on port 54321 for all NIC's.
2020-02-01 16:46:15.002 [INFO ] [ing.ipcamera.handler.IpCameraHandler] - Binding has no snapshot url, and is set to always update images. Using your CPU to create snapshots with Ffmpeg.
2020-02-01 16:46:15.005 [vent.ItemStateChangedEvent] - FrontdoorCamera_HLSURL changed from NULL to http://10.4.28.221:54321/ipcamera.m3u8
2020-02-01 16:46:15.006 [vent.ItemStateChangedEvent] - FrontdoorCamera_RTSPURL changed from NULL to rtsp://openhab:openhab0101@10.4.28.202:554//h264Preview_01_main
2020-02-01 16:46:19.515 [vent.ItemStateChangedEvent] - FrontdoorCamera_UpdateTheImage changed from NULL to ON

Looks okay, right?

Sorry, with the ONVIF/HTTPonly protocol, do I have to install ffmpeg with h264 anyway on my Raspberry Pi openHAB host instance?

New version (20200201) cannot create snapshots anymore with my setup. My cameras (Instar IN-9008 and IN-8015) didn't change and still work without problems. I removed all the old jars, deleted the cache with openhab-cli and removed tmp files. The new jars installed fine but I receive the following error:

2020-02-01 19:21:42.579 [WARN ] [okhttp3.OkHttpClient                ] - A connection to http://192.168.178.11:8080/ was leaked. Did you forget to close a response body? To see where this was allocated, set the OkHttpClient logger level to FINE: Logger.getLogger(OkHttpClient.class.getName()).setLevel(Level.FINE);

How can I set the level to FINE in order to get more information? I couldn't figure out if I can use the log:set command of the Karaf console to do this. It returned that the level FINE is not defined.
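
(For reference: FINE is a java.util.logging level, while the Karaf console uses the SLF4J-style names TRACE/DEBUG/INFO/WARN/ERROR, with DEBUG being the closest match to FINE. A sketch of the console commands, using the logger name from the warning above:)

```text
openhab> log:set DEBUG okhttp3
openhab> log:tail
```

`log:set DEFAULT okhttp3` restores the inherited level afterwards.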

I restarted openhab several times but the error persists.

I have set ipcamera to TRACE logging. It shows that the camera photo is downloaded from the camera. But nothing happens after the download. Maybe the file is not saved.

@rsterz
That warning is most likely from the myopenhab cloud connector and is a common warning to get even for those that do not use the ipcamera binding; just ignore it. okhttp3 is used by the ONVIF functions of the binding, but your questions are about non-ONVIF cameras, hence why I suspect it is the cloud connector.

Correct, nothing happens and nothing is saved. It is stored in RAM until you request it as per the readme examples; I just added more examples a few hours ago. The image channel will not update from Ffmpeg-created snapshots, the other methods need to be used.

@Brignoud
The readme file tells you which features need Ffmpeg to work; some work fine without it.

@Kim_Andersen and @roli
Most likely the folder does not have read and write permissions for the user that openHAB runs as. Google how to change folder permissions. Some of my cameras don't have audio in them at all and work fine. I am not an ffmpeg expert, so try googling the error; that is usually a helpful thing to do.
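
(A minimal shell sketch of what "no write permission" means here, using a scratch directory. On a real openHABian install the directory is /etc/openhab2/html/camera1 and the usual fix is along the lines of `sudo chown -R openhab:openhab /etc/openhab2/html/camera1`:)

```shell
#!/bin/sh
# Demonstrate how a missing write bit on a directory produces the
# "Permission denied" seen in the FileNotFoundException above.
dir=$(mktemp -d)

chmod 555 "$dir"   # r-x only: creating files inside fails (unless running as root)
touch "$dir/snapshot5.jpg" 2>/dev/null || echo "Permission denied, as in the log"

chmod 755 "$dir"   # rwx for the owner: now the write succeeds
touch "$dir/snapshot5.jpg" && echo "snapshot written"

rm -rf "$dir"
```

On the real system, `ls -ld /etc/openhab2/html/camera1` shows the current owner and mode so you can see who is allowed to write there.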

Thanks!
What I did is I just changed the owner of my "cameras" folder to openhab and the thing started working nicely.

Now for my next question... is it possible to include these streams in Habpanel so that all cameras would show up playing at the same time? I know that this was never an issue with mjpeg, but I am not sure about HLS.

Thanks Matt.
But it's the default hierarchy /etc/openhab2/html/camera1/ folder... I assume the same (default) user would have write access.
I simply have to ask you (all of you) - What did you do when you installed ffmpeg? My setup is a plain (new) openHABian (openHAB 2.5) running on an RPi4. There is nothing really unique in it. And since most of you got ffmpeg working, I assume there is a step I am missing somewhere.

EDIT - Got the animated GIF working now... It was indeed a permission missing on the directory... I now get a nice-looking animated GIF.

Now I've only got one issue left... I don't get any image update... Sometimes it works and sometimes it doesn't... (mostly it doesn't). I can't seem to figure out why. If I enter an Image override URL, I get the same error reported a couple of months ago.
The RTSP input must be working fine, since I get the animated GIF working now, right?

EDIT again!

After I got the permissions solved, I also receive .ts files. But the m3u8 file is just a 1 kB file...

Any help/ideas on this part? I assume it's an ffmpeg thing, but I have no idea why, and no idea how to fix it either... The readme mentions some ffmpeg settings... I don't know where to insert these?

So for anyone else that is wondering how to integrate the HLS stream into Habpanel (works nicely on a low-cost Android tablet with Chrome and on Mac/iPad with Chrome/Safari - with 8 streams running at the same time):

Add a new template widget and paste this code into it:

<span style="position:absolute; height:100%; width:100%; overflow: hidden; top: 0; left: 0;">
<video style="width: 100%; height: 100%; position: relative; top: 0; left: 0;" src="{{itemValue('Hiska_camera1_hls')}}" controls autoplay></video>
</span>

Modify the item name ('Hiska_camera1_hls' above) to the item holding your HLS stream URL.

This will basically force the video player to fit into the widget size without overflowing or looking weird.

Would you care to share how you got ffmpeg to create the HLS stream?

Nothing special. The only thing that was wrong was the permissions for that directory. After I fixed that everything started working. I did have to manually set the ffmpeg input stream to my NVR rtsp feed or it would just take the feed from the first camera for all of them.

I see that the binding is correctly creating the stream for you. Note that the ipcamera.m3u8 is just the playlist. The actual video is in the .ts files. So just point the browser to the .m3u8 file and you should see the stream. Note that Firefox doesn't want to play h264 streams for me. But it works in Safari.

EDIT: Chrome on Mac or Windows won't play this! It does play it on Android though, so I assumed it would elsewhere as well. Works in Safari for Mac though.

AHHH!! That is good info... I was certain it was the m3u8 file only...
And I'm using Chrome... It won't play the HLS stream either... Dammit, this has been fooling me all day now... Will try on my Android instead (just need to figure out how to).

If you are on Windows, try Edge; it works there nicely as well. There are some addons for Chrome and Firefox that are supposed to enable HLS, but from testing them just now they don't seem to work.

Yes, Edge does seem to stream... But I get some strange delay in it... The on-screen display is running in realtime, but the actual picture is delayed by approx 10 sec... Pretty weird.

EDIT
VLC (on Android) can also stream the HLS. And from VLC I managed to cast the stream to my Google Nest Hub, which also streams fine.
But I can't seem to get rid of this approx 10 sec delay. I have tried changing the settings in the camera, but it doesn't make any difference, except that the file sizes are smaller.

@matt1 do you have any idea about this one?

I can confirm that delay as well. I didn't notice it at first. But yes, there is around a 10 s delay there, which is highly annoying.

And, yes, I do have the startStream channel set to ON for all of my cameras.

Before I switched to this binding and a new NVR, I was using a cheap NVR plus my fileserver (running an older Intel Pentium D), and it was creating an mjpeg stream using ffmpeg for 8 cameras in almost real time. The only difference is that I was using ffserver to serve those streams.

I'm glad you noticed this as well. The last couple of hours have been a pain trying to change it... It doesn't seem to be possible. Matt hopefully has an idea as to why this happens... I was in fact very pleased to see how fast I could cast it to my Google Nest Hub, only to find it being 10 seconds behind :frowning:

HLS by design has a delay; google how to minimise it, but the lower you go, the more issues you have. It is just a matter of overriding the HLS arguments with what you want.
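
(A sketch of what such an override could look like in a .things file. The parameter name HLS_OUT_ARGUMENTS below is a placeholder, not necessarily the real one - check the binding readme of your build for the exact name:)

```text
// Hypothetical .things entry; HLS_OUT_ARGUMENTS is a placeholder name.
Thing ipcamera:HTTPONLY:camera1
[
    IPADDRESS="10.4.28.202",
    HLS_OUT_ARGUMENTS="-hls_time 1 -hls_list_size 3"
]
```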

Yeah, I just read a few articles about HLS... But 10 sec is very long, in my opinion.
Using it to start a stream to a device on a motion sensor trigger or something similar makes it rather useless. The thing that triggered the start could very well be long gone before the stream even starts to play...
It seems one method is to lower the duration of the first segment. But since it defaults to 2 sec, and can only go down to 1 sec, it seems a bit pointless. Another option to try is to limit the number of segments from 5 to perhaps 2 or 3.
Well, I guess I'll have to play with the ffmpeg settings.
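
(The segment duration and segment count mentioned above correspond to standard options of ffmpeg's HLS muxer. A standalone command sketch - RTSP URL, credentials and values are illustrative only:)

```shell
# -hls_time: target segment length in seconds (ffmpeg's default is 2)
# -hls_list_size: how many segments are kept in the m3u8 playlist
# -hls_flags delete_segments: remove old .ts files as new ones are written
ffmpeg -rtsp_transport tcp -i "rtsp://user:pass@10.4.28.202:554/h264Preview_01_main" \
    -codec copy -f hls \
    -hls_time 1 -hls_list_size 3 -hls_flags delete_segments \
    /etc/openhab2/html/camera1/ipcamera.m3u8
```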

Well, if you do play around with it and figure anything out - I am very interested. Even lowering the delay to something like 5 s would be a huge difference.

@matt1 Looking at ffmpeg, there is also a mention of LHLS (low-latency HLS). Not sure how far this has been implemented, but it could probably be useful here.

If you upgrade Ffmpeg to version 4.x you can actually create LHLS and DASH at the same time, as they now use the same files. I was playing with this a few weeks ago, trying to formulate the best way forward for some new features. The newer version of HLS uses .m4s file types instead of the .ts format, but they still work very similarly under the hood, and it does not magically solve the delay when creating segmented streams.

With the current HLS you need to start EVERY new segment with a keyframe, and most cameras default to 2 seconds between keyframes, hence the binding uses this as the default. Most cameras also cannot create keyframes more often than once a second, and this is the limiting factor.
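
(To make the keyframe constraint concrete: with `-codec:v copy` the segmenter can only cut on the camera's own keyframes, so asking for segments shorter than the camera's keyframe interval has no effect. Shorter segments require re-encoding, which costs CPU - a sketch with illustrative values:)

```shell
# -g 25 / -keyint_min 25: force one keyframe per second at 25 fps,
# which makes 1-second HLS segments (-hls_time 1) possible
ffmpeg -i "rtsp://user:pass@camera/stream" \
    -codec:v libx264 -preset veryfast -g 25 -keyint_min 25 \
    -f hls -hls_time 1 /tmp/stream.m3u8
```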

A 5 second lag is pushing the limits, as your RTSP stream is already lagging behind real time, so the HLS delay gets added on top of this. Using an HTTP source instead of RTSP may give better results.

I am keen to hear how you go playing with this, as there are trade-offs, as you are sure to find.

I found the reason for my problem. The new version requires UPDATE_IMAGE=true to be set for my channel to update regularly. It seems that the default changed to false. With the old version the channel got updated automatically.
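
(In a .things file this would look something like the sketch below; other parameters are elided, the IP is an example, and the exact parameter spelling should be verified against the readme:)

```text
Thing ipcamera:HTTPONLY:camera1
[
    IPADDRESS="192.168.178.11",
    UPDATE_IMAGE=true
]
```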

The documentation update is great. Thanks @matt1

For using the IP cams for surveillance, it has to be no more than a maximum of 3-5 sec... After I read about HLS streaming, I doubt this will ever be possible. As Matt says, the cam itself has 2 sec between keyframes. That's highly unfortunate!
I wonder what video doorbells are using, and how they manage it. I really don't think anyone could live with a >10 second delay for a doorbell. 10 seconds is way too much in my opinion.
Second, my focus is mainly on getting IP cams to show (stream) on a Nest Hub device, so I'll be restricted to what formats/codecs this device is able to show.

I will give it a try, but I really doubt it will be successful, unfortunately. HLS is mainly used for live streaming over the internet, where this kind of latency doesn't matter that much.

I thought RTSP was indeed an HTTP source??