IpCamera: New IP Camera Binding

Sorry, with the ONVIF/HTTP-only protocol, do I have to install ffmpeg with h264 support anyway on my Raspberry Pi openHAB host instance?

The new version (20200201) cannot create snapshots anymore with my setup. My cameras (Instar IN-9008 and IN-8015) didn't change and still work without problems. I removed all the old jars, deleted the cache with openhab-cli and removed the tmp files. The new jars installed fine, but I receive the following error:

2020-02-01 19:21:42.579 [WARN ] [okhttp3.OkHttpClient                ] - A connection to http://192.168.178.11:8080/ was leaked. Did you forget to close a response body? To see where this was allocated, set the OkHttpClient logger level to FINE: Logger.getLogger(OkHttpClient.class.getName()).setLevel(Level.FINE);

How can I set the level to FINE in order to get more information? I couldn't figure out whether I can use the log:set command of the Karaf console to do this. It returned that the level FINE is not defined.
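(For reference: that message refers to java.util.logging levels, while the Karaf console uses TRACE/DEBUG/INFO/WARN/ERROR. FINE roughly maps to DEBUG, so assuming pax-logging's JUL bridge, something like this should be the equivalent:)

log:set DEBUG okhttp3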

I restarted openhab several times but the error persists.

I have set ipcamera to TRACE logging. It shows that the camera photo is downloaded from the camera, but nothing happens after the download. Maybe the file is not saved.

@rsterz
That warning is most likely from the myopenhab cloud connector and is a common warning to get even for those that do not use the ipcamera binding, so just ignore it. okhttp3 is used by the ONVIF functions of the binding, but your questions are about non-ONVIF cameras, hence why I suspect it is the cloud connector.

Correct, nothing happens and nothing is saved. It is stored in RAM until you request it, as per the readme examples (I added more examples just a few hours ago). The image channel will not update from ffmpeg-created snapshots; the other methods need to be used.

@Brignoud
The readme file tells you which features need ffmpeg to work; some work fine without it.

@Kim_Andersen and @roli
Most likely the folder does not have read and write permissions for the user that openHAB runs as. Google how to change folder permissions. Some of my cameras don't have audio in them at all and work fine. I am not an ffmpeg expert, so try googling the error; that is usually a helpful thing to do.

Thanks!
What I did was just change the owner of my “cameras” folder to openhab, and the thing started working nicely.
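For anyone else hitting this, a rough sketch of the commands (assuming the default openhabian openhab user and a folder under /etc/openhab2/html; adjust both to your setup):

sudo chown -R openhab:openhab /etc/openhab2/html/cameras
sudo chmod -R 775 /etc/openhab2/html/cameras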

Now for my next question… is it possible to include these streams in HabPanel so that all the cameras show up playing at the same time? I know this was never an issue with mjpeg, but I am not sure about HLS.

Thanks Matt.
But it's the default hierarchy, the /etc/openhab2/html/camera1/ folder… I assumed the same (default) user would have write access.
I simply have to ask you (all of you): what did you do when you installed ffmpeg? My setup is a plain (new) openHABian (openHAB 2.5) running on an RPi4. There is nothing really unique in it. And since most of you got ffmpeg working, I assume there is a step I am missing somewhere.
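(For reference, on a stock openHABian/Raspbian image ffmpeg can usually be installed straight from the distro repo; a sketch, assuming the packaged build is recent enough for the binding:)

sudo apt-get update && sudo apt-get install ffmpeg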

EDIT - Got the animated GIF working now… It was indeed a missing permission on the directory… I now get a nice-looking animated GIF.

Now I've only got one issue left… I don't get any image updates… Sometimes it works and sometimes it doesn't… (mostly it doesn't). I can't seem to figure out why. If I enter an Image override URL, I get the same error that was reported a couple of months ago.
The RTSP input must be working fine, since I got the animated GIF working now, right?

EDIT again!

After I got the permissions solved, I also receive .ts files. But the m3u8 file is just a 1 kB file…

Any help/ideas on this part? I assume it's an ffmpeg thing, but I have no idea why, and no idea how to fix it either… The readme mentions some ffmpeg settings… I don't know where to insert these.

So for anyone else who is wondering how to integrate the HLS stream into HabPanel (it works nicely on a low-cost Android tablet with Chrome, and on Mac/iPad with Chrome/Safari, with 8 streams running at the same time):

Add a new template widget and paste this code into it:

<span style="position:absolute; height:100%; width:100%; overflow: hidden; top: 0; left: 0;">
<video style="width: 100%; height: 100%; position: relative; top: 0; left: 0;" src="{{itemValue('Hiska_camera1_hls')}}" controls autoplay></video>
</span>

Modify the item name to the one that holds your HLS stream URL.

This will basically force the video player to fit into the widget size without overflowing or looking weird.

Would you care to share how you got ffmpeg to create the HLS stream?

Nothing special. The only thing that was wrong was the permissions for that directory; after I fixed that, everything started working. I did have to manually set the ffmpeg input stream to my NVR's RTSP feed, or it would just take the feed from the first camera for all of them.
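(In case it helps, a sketch of what that looks like in the thing configuration; the FFMPEG_INPUT parameter name is from my reading of the binding readme, and the URL and credentials are placeholders for your NVR's RTSP feed:)

FFMPEG_INPUT="rtsp://user:password@192.168.1.2:554/ch1"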

I see that the binding is correctly creating the stream for you. Note that the ipcamera.m3u8 is just the playlist. The actual video is in the .ts files. So just point the browser to the .m3u8 file and you should see the stream. Note that Firefox doesn’t want to play h264 streams for me. But it works in Safari.

EDIT: Chrome on Mac or Windows won't play this! It does play on Android though, so I assumed it would elsewhere as well. It works in Safari on Mac though.


AHHH!! That is good info… I was certain it was the m3u8 file only…
And I'm using Chrome… It won't play the HLS stream either… Dammit, this has been fooling me all day… I will try on my Android instead (just need to figure out how).

If you are on Windows, try Edge; it works nicely there as well. There are some add-ons for Chrome and Firefox that are supposed to enable HLS, but from testing them just now they don't seem to work.
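(Another route for Chrome/Firefox is an MSE-based player such as hls.js. A minimal standalone test page, as a sketch; the playlist URL is a placeholder for wherever your binding serves ipcamera.m3u8:)

<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<video id="cam1" controls muted autoplay></video>
<script>
  // hls.js fetches the playlist and feeds the segments to the video tag via MSE
  var video = document.getElementById('cam1');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('http://192.168.1.2:8080/ipcamera.m3u8'); // placeholder URL
    hls.attachMedia(video);
  }
</script>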


Yes, Edge does seem to stream… But I get some strange delay in it… The on-screen display is running in real time, but the actual picture is delayed by approx 10 seconds… Pretty weird.

EDIT
VLC (on Android) can also stream the HLS. And from VLC I managed to cast the stream to my Google Nest Hub, which also streams fine.
But I can't seem to get rid of this approx 10-second delay. I have tried changing the settings in the camera, but it doesn't make any difference, except that the file sizes get smaller.

@matt1 do you have any idea about this one?

I can confirm that delay as well. I didn’t notice it at first. But yes, there is around a 10s delay there. Which is highly annoying.

And, yes, I do have the startStream channel set to ON for all of my cameras.

Before I switched to this binding and a new NVR, I was using a cheap NVR together with my file server (running an older Intel Pentium D), and it was creating mjpeg streams with ffmpeg for 8 cameras in almost real time. The only difference is that I was using ffserver to serve those streams.

I'm glad you noticed this as well. The last couple of hours have been a pain trying to change it… It doesn't seem to be possible. Hopefully Matt has an idea as to why this happens… I was in fact very pleased to see how fast I could cast it to my Google Nest Hub, only to find it being 10 seconds behind :frowning:

HLS has a delay by design; google how to minimise it, but the lower you go, the more issues you have. It is just a matter of overriding the HLS arguments with what you want.

Yeah, I just read a few articles about HLS… But 10 seconds is very long, in my opinion.
Using it to start a stream to a device on a motion sensor trigger or something similar makes it rather useless. The thing that triggered the start could very well be long gone before the stream even starts to play…
It seems like one method is to lower the time of the first segment. But since it defaults to 2 seconds and can only go down to 1 second, it seems a bit pointless. Another option to try is to limit the number of segments from 5 to perhaps 2 or 3.
Well, I guess I'll have to play with the ffmpeg settings.
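(For reference, the two knobs mentioned above map to real ffmpeg HLS output options; a sketch, with where to override them depending on your binding version's readme. Note that with a copied stream the effective segment length is still bounded by the camera's keyframe interval:)

-hls_time 1 -hls_list_size 3 -hls_flags delete_segments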

Well if you do play around with it and figure anything out - I am very interested. Even lowering the delay down to something like 5s would be a huge difference.

@matt1 Looking at ffmpeg, there is also a mention of LHLS (low-latency HLS). Not sure how far this has been implemented, but it could probably be useful here.

If you upgrade ffmpeg to version 4.x you can actually create LHLS and DASH at the same time, as they now use the same files. I was playing with this a few weeks ago, trying to formulate the best way forward for some new features. The newer version of HLS uses .m4s files instead of the .ts format, but they still work very similarly under the hood, and it does not magically solve the delay when creating segmented streams.

With the current HLS you need to start EVERY new segment with a keyframe, and most cameras default to 2 seconds between keyframes, hence the binding uses this as the default. Most cameras also cannot create keyframes more often than once a second, and this is the limiting factor. Since players typically buffer a few segments before starting playback, 2-second segments alone put you several seconds behind before anything else is added.

A 5-second lag is pushing the limits, as your RTSP stream is already lagging behind real time, so the HLS delay gets added on top of this. Using an HTTP source instead of RTSP may give better results.

I am keen to hear how you go playing with this; there are trade-offs, as you are sure to find.

I found the reason for my problem. The new version requires UPDATE_IMAGE=true to be set for my channel to update regularly. It seems that the default changed to false; with the old version the channel got updated automatically.

The documentation update is great. Thanks @matt1
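(In case it helps someone else, a sketch of what I mean in a .things file; the thing type, IP and credentials are placeholders from my own setup and will differ for yours:)

Thing ipcamera:INSTAR:camera1 [ IPADDRESS="192.168.1.50", USERNAME="admin", PASSWORD="changeme", UPDATE_IMAGE=true ]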

For using the IP cams for surveillance, it has to be no more than a maximum of 3-5 seconds… After reading about HLS streaming, I doubt this will ever be possible. As Matt says, the cam itself has 2 seconds between keyframes. That's highly unfortunate!
I wonder what video doorbells are using, and how they manage it. I really don't think anyone could live with a >10-second delay for a doorbell; 10 seconds is way too much in my opinion.
Second, my focus is mainly on getting the IP cams to show (stream) on a Nest Hub device, so I'll be restricted to whatever formats/codecs this device is able to show.

I will give it a try, but I really doubt it will be successful, unfortunately. HLS is mainly used for live streaming over the internet, where this kind of latency doesn't matter that much.

I thought RTSP was indeed an HTTP source??

No probs, glad you got it going and enjoy the binding.

It should be possible, but it needs you to have a play and understand the trade-offs. Having a camera that allows keyframes every second (or even more often, as some cameras let you specify how often; some even allow every second frame) will help the most. You will also need to keep the stream running non-stop.

The binding has to work on as many browsers as possible with default settings, with as many cameras as possible on their default settings, and also support working on demand for those that don't want it running all the time. Compromises are made to allow it to work in a wide range of situations, so if you play around you can tailor it to what you want, as you have direct access to the ffmpeg command.

It can actually be useful: if your TV takes time to turn on and switch inputs, you still get to see the person at your door instead of their backs as they walk away :slight_smile:
I use the cameras as a baby monitor, so I really don't care about a delay; each person uses cameras for different things.

I personally do the following and it works great here on a Google/Nest hub.

  1. Cast the ipcamera.jpg the moment the doorbell is pushed, and also trigger the GIF. The binding can send a jpg faster than your camera can, as some cameras need to wait for a keyframe to be created first, which may be 2 seconds away. (A rough rule sketch follows this list.)
  2. When the GIF is ready, cast it. It can include preroll time that shows the person walking up to the door and pushing the button (pretty cool, as the button is the trigger), and it can be at 2x, 4x or any speed you wish, to see it fast-forward and loop.
  3. You could get fancy and show a history of the day's visitors with the GIF files. I have it sent to me with Pushover, which makes a different noise to email on my phone.
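(A rough sketch in openHAB rules DSL; the item names, server port and URL are hypothetical placeholders, so check your own items and the binding readme for the exact channels:)

rule "Doorbell pushed"
when
    Item DoorbellButton changed to ON
then
    // cast the current snapshot via the Chromecast binding's playuri channel; item names are placeholders
    KitchenHub_PlayURI.sendCommand("http://192.168.1.2:54321/ipcamera.jpg")
    // start the animated GIF recording; the linked channel is from the readme examples
    Camera1_UpdateGif.sendCommand(ON)
end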

Plenty of ways to automate.