IpCamera: New IP Camera Binding

You only posted the output from ipcamera.jpg in full. What about the camera's output: does the data that gets sent match the length given in the header? That was the output I was looking for, not what the binding sends, as I can reproduce that here; with your camera I cannot.

Set the camera up as HTTPONLY and not ONVIF. Since you have set it up as Hikvision and it is not a Hikvision camera, the binding may be using the wrong settings. Some of the cameras are from the Hikvision factory, but that does not mean they have the API or even use the same URLs.
I really cannot comment much without a debug log showing what is happening. openHAB will also tell you why the camera is offline with a description that may help.

Thanks for your reply @matt1

This is how my thing config looks now:

Thing ipcamera:HTTPONLY:001 [
    IPADDRESS="192.168.x.x",
    USERNAME="admin",
    PASSWORD="password",
    POLL_CAMERA_MS=2000,
    SERVER_PORT=54321,
    IP_WHITELIST="DISABLED",
    FFMPEG_INPUT="rtsp://admin:password@192.168.x.x:554",
    FFMPEG_OUTPUT="/etc/openhab2/html/camera1/"
]

I get the following output in the log file:

2020-02-11 08:40:00.262 [DEBUG] [pcamera.internal.StreamServerHandler] - Stream Server recieved request         POST:/snapshot.jpg
2020-02-11 08:40:00.265 [WARN ] [pcamera.internal.StreamServerHandler] - The request made from (127.0.0.1) was not in the whitelist and will be ignored.
2020-02-11 08:40:00.856 [DEBUG] [hab.binding.ipcamera.internal.Ffmpeg] - frame=   16 fps=0.3 q=24.8 size=N/A time=00:01:00.06 bitrate=N/A speed=1.07x
2020-02-11 08:40:01.306 [DEBUG] [hab.binding.ipcamera.internal.Ffmpeg] - frame=   16 fps=0.3 q=24.8 size=N/A time=00:01:00.06 bitrate=N/A speed=1.06x
2020-02-11 08:40:01.816 [DEBUG] [hab.binding.ipcamera.internal.Ffmpeg] - frame=   16 fps=0.3 q=24.8 size=N/A time=00:01:00.06 bitrate=N/A speed=1.05x
2020-02-11 08:40:02.354 [DEBUG] [hab.binding.ipcamera.internal.Ffmpeg] - frame=   16 fps=0.3 q=24.8 size=N/A time=00:01:00.06 bitrate=N/A speed=1.04x
2020-02-11 08:40:02.896 [DEBUG] [hab.binding.ipcamera.internal.Ffmpeg] - frame=   16 fps=0.3 q=24.8 size=N/A time=00:01:00.06 bitrate=N/A speed=1.03x

I have created an item linked to: ipcamera:HTTPONLY:001:image
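For reference, the item definition behind that link looks roughly like this (the item name is just an example):

    Image Camera1_Image "Camera 1" { channel="ipcamera:HTTPONLY:001:image" }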

I still don't see my image in my HABPanel.

Try DISABLE in caps, not DISABLED, for the whitelist. Also, by default the image channel is turned off. Use the sitemap examples instead, which should work once you make the above change.
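As a rough sketch (check the readme for the exact sitemap examples), a sitemap entry that fetches the snapshot from the binding's stream server on your SERVER_PORT could look like this, with <openhab-ip> standing in for your openHAB host:

    Image url="http://<openhab-ip>:54321/ipcamera.jpg" refresh=2000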

Thanks!! It's working now…

The frame rate is just very slow, so I need to play around with the settings to fix that.

New Build 2020-02-11 has these changes:

  • If you wish to force ffmpeg to be used for mjpeg, you can use STREAM_URL_OVERRIDE="ffmpeg". This will also remove a warning that occurs if you leave it blank.
  • Change to how Reolink handles the snapshot. @Axel_Kummerlowe, can you test this version if you get a chance, please? If it does not work, TRACE log output covering a few snapshot requests should be helpful.

Thanks @matt1.

I have downloaded the latest version and used: STREAM_URL_OVERRIDE="ffmpeg"

My frame rate is only about one frame every 3 seconds… how do I fix that?

Thing ipcamera:HTTPONLY:001 [
    IPADDRESS="192.168.x.x",
    USERNAME="admin",
    PASSWORD="password",
    POLL_CAMERA_MS=200,
    SERVER_PORT=54321,
    IP_WHITELIST="DISABLE",
    FFMPEG_INPUT="rtsp://admin:password@192.168.x.x:554",
    FFMPEG_OUTPUT="/etc/openhab2/html/camera1/",
    UPDATE_IMAGE=true,
    STREAM_URL_OVERRIDE="ffmpeg",
    FFMPEG_HLS_OUT_ARGUMENTS="-strict -2 -acodec copy -vcodec copy -hls_flags delete_segments -segment_list_flags live -flags -global_header -hls_time 1 -hls_allow_cache 0 -hls_list_size 3"
]

On my HABPanel I have also set it to refresh every 200 ms.

If I use the new option STREAM_URL_OVERRIDE="ffmpeg", how can I configure the parameters (e.g. resolution) for the mjpeg stream?

When I pass the camera's own mjpeg stream to the binding, I have to configure this in the camera, because the binding only passes it through. But with this new option, can I configure it separately from the snapshot parameters and HLS parameters?

@Marius_van_Belkum
You can't go below 1000; you will be getting an error in your logs telling you this is not a valid value.
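In other words, the lowest valid setting would be something like this (only the changed line from the config above is shown):

    POLL_CAMERA_MS=1000,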

@halloween
That would be the plan if people find it useful: to give it its own config to override the default with. So far I have not had much feedback on the feature and no one has requested it, so I have been busy coding things that I need for my own setup.
If you have a camera that can create its own mjpeg, why do you want to use this? It should be a 6 fps stream at full resolution if your CPU can keep up. My guess is that lowering the resolution will use more CPU, not less; you would need to provide a stream that is already lower resolution for it to lower the CPU load, if that is what you are after.

Hi, my own camera has mjpeg, I don't need that here.

But a friend of mine didn't listen to my advice to buy a camera with mjpeg capability (it was about 20 euros more expensive…), so he bought the cheapest China cam and now he only has an RTSP stream (2 streams with different resolutions). No mjpeg, no snapshots. Snapshots work now with your fabulous binding, but he wants mjpeg for his openHAB sitemap too. The new snapshot stream only has about 1 fps; that's not enough for him. His openHAB server with ffmpeg installed is a Raspberry Pi 4 with 4 GB of RAM.

That should already work if you ask the binding for ipcamera.mjpeg, but currently it will be fixed to the same resolution as the RTSP stream that is given to the binding, and also to 6 fps.
IMHO there are probably only two things that make sense to tweak and play with, unless you want to crop or resize for various reasons (a rough sketch of both follows after this list).

  1. FPS: increasing it above 6 fps will increase CPU load, so there is a trade-off.
  2. JPEG compression: lowering the network load by trading off picture quality against higher amounts of jpeg compression.
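The binding does not expose its own mjpeg arguments yet, but as a standalone (untested) sketch of those two trade-offs, an ffmpeg command turning the RTSP input into mjpeg could look like this; the -r and -q:v values are purely illustrative:

    # hypothetical standalone command, not what the binding runs internally
    # -r sets the output frame rate, -q:v sets jpeg quality (a higher value means more compression)
    ffmpeg -i 'rtsp://admin:password@192.168.x.x:554' -f mpjpeg -r 6 -q:v 5 output.mjpeg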

As for CPU load, I published some tests a while back on what to expect on an ARM-based system here.

New build 2020-02-17 has these changes:

  • New channel externalMotion for allowing external sensors to trigger motion alarms. Cameras without built-in motion detection can use a Z-Wave PIR, for example, and trigger the binding as if the sensor were part of the camera.
  • New experimental group thing type for viewing multiple cameras as a single camera.
  • All cameras now have the channel lastMotionType.

See readme for info on any of these.
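As an untested sketch of how an external sensor might drive the new channel, a rule along these lines could work; the item names are made up, and it assumes a Switch item linked to the camera's externalMotion channel:

    rule "Z-Wave PIR triggers camera motion"
    when
        Item ZwavePirMotion changed to ON
    then
        // CameraExternalMotion is assumed to be linked to ipcamera:HTTPONLY:001:externalMotion
        CameraExternalMotion.sendCommand(ON)
    end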

Hi

First, great binding. :smile:

My HLS stream is about 8-10 seconds behind; is this normal? It is running on a Pi 3B+; would it be more live if I ran it on a Pi 4 instead?

How do I change how much of the HLS stream is saved before it is overwritten?

Glad you find it useful and like it.

No, upgrading won't change a thing, other than giving you more RAM, twice the CPU power, and a network interface that is not as limiting, which are worth it if you have multiple cameras.

To lower the delay (there are a few posts on what to use only around 10 posts above this one):

  1. Set all cameras to create a keyframe (I-frame) every second; refer to their manuals, support, and Google for how.
  2. Change the HLS arguments to create segments that are 1 second in length (see the example after this list).
  3. Limit it to 2-3 segments in the HLS playlist.
  4. Read up on the topic, as you will need to know what HLS is and how it works to understand how to tweak it. Some browsers will stop working with some settings, and others will cache ahead before they play, which adds a delay.
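As a concrete example of points 2 and 3, the HLS arguments posted earlier in this thread already do this: -hls_time 1 gives 1 second segments and -hls_list_size 3 keeps the playlist at 3 segments. A trimmed-down version of that Thing config line would be:

    FFMPEG_HLS_OUT_ARGUMENTS="-strict -2 -acodec copy -vcodec copy -hls_flags delete_segments -hls_time 1 -hls_list_size 3"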

Thanks

Could you give me a guide for steps 2 and 3?

Around 11 posts up…

So this weekend I moved my openHAB server from a Pi 3 to my proper server. I installed it into VirtualBox and it's been running happily there.

So I removed everything related to HLS and went back to good old mjpeg. I set the stream URL override to ffmpeg and let the binding generate the mjpeg stream. Sadly it performs even worse than HLS: it still has a 5 s delay and a worse frame rate with very obvious skipping, and it crashes the web browser on my cheap Android tablet after a few minutes.

So I disabled the binding entirely and used my "old" way: basically setting up ffserver with this config (only the relevant part is shown here):

<Stream camera6.mjpg>
Feed camera6.ffm
Format mpjpeg
VideoFrameRate 8
VideoSize 725x450
VideoBitRate 2540
VideoIntraOnly
NoAudio
Strict -1
</Stream>

And then starting ffmpeg with the following parameters:

ffmpeg -hide_banner -loglevel panic -nostats -i 'rtsp://....' http://127.0.0.1:8090/camera6.ffm

And this works very nicely: the tablet doesn't crash, the stream delay is about 2 s, … Sadly, using ffserver is a bad idea since it has been removed from newer ffmpeg builds, so it's not a long-term solution.

@roli
Thanks for reporting some feedback. Sadly I cannot reproduce that here; these are the results of testing lag behind real time using 3 different cameras, all made in different factories.

HTTP-based mjpeg from the camera, through the binding and to my iPhone over WiFi = 0.5 seconds. Not a misprint; it is around half a second of delay. Only API cameras can do this.

ffmpeg creating the mjpeg, routed through the binding and to my iPhone over WiFi = 1 second delay after it had been running for a while. It was possibly 2 to 2.5 seconds behind when it first started, as ffmpeg takes some time to get up to full speed, and yes, during that time I did see some minor stuttering glitches. Because the stream did not close down due to a bug (already fixed for the next build), I left and came back to the stream and it was perfect, only 1 second behind real time. Since you are running ffmpeg non-stop with that method, it is not a direct comparison to something that starts and stops the stream on demand while a browser is waiting. If the binding produces the stream ahead of time, it works great with only a 1 second delay.

Using a camera that can supply a snapshot and then retesting using ffmpeg to create the snapshot from rtsp, I got the same results both ways:

ipcamera.jpg with a 1 second refresh was 1 second behind real time.

snapshots.mjpeg with 1 second poll was 1 second behind in a PC’s browser, but on iOS it seemed to buffer a frame and was an extra frame behind.

So it is really only HLS that has any lag worth mentioning, and that comes purely from the design of HLS and how it works. I have yet to run a test to see how far I can push HLS, but my main cameras can create keyframes at any interval I wish, so it will be an interesting experiment when I find the time to play.

My Android tablet has had no issues when I have used it, and once it has charged I can confirm it behaves the same as above.

Matt, do you know if the Google Nest Hub will accept anything other than HLS?

Wow… it seems I don't visit this for a few weeks and so much gets added…
So I'm using HLS in the iOS app/sitemaps.
Keep up the great work.

@matt1 What's the recommended way now for HABPanel?