IP Camera: ffmpeg options for snapshot URL do not work

Hello everyone,
I'm facing problems applying some ffmpeg options to the snapshot URL (ipcamera.jpg).

I’m trying to apply the same ffmpeg options (rotating the input stream via transpose=2) that I am using for MJPEG (which works nicely) to the snapshot image, but nothing I tried worked.

I tried the following config (the default values, plus the appended “-vf transpose=2”):

thingTypeUID: ipcamera:onvif
  mjpegOptions: -q:v 5 -r 2 -vf scale=640:-2 -update 1 -vf transpose=2
  snapshotOptions: -an -vsync vfr -q:v 2 -update 1 -vf transpose=2

The output of the mjpegOptions is shown correctly at:
<openhabUrl>/ipcamera/<cameraId>/ipcamera.mjpeg

But for the snapshot there is no change at:
<openhabUrl>/ipcamera/<cameraId>/ipcamera.jpg

Has anyone successfully applied additional filters to the options for the snapshot?
Or should I configure another option for the ipcamera.jpg endpoint?

Kind regards
Daniel

Based on the binding’s documentation I see these two entries:

mjpegOptions Allows you to change the settings for creating a MJPEG stream from RTSP using FFmpeg. Possible reasons to change this would be to rotate or re-scale the picture from the camera, change the JPG compression for better quality or the FPS rate.
snapshotOptions Specify your own FFmpeg options to be used when creating snapshots from RTSP. Default: -an -vsync vfr -q:v 2 -update 1

Based on that I would assume that the second option would be the one you are looking for.

Thanks for your reply. That is what I had already feared :confused:
Then it may be a bug, or what do you think?

If I run the options directly via the terminal it works, but when I use them in snapshotOptions it does not work; the image is not rotated.

Have you checked the TRACE level logs to see what is being reported back from ffmpeg?
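(If TRACE is not enabled yet, it can be switched on from the openHAB console with something like the following, assuming the binding’s default logger name:)

log:set TRACE org.openhab.binding.ipcamera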

If it works via the terminal, what is the command you tested on the terminal?

Edit

FFmpeg has two types of options: those located before the input (input options) and those placed before the output (output options). You cannot mix the two, so it’s possible the Thing’s option is for input options and cannot have an output option added. I would have to review the code to work out whether both can be handled.
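To illustrate with the same style of command used in this thread (the RTSP URL and output file are placeholders): -rtsp_transport tcp -hide_banner are input options because they come before -i, while -an -vsync vfr -q:v 2 -update 1 -vf transpose=2 are output options because they sit between the input and the output target.

ffmpeg -rtsp_transport tcp -hide_banner -i rtsp://<user>:<password>@192.168.xx.xx:554/live/ch0 -an -vsync vfr -q:v 2 -update 1 -vf transpose=2 snapshot.jpg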

I have already checked the TRACE log, but for the snapshot I do not see any further information. If I request the mjpeg, I see a lot of information about ffmpeg, see:

2023-11-30 09:25:38.049 [DEBUG] [amera.internal.servlet.CameraServlet] - GET:/ipcamera.mjpeg, received from 192.168.xx.xx
2023-11-30 09:25:38.052 [DEBUG] [amera.internal.servlet.CameraServlet] - First stream requested, opening up stream from camera
2023-11-30 09:25:38.067 [DEBUG] [hab.binding.ipcamera.internal.Ffmpeg] - Starting ffmpeg with this command now:-rtsp_transport tcp -hide_banner -i rtsp://<user>:<password>@192.168.xx.xx:554/live/ch0 -q:v 5 -r 2 -vf scale=640:-2 -update 1 -vf transpose=2 http://127.0.0.1:8080/ipcamera/f7b653e0f0/ipcamera.jpg
2023-11-30 09:25:42.267 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - Guessed Channel Layout for Input Stream #0.1 : mono
2023-11-30 09:25:42.272 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - Input #0, rtsp, from 'rtsp://<user>:<password>@192.168.xx.xx:554/live/ch0':
2023-11-30 09:25:42.273 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -   Metadata:
2023-11-30 09:25:42.275 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -     title           : hysxrtpsion
2023-11-30 09:25:42.276 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -   Duration: N/A, start: 0.436000, bitrate: N/A
2023-11-30 09:25:42.277 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -     Stream #0:0: Video: h264 (High), yuvj420p(pc, bt709, progressive), 1920x1080, 25 fps, 25.42 tbr, 90k tbn, 50 tbc
2023-11-30 09:25:42.279 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -     Stream #0:1: Audio: pcm_alaw, 8000 Hz, mono, s16, 64 kb/s
2023-11-30 09:25:42.282 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - Multiple -filter, -af or -vf options specified for stream 0, only the last option '-filter:v transpose=2' will be used.
2023-11-30 09:25:42.284 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - Stream mapping:
2023-11-30 09:25:42.286 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -   Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
2023-11-30 09:25:42.288 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - Press [q] to stop, [?] for help
2023-11-30 09:25:42.592 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - Output #0, image2, to 'http://127.0.0.1:8080/ipcamera/f7b653e0f0/ipcamera.jpg':
2023-11-30 09:25:42.597 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -   Metadata:
2023-11-30 09:25:42.629 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -     title           : hysxrtpsion
2023-11-30 09:25:42.652 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -     encoder         : Lavf58.45.100
2023-11-30 09:25:42.654 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -     Stream #0:0: Video: mjpeg, yuvj420p(pc), 1080x1920, q=2-31, 200 kb/s, 2 fps, 2 tbn, 2 tbc
2023-11-30 09:25:42.670 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -     Metadata:
2023-11-30 09:25:42.672 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -       encoder         : Lavc58.91.100 mjpeg
2023-11-30 09:25:42.686 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -     Side data:
2023-11-30 09:25:42.708 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] -       cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
2023-11-30 09:25:42.881 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - frame=    2 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A dup=0 drop=4 speed=   0x
2023-11-30 09:25:43.681 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - frame=    4 fps=2.9 q=5.0 size=N/A time=00:00:00.50 bitrate=N/A dup=0 drop=24 speed=0.361x
2023-11-30 09:25:44.176 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - frame=    5 fps=2.6 q=5.0 size=N/A time=00:00:01.00 bitrate=N/A dup=0 drop=45 speed=0.528x
2023-11-30 09:25:44.715 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - frame=    7 fps=2.9 q=5.0 size=N/A time=00:00:02.00 bitrate=N/A dup=0 drop=61 speed=0.832x
2023-11-30 09:25:45.226 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - frame=    8 fps=2.7 q=5.0 size=N/A time=00:00:02.50 bitrate=N/A dup=0 drop=78 speed=0.855x
2023-11-30 09:25:45.820 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - frame=   10 fps=2.8 q=5.0 size=N/A time=00:00:03.50 bitrate=N/A dup=0 drop=97 speed=0.992x
2023-11-30 09:25:46.320 [TRACE] [hab.binding.ipcamera.internal.Ffmpeg] - frame=   12 fps=3.0 q=5.0 size=N/A time=00:00:04.50 bitrate=N/A dup=0 drop=116 speed=1.11x

But if I use the jpg URL I only see the following lines in the log:

2023-11-30 09:23:38.810 [DEBUG] [amera.internal.servlet.CameraServlet] - GET:/ipcamera.jpg, received from 192.168.xx.xx
2023-11-30 09:23:38.825 [TRACE] [era.internal.handler.IpCameraHandler] - Sending camera: GET: http://192.168.178.xx.xx/mjpeg/snap.cgi?chn=0

It looks like the snapshotOptions are ignored and the snapshot image is fetched directly from the camera instead.

That’s why I asked whether I am using the snapshotOptions and the jpg URL the right way.

I tested the following command via terminal:

ffmpeg -rtsp_transport tcp -hide_banner -i rtsp://<user>:<password>@192.168.xx.xx:554/live/ch0 -q:v 5 -r 2 -vf scale=640:-2 -update 1 -vf transpose=2 /mnt/c/tmp/file5.jpg

as well as:

ffmpeg -i http://<user>:<password>@192.168.xx.xx/mjpeg/stream.cgi?chn=0 -an -vsync vfr -q:v 2 -update 1 -vf "transpose=2" /mnt/c/tmp/file4.jpg

They both work fine.

Do you see any problems, or did I misconfigure the Thing somehow?

The complete configuration of the Thing is:

UID: ipcamera:onvif:xxxxxxx
label: Frontdoor Camera - ONVIF IP Camera
thingTypeUID: ipcamera:onvif
configuration:
  mjpegOptions: -q:v 5 -r 2 -vf scale=640:-2 -update 1 -vf transpose=2
  ipAddress: 192.168.xx.xx
  updateImageWhen: "0"
  gifPreroll: 0
  onvifPort: 80
  ffmpegLocation: /usr/bin/ffmpeg
  ipWhitelist: DISABLE
  mp4OutOptions: -c:v copy -c:a copy
  pollTime: 2000
  password: "<password>"
  port: 80
  snapshotOptions: -an -vsync vfr -q:v 2 -update 1
  ptzContinuous: false
  onvifMediaProfile: 0
  gifOutOptions: -r 2 -filter_complex
    scale=-2:360:flags=lanczos,setpts=0.5*PTS,split[o1][o2];[o1]palettegen[p];[o2]fifo[o3];[o3][p]paletteuse
  hlsOutOptions: -strict -2 -f lavfi -i aevalsrc=0 -acodec aac -vcodec copy
    -hls_flags delete_segments -hls_time 2 -hls_list_size 4
  username: <user>
location: Frontdoor

Try entering ffmpeg (all lowercase) as the snapshot URL. This should force the binding not to use the camera's URL and to use the RTSP stream to create a snapshot.

I don’t know exactly what you mean. Can you specify the configuration parameter/option you mean, please?
Do you mean a snapshotUrl override maybe?

It is documented in the binding's Thing configuration, see snapshotUrl. You can add it via the textual/JSON config, or it can be seen in the UI if you tick SHOW ADVANCED.

Normally you would rotate the image by going into the camera's setup webpage directly. However, if your camera does not have that ability and you want to use ffmpeg, then you will need to stop the binding from using the camera's snapshot directly and force it to create a snapshot from the RTSP stream, which will increase your CPU load. Give it a go and see how it is, but getting a camera that can rotate on board would be worth the investment IMHO; we all have different requirements. If you get it working, be sure to post what worked to help others who find your post.

Yes, you are right about the camera, but my model does not support it, so I want to give this a try.

I was wondering whether it could work the way you described, given the label of snapshotUrl: “Leave blank to use the autodetected URL for snapshots, or enter a HTTP URL to where a snapshot can be seen if entered into any browser.”

You wrote the RTSP stream will be used, but I can only provide an HTTP URL?!

But I tried:

  snapshotUrl: http://192.168.xx.xx/mjpeg/snap.cgi?chn=0
  snapshotOptions: -an -vsync vfr -q:v 2 -update 1 -vf transpose=2

and I only see this in the log:

21:29:22.432 [DEBUG] [camera.internal.servlet.CameraServlet] - GET:/ipcamera.jpg, received from 192.168.178.45
21:29:22.446 [TRACE] [mera.internal.handler.IpCameraHandler] - Sending camera: GET: http://192.168.xx.xx:80/mjpeg/snap.cgi?chn=0
21:29:22.472 [TRACE] [mera.internal.handler.IpCameraHandler] - Sending camera: GET: http://192.168.xx.xx:80/mjpeg/snap.cgi?chn=0

And the result is always the same: no rotated image. It looks like ffmpeg isn’t used at all; I do not see any log entries regarding ffmpeg.

Only when I open the camera’s openHAB MJPEG URL do I see something like:

21:53:59.392 [DEBUG] [camera.internal.servlet.CameraServlet] - GET:/ipcamera.mjpeg, received from 192.168.xx.xx
21:53:59.399 [DEBUG] [camera.internal.servlet.CameraServlet] - First stream requested, opening up stream from camera
21:53:59.467 [DEBUG] [nhab.binding.ipcamera.internal.Ffmpeg] - Starting ffmpeg with this command now:-rtsp_transport tcp -hide_banner -i rtsp://<user>:********@192.168.xx.xx:554/live/ch0 -q:v 5 -r 2 -vf scale=640:-2 -update 1 -vf transpose=2 http://127.0.0.1:8080/ipcamera/f7b653e0f0/ipcamera.jpg

Do not give a URL. As per the readme/documentation for the binding, you need to enter ‘ffmpeg’, and then it will use ffmpeg and not fetch the snapshot from an auto-discovered or supplied URL. Please read the docs as they should have all the info in them; if not, then let’s work on adding what is missing.

If your ipcamera.mjpeg is working, then you can get a snapshot from the same source by doing this.

I misunderstood you somehow…

Now I tried:

    ffmpegInput: rtsp://192.168.xx.xx:554/live/ch0
    ffmpegInputOptions="-f mjpeg",

and also without the ffmpegInputOptions, but with no luck.
Is that what you meant?

I think he means you should try

snapshotUrl: ffmpeg

Text from Readme
Leave this empty to auto detect the snapshot URL if the camera has ONVIF, or enter a HTTP address if you wish to override with a different address. Setting this to ffmpeg forces the camera to use FFmpeg to create the snapshots from the RTSP stream.

Oh man, how stupid can I be…
Thanks a lot for all your feedback. This works.

Current configuration:

UID: ipcamera:onvif:f7b653e0f0
label: Frontdoor Camera - ONVIF IP Camera
thingTypeUID: ipcamera:onvif
configuration:
  mjpegOptions: -q:v 5 -r 2 -vf scale=640:-2 -update 1 -vf transpose=2
  ipAddress: 192.168.xx.xx
  updateImageWhen: "0"
  gifPreroll: 0
  onvifPort: 80
  ffmpegLocation: /usr/bin/ffmpeg
  ipWhitelist: DISABLE
  mp4OutOptions: -c:v copy -c:a copy
  pollTime: 2000
  password: "<pw>"
  port: 80
  snapshotUrl: ffmpeg
  snapshotOptions: -an -vsync vfr -q:v 2 -update 1 -vf transpose=2
  ptzContinuous: false
  onvifMediaProfile: 0
  gifOutOptions: -r 2 -filter_complex
    scale=-2:360:flags=lanczos,setpts=0.5*PTS,split[o1][o2];[o1]palettegen[p];[o2]fifo[o3];[o3][p]paletteuse
  hlsOutOptions: -strict -2 -f lavfi -i aevalsrc=0 -acodec aac -vcodec copy
    -hls_flags delete_segments -hls_time 2 -hls_list_size 4
  username: <username>

This rotates the MJPEG stream as well as the jpg URL:
http://:8080/ipcamera/f7b653e0f0/ipcamera.mjpeg
http://:8080/ipcamera/f7b653e0f0/ipcamera.jpg

System:

  • Raspberry Pi 3B+

Installed SW:

  • openHAB 3
  • Homematic
  • MQTT
  • Zigbee
  • … many others

CPU Load:

  • For a snapshot (jpg) request the CPU load is less than 1 (out of 4)
  • For the MJPEG stream the load is around 3

The CPU load is okay for me, because I only look at it for a short time and close it afterwards, but @matt1 you are right, the CPU load increases heavily.

Does anyone have an idea how to stop ffmpeg from encoding during the times when I am not using the image or video? It is always running in the background…
