How to display RTSP streams from IP Cameras in Openhab and Habpanel (Linux only)

I believe I have seen this before, and it was solved by adding this to the INPUT options, which I will edit into my first post so you can use it as an example…

The issue is that your network is not delivering the UDP packets in time, so using TCP instead fixes it.

-rtsp_transport tcp
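To show where the flag goes: it is an input option, so it must appear before `-i`. A quick connectivity test might look like this (USER, PWD, IP, PORT and the stream path are placeholders for your camera, not values from this thread):

```shell
# -rtsp_transport tcp is an INPUT option, so it must come before -i.
# Read the stream for 5 seconds and discard it, just to verify the connection.
ffmpeg -rtsp_transport tcp -t 5 -i 'rtsp://USER:PWD@IP:PORT/videoSub' -f null -
```

If this runs without errors about missed packets, the TCP transport is working.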

Hi, I spent my afternoon on this trying to understand…

I followed the steps from the first post,
read multiple articles about ffmpeg and ffserver…
and I still can't view my webcam in my sitemap in H264.

If someone can help me, I'll appreciate it…

  • Openhabian
  • RPI3
  • my webcam model is foscam Fi9826p V2
  • the HTTP port is 4143
  • webcam IP
  • ONVIF port 1024
  • RTSP format: rtsp://USER:PWD@IP:PORT/videoSub
  • the sub stream is configured in h264

my ffserver.conf

HTTPPort 8090
MaxClients 10
MaxBandwidth 1000

<Feed camera1.ffm>
File /tmp/camera1.ffm
ACL allow localhost
#uncomment if you want an IP range to be able to access the stream
#ACL allow
</Feed>

<Stream camera1.mjpg>
Feed camera1.ffm
Format mpjpeg
VideoFrameRate 10
VideoSize 640x360
VideoBitRate 2048
Strict -1
</Stream>

<Stream stat.html>
   Format status
   ACL allow localhost
   ACL allow
</Stream>
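As a sanity check for a config like the one above, you could start ffserver and hit the status page; if that returns, the server side is at least up (the config path and port here are assumptions based on this post):

```shell
# Start ffserver with the config above (path assumed), give it a moment,
# then fetch the first lines of the status page defined in <Stream stat.html>.
ffserver -f /etc/ffserver.conf &
sleep 2
curl -s http://localhost:8090/stat.html | head -n 20
```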

My binding configuration:

When I launch

ffmpeg -rtsp_transport tcp -i 'rtsp://USER:PWD@IP:PORT/videoSub'

it reports this log:

ffmpeg version 3.2.14-1~deb9u1+rpt1 Copyright (c) 2000-2019 the FFmpeg developers
  built with gcc 6.3.0 (Raspbian 6.3.0-18+rpi1+deb9u1) 20170516
  configuration: --prefix=/usr --extra-version='1~deb9u1+rpt1' --toolchain=hardened --libdir=/usr/lib/arm-linux-gnueabihf --incdir=/usr/include/arm-linux-gnueabihf --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-omx-rpi --enable-mmal --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --arch=armhf --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 34.101 / 55. 34.101
  libavcodec     57. 64.101 / 57. 64.101
  libavformat    57. 56.101 / 57. 56.101
  libavdevice    57.  1.100 / 57.  1.100
  libavfilter     6. 65.100 /  6. 65.100
  libavresample   3.  1.  0 /  3.  1.  0
  libswscale      4.  2.100 /  4.  2.100
  libswresample   2.  3.100 /  2.  3.100
  libpostproc    54.  1.100 / 54.  1.100
Guessed Channel Layout for Input Stream #0.1 : mono
Input #0, rtsp, from 'rtsp://USER:PWD@':
    title           : IP Camera Video
    comment         : videoSub
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: h264 (Main), yuv420p(progressive), 1280x960, 90k tbr, 90k tbn, 180k tbc
    Stream #0:1: Audio: pcm_mulaw, 8000 Hz, mono, s16, 64 kb/s
End of file
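One thing worth noting about the command that produced this log: it has no output specified, so ffmpeg only probes the input and stops (hence "End of file"). To actually publish into ffserver, the feed URL must be given as the output. A hedged sketch, assuming the ffserver config from the first post is running on port 8090 with a feed named camera1.ffm:

```shell
# Pull the RTSP substream over TCP and push it into the ffserver feed;
# ffserver then transcodes it to MJPEG according to its <Stream camera1.mjpg>
# section. USER, PWD, IP, PORT are placeholders for your camera.
ffmpeg -rtsp_transport tcp -i 'rtsp://USER:PWD@IP:PORT/videoSub' \
       http://localhost:8090/camera1.ffm
```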

thanks in advance


Only with these brands? Are you sure? I have an RTSP link from my Reolink. Will it not work?

I have tried, but after starting I get

[21:12:37] openhabian@openhab:~$ ffserver
-bash: ffserver: command not found

I'm on OH 2.5M4 and a Raspberry Pi 4.

So why does it not start?

I have had this method working with Hikvision as well, so it works for many brands. What you quoted was from the example I posted on how to get Dahua and Amcrest working. Now that I have added better methods to the ipcamera binding, I would use the binding for cameras which have HTTP URLs for the streams. If your camera only has RTSP stream access, then this thread would be the way to go, unless you set up another server to do the conversion using Blue Iris or the Motion project on a Pi or similar.

Please help me with ffserver and ffmpeg.
I tried what you wrote earlier, but I don't understand how it worked for you.
First, without HTTPPort I couldn't start ffserver.
I used the config from the first post, and camera1.ffm doesn't exist (404),
but camera1.mjpg is 200.
If I point ffmpeg at http://ip:8090/camera1.mjpg I get:
av_interleaved_write_frame(): Broken pipe
Error writing trailer of http://…/camera1.mjpg: Broken pipe

What did I do wrong?

I don’t know what is different, but this is working now:

HttpPort 8090
MaxHTTPConnections 20
MaxClients 5
MaxBandwidth 100000
CustomLog -

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
ACL allow
</Feed>

<Stream test1.mpg>
Feed feed1.ffm
Format mjpeg
AudioBitRate 32
AudioSampleRate 16000
VideoSize 2560x1440
</Stream>

Hello All,

first of all, I'd like to thank you for pointing me to this solution. I want to share a solution I "created" which someone might find useful, especially regarding high CPU usage.

Even though it is not an issue in my setup (my HPE MicroServer handles the streaming with no problem), I believe it is a waste of resources when 30% of the CPU is used while nobody is watching the stream.
Solution: turn on streaming on demand.

  1. Configure ffmpeg as outlined in this post
  2. Create a dummy switch for turning on the streaming - Dummy/Virtual item without a binding
  3. Create a simple script that runs the stream from the camera and save it e.g. in /etc/openhab2/scripts/
timeout 60 ffmpeg -rtsp_transport tcp -i 'rtsp://rstp_cam_rstp_stream'
  4. Create a simple rule like:
rule "Streaming on"
when
	Item dummy_switch changed from OFF to ON
then
	// run the script from step 3 (path is a placeholder)
	executeCommandLine("/etc/openhab2/scripts/stream_on.sh")
end
  5. Assign the dummy switch to a HABPanel dashboard to turn on streaming on demand.
  6. Enjoy the result

PS1: the streaming will be available for 60 seconds
PS2: saving the PID will also help us turn off the streaming on demand.

Hopefully you will find it helpful.
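The script from the steps above could be sketched like this. The 60-second window and the scripts directory come from the post; the PID file and its name are a hypothetical addition implementing the idea from PS2:

```shell
#!/bin/bash
# /etc/openhab2/scripts/stream_on.sh (filename assumed, not from the post)
# Stream the camera for at most 60 seconds, remembering the PID so a
# matching "off" switch could stop it early with: kill "$(cat /tmp/cam_stream.pid)"
PIDFILE=/tmp/cam_stream.pid
timeout 60 ffmpeg -rtsp_transport tcp -i 'rtsp://USER:PWD@IP:PORT/videoSub' \
        http://localhost:8090/camera1.ffm &
echo $! > "$PIDFILE"
```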



It works with yi-hack-v4

ffmpeg -rtsp_transport tcp -i 'rtsp://'

MaxBandwidth 10000 in /etc/ffserver.conf

Just what I wanted to start to use.

However, one question:
What happens if you have multiple devices? This will always run, consuming (I think) a measurable amount of CPU. Has anyone tried using this with more than one camera on an RPi?

That is why some people:

  1. Buy cameras that already create mjpeg streams - the IpCamera binding lists brands that support this. You can then have multiple cameras with no transcoding load on the server.
  2. Setup a dedicated video server to do the conversion and provide other great features and nice ways to review the footage. Cost of the server and ongoing electrical costs can add up and make it better to choose option 1 for some people.
  3. Use HLS streams instead, see IpCamera binding as this can do it from a rtsp source and the readme file covers it.
  4. Buy a new server for Openhab that has intel quicksync features to allow ffmpeg to do some processing with hardware acceleration.
  5. Only do this when needed, see this post made only a few days ago:
    How to display RTSP streams from IP Cameras in Openhab and Habpanel (Linux only)

Thanks I will first try and see how this works.
Basically, I don't want to invest too much into this (this is why I bought a cheap Xiaomi camera, where I will hack the firmware, which only supports RTSP), because these will only be indoor cameras (a few at most, max 3), I don't want to record them, and I just want a view when I'm away.

This is what I wanted to ask, but it seems that there is no way to detect when a specific sitemap/group is viewed in BasicUI…

Also, ffserver is not available on RPi. Did you use this with an RPi, and if yes, how?

Isn't ffserver part of the ffmpeg package?

Yes, it is installed when you install ffmpeg; however, it was only ever available on Linux and was never built for the other platforms. Motion, Shinobi and a few others can also do this and more.

Lots of ways you can choose to go.

I did successfully install ffmpeg, but ffserver is not there. Could it be that ffserver has been removed from the package in the current version? If so, is there an alternative way to get your approach working?

That is possible, as ffserver was deprecated and then removed from FFmpeg (in version 4.0), and the ffmpeg website recommends other ways; just google it or use the packages I mentioned in my last reply.
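For what it's worth, one lightweight alternative that avoids ffserver entirely is ffmpeg's own HTTP output in listen mode, which can serve a single MJPEG client without any extra server. This is a sketch, not something from the thread; the flags, port and path are assumptions you would need to verify against your ffmpeg version:

```shell
# Transcode the RTSP stream to multipart MJPEG and serve it to one HTTP
# client on port 8090 (ffmpeg's http "listen" mode serves a single client).
# USER, PWD, IP, PORT and the path are placeholders.
ffmpeg -rtsp_transport tcp -i 'rtsp://USER:PWD@IP:PORT/videoSub' \
       -an -c:v mjpeg -q:v 5 -f mpjpeg -listen 1 http://0.0.0.0:8090/camera.mjpg
```

You could then point a Video item or HABPanel widget at http://ip:8090/camera.mjpg, with the caveat that only one viewer at a time is supported.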

However give me a few days I will see how easy it is to code into the ipcamera binding.

That would be great if this could be made in the binding!
My (hacked) Xiaomi camera also supports ONVIF. Can I use that with the IP camera binding?

Try the latest ipcamera binding; I put in a bit of code that needs testing to look for bugs, but it is working for me if you only open it once. The ability to open the stream on multiple devices at the same time will be looked at over the next few days. If you leave the normal stream source field empty, it will now use rtsp and ffmpeg to create the mjpeg.

If the camera does not have bugs in its ONVIF support, it should work and should even get auto-discovered by the binding now. If the camera breaches ONVIF requirements or does not report a snapshot URL, it can still be used by setting it up manually as an HTTPONLY thing type.

Thanks for sharing your method; it is really good to have multiple ways to choose from here in the forum for people to find. The binding can now create the stream on demand without the user needing to push a button: if the UI is open and needs the stream, the CPU load of creating the stream is there. To disable it like your method does while the UI is left open, you would need to use a String item, which a button can clear or fill, as the source for the Video item. However, it is far easier to just close the UI, and the binding stops.


I'm working with the IpCamera binding now, and so far I have a couple of troubles :slight_smile: First of all, I can't seem to get MJPEG (even though it is a Foscam camera - the supplier informed me that MJPEG is no longer supported, which looks true, as I'm not able to turn on the MJPEG stream type via CGI command). Will the binding perform the conversion for me in this case? (I have set up the FFMPEG input and output.)

Second issue is the zoom motion detection looks like not working at all, any suggestion?


Zoom motion? Never heard of that, so it is probably a feature that is not implemented in their API?

Next version of the binding you will be able to do it with any thing type.

Until then you could set the camera up a second time as httponly, or you could just use the snapshots.mjpeg feature Outlined in the readme.

Sorry, I wrote it wrongly without commas; the second issue is that features like zoom, motion detection, enable/disable motion detection, pan, tilt and infrared LED control aren't working.

The other issue is that my Foscam does not support MJPEG (as I wrote above), and I'd love to use RTSP > MJPEG conversion on the fly when the UI is opened.

I will try the suggested method with the snapshots.mjpeg feature.