How to display RTSP streams from IP Cameras in Openhab and Habpanel (Linux only)

It works with yi-hack-v4

ffmpeg -rtsp_transport tcp -i 'rtsp://192.168.1.61/ch0_1.h264' http://127.0.0.1:8090/camera1.ffm

Note:
MaxBandwidth 10000 in /etc/ffserver.conf
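For context, MaxBandwidth is a directive in the global section of ffserver.conf, next to the port the command above pushes the feed to. A minimal sketch of that section (the client and bandwidth limits here are assumptions to adapt):

```
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 10000
```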

Thanks!
Just what I wanted to start using.

However one question:
What happens if you have multiple devices? This will always be running, consuming (I think) a measurable amount of CPU. Has anyone tried this with more than one camera on an RPi?

That is why some people:

  1. Buy cameras that already create MJPEG streams; the IpCamera binding has brands listed that support this. You can then have multiple cameras with zero CPU load.
  2. Set up a dedicated video server to do the conversion and provide other great features and nice ways to review the footage. The cost of the server and the ongoing electricity costs can add up, making option 1 the better choice for some people.
  3. Use HLS streams instead; see the IpCamera binding, as it can do this from an RTSP source and the readme file covers it.
  4. Buy a new server for Openhab that has Intel Quick Sync features, allowing ffmpeg to do some processing with hardware acceleration.
  5. Only do this when needed; see this post made only a few days ago:
    How to display RTSP streams from IP Cameras in Openhab and Habpanel (Linux only)
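For option 3, ffmpeg alone can repackage an RTSP feed into HLS without re-encoding, which keeps CPU load low. A command sketch, where the camera URL and output path are assumptions (the IpCamera binding readme covers the binding-managed route):

```shell
# Copy the H.264 stream into rolling HLS segments without re-encoding.
# Any web server can then serve the .m3u8 playlist and .ts segments.
ffmpeg -rtsp_transport tcp -i 'rtsp://192.168.1.61/ch0_1.h264' \
  -c:v copy -an \
  -f hls -hls_time 2 -hls_list_size 3 -hls_flags delete_segments \
  /var/www/html/camera1/stream.m3u8
```

Because `-c:v copy` avoids decoding and re-encoding, CPU use stays minimal even on an RPi.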

Thanks I will first try and see how this works.
Basically I don’t want to invest too much into this (that’s why I bought a cheap Xiaomi camera, which only supports RTSP, and will hack the firmware), because these will only be indoor cameras (a few at most, max 3) and I don’t want to record anything, just have a view when I’m away.

This is what I wanted to ask, but it seems that there is no way to detect when a specific sitemap/group is viewed in BasicUI…

Also ffserver is not available on RPi. Did you use this with an RPi, and if so, how?

Isn’t ffserver part of the ffmpeg package?

Yes, it is installed when you install ffmpeg; however, it is Linux-only and was never built for the other platforms. Motion, Shinobi and a few others can also do this and more.

Lots of ways you can choose to go.

I successfully installed ffmpeg, but ffserver is not there. Could it be that ffserver has been removed from the package in the current version? If so, is there an alternative way to get your approach working?

That is possible, as ffserver was deprecated and later dropped from the package (it was removed in FFmpeg 4.0), and the ffmpeg website recommends other approaches; just google it or use the packages I mentioned in my last reply.

However, give me a few days and I will see how easy it is to code into the ipcamera binding.

That would be great if this could be made in the binding!
My (hacked) Xiaomi camera also supports ONVIF. Can I use that with the IP camera binding?

Try the latest ipcamera binding. I put in a bit of code that needs testing to look for bugs, but it is working for me if you only open it once. The ability to open the stream on multiple devices at the same time will be looked at over the next few days. If you leave the normal stream source field empty, it will now use rtsp and ffmpeg to create the mjpeg.

@rkrisi
If the camera does not have bugs in its ONVIF support, it should work and even get auto-discovered by the binding now. If the camera breaches ONVIF requirements or does not report a snapshot URL, it can still be used by setting it up manually as a HTTPONLY thing type.

@z4nD4R
Thanks for sharing your method; it is really good to have multiple ways to choose from here in the forum for people to find. The binding can now create the stream on demand without the user needing to push a button: if the UI is open and needs the stream, the CPU load of creating it is there. To disable it like your method does when the UI is left open, you would need to use a String item, which a button can clear or fill, as the source for the Video item. However, it is far easier to just close the UI, and the binding stops.

Hi,

I’m working with the IP Camera binding now and so far I have a couple of troubles :slight_smile: First of all, I can’t get MJPEG working (even though it is a Foscam camera; the supplier informed me that MJPEG is no longer supported, which looks to be true, as I’m not able to turn on the MJPEG-type stream via CGI command). Will the binding perform the conversion for me in this case? (I have set up the FFMPEG input and output.)

Second issue is the zoom motion detection looks like not working at all, any suggestion?

Y

Zoom motion? Never heard of that, so it is probably a feature that is not implemented in their API?

@z4nD4R
Next version of the binding you will be able to do it with any thing type.

Until then you could set the camera up a second time as HTTPONLY, or you could just use the snapshots.mjpeg feature outlined in the readme.

Sorry, I wrote it wrongly, without commas; the second issue is that features like zoom, motion detection, enabling/disabling motion detection, pan, tilt and infrared LED control aren’t working.

The issue is that my Foscam does not support MJPEG (as I wrote above), and I’d love to use RTSP → MJPEG conversion on the fly when the UI is opened.

I will try the suggested method with the snapshots.mjpeg feature.

Y

It would be a good idea to test whether your camera supports the API by entering the URLs into any browser. Also contact Foscam support, as it may be as simple as ticking a box to enable the CGI/API, but as I don’t own a Foscam I do not know.
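As a sketch of that browser test from the command line: Foscam HD cameras expose their CGI API under CGIProxy.fcgi. The host, port and credentials below are placeholders to replace with your own:

```shell
# Ask the camera for its device state; a Foscam that supports the CGI API
# answers with an XML result instead of an error page.
curl 'http://192.168.1.50:88/cgi-bin/CGIProxy.fcgi?cmd=getDevState&usr=admin&pwd=password'
```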

Hi, sorry for the dumb questions, I’m totally confused now by everything said in this topic, but can you please point me in the right direction? I have several “yoosee” cameras. I have installed the IP Camera binding and added a HTTPONLY Generic IP Camera by simply entering the IP, username and password. The camera seems to be online, as the Thing states. Then I opened the Thing and tried to link channels (I tried RTSP and Image), but the frame with video does not appear, neither in Control nor on the BasicUI page. Maybe I’m doing it wrong, but how do I get a frame, say 200x300, with video or a picture from the camera on the BasicUI page? My RTSP string, which works in VLC, is:
rtsp://user:password@192.168.1.119:554/onvif1
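If VLC plays that URL, ffprobe (installed alongside ffmpeg) can confirm the same stream from the command line, which rules out the camera before debugging the binding. A sketch reusing your URL:

```shell
# List the streams ffmpeg can find in the RTSP feed; you should see at
# least one video stream (h264) in the output.
ffprobe -rtsp_transport tcp -i 'rtsp://user:password@192.168.1.119:554/onvif1'
```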

Hi
I am new to this. I have an RPi 3+, and I successfully installed your IP Camera binding and added a Prikim camera to my openHAB. Then I wanted to be able to see my stream through Habpanel, so I followed your installation process, but it doesn’t show up. From what I understand you want people to tweak the information in here, but I am not sure what:

<Feed camera1.ffm>
ACL allow localhost
#uncomment if you want an IP range to be able to access the stream
#ACL allow 192.168.0.0 192.168.255.255
</Feed>

<Stream camera1.mjpg>
Feed camera1.ffm
Format mpjpeg
VideoFrameRate 6
VideoSize 640x360
VideoBitRate 2048
VideoIntraOnly
NoAudio
Strict -1
</Stream>

This is what appears in Habpanel.

Also, what should I test in the two SSH windows? Thank you for your help.
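For what it’s worth, the “two SSH windows” in the original guide would be running roughly these two processes (a sketch; the config path and camera URL are assumptions taken from earlier in the thread):

```shell
# Window 1: start ffserver with the configuration file shown above.
ffserver -f /etc/ffserver.conf

# Window 2: feed the camera's RTSP stream into the camera1.ffm feed,
# which ffserver then serves as camera1.mjpg.
ffmpeg -rtsp_transport tcp -i 'rtsp://192.168.1.61/ch0_1.h264' \
  http://127.0.0.1:8090/camera1.ffm
```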

Install this widget and read the ipcamera binding’s readme file on how to create MJPEG with ffmpeg.


https://trac.ffmpeg.org/wiki/ffserver#Creatingtheconfigurationfile

What do we do now? Is there a way to do all of this without having to use an old, discontinued, unsupported piece of software?

Your answer is literally in the post above yours. The ipcamera binding can do it; you no longer need to use ffserver.