How to configure a hacked Xiaomi Dafang to work with OpenHAB

I was hoping to use my old iPad as a HABPanel display today, but I struggled all day to get a picture feed from my newly hacked Xiaomi Dafang camera into HABPanel with Chrome/Safari on my iPhone/iPad.

http://xxx.xxx.x.xx/cgi-bin/currentpic.cgi
This string finally got a picture in my sitemap/HABPanel when I viewed it on my PC with Chrome.
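
In the sitemap, that kind of snapshot URL goes into an Image element, roughly like this (the refresh interval is just an example value):

Image url="http://xxx.xxx.x.xx/cgi-bin/currentpic.cgi" refresh=5000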

But nothing on the iPhone.
Any tips?

Any way to control the microphone or the speaker with this hack?

That is interesting. I mainly use Windows and Android so had not encountered this problem.

I had issues displaying the picture feed on Android until I disabled HTTP to HTTPS redirection on the camera. The process for doing that is mentioned in the third post. Have you tried that yet?

Yes, but as far as I am aware it does not support push-to-talk functionality. You can play pre-loaded audio files and initiate an audio recording. Either of these could be started manually or by adding a command to a script.

From the Dafang Hacks FAQ:

Cool thanks!

Thanks for the information. I will try to export the pictures stored on the SD card to my NAS. Maybe this way I can somehow embed the picture in HABPanel.
I did not get MQTT to work with the new binding released for v2.4 (lack of skill on my side, unfortunately). Is there any way to test/log this in openHAB (for example, to see in the openHAB logs whether the cam items receive updates)?

Not to worry, I have updated the first post in this thread with a working configuration for use with the new MQTT 2.4 Binding. :wink:

Awesome! Thanks a lot, now everything seems to be working

It is even easier, as I just found out. No need to change or write any scripts. The motion snapshot is already published to the topic “myhome/dafang/motion/snapshot”.
In the camera's configuration file you just need to set

save_snapshot=true
publish_mqtt_message=true

Then you can add to the sample you provided in the first post:

mqtt.things:
Type image : motionimage "Motion Image" [ stateTopic="myhome/dafang/motion/snapshot" ]

camera.items:
String Cam01_Motion_Image "Motion Image" <snapshot> (GF_Camera_01) {channel="mqtt:topic:mosquitto:somecamera:motionimage"}
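
For completeness, that channel lives under the broker bridge, so a full mqtt.things could look roughly like this (the broker host is a placeholder; additional channels can be added to the same Channels: block):

Bridge mqtt:broker:mosquitto "Mosquitto" [ host="192.168.1.10", secure=false ] {
    Thing topic somecamera "Dafang Camera" {
        Channels:
            Type image : motionimage "Motion Image" [ stateTopic="myhome/dafang/motion/snapshot" ]
    }
}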

I have not added it to the sitemap yet, but it displays perfectly as an image widget in HABPanel.
Thanks again for the help and links!


Hi,
I’m also interested in this topic, but I need extra help to create the mosquitto thing.
Would you please give me some tips about the steps I need to do to configure the Dafang and the mosquitto broker?

Where are you stuck? Is the firmware already running on your camera? For the MQTT part, Max explained everything that needs to be done (see also the links to the binding docs).

Sorry for the stupid questions:
The firmware is running on the camera.

  • Should I change some parameters in root\config\mqtt.conf?
  • Where should I copy the files mqtt.things and camera.items? OpenHAB or Dafang?

After you have installed the broker for openHAB, you add its address to the conf file on the camera. The other files you mentioned are created on the openHAB side, in the usual folders for things and items.
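
If you want to check the camera side before touching any openHAB files, you can install the broker and watch the topics from the command line, for example (broker host, credentials and topic prefix are placeholders):

# install the Mosquitto broker and command-line clients (Debian/Ubuntu/openHABian)
sudo apt-get install mosquitto mosquitto-clients
# print every message the camera publishes (add -u/-P if your broker requires credentials)
mosquitto_sub -h localhost -t "myhome/dafang/#" -v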

Have you tried this with the non-panning original Xiaofang? Dafang Hacks has support for the Wyze V2/Xiaofang 1S… so, will this example work without the panning commands? Has anyone tried it?

I just hacked two WYZE V2 cameras and am hoping to get them working with HABPanel. Has anyone had any success? On a Mac, I can display the web interface, but for dashboard purposes it is not great because the video has a large frame around it. I’d like to just have the stream fill the panel. I ultimately want to use the dashboard on an iPad and Raspberry Pi.

Have you tried this method at all? If it works in VLC then I would be surprised if these steps didn’t work.

OK people, I managed to get the video stream from the hacked Dafang into openHAB (also in the iOS app)!!

First you need to transfer the video through ffmpeg, as described here:

On the camera you need to set the stream server to H.264. Then start ffserver and ffmpeg:

nohup ffserver &
nohup ffmpeg -i rtsp://xxx.xxx.xx.xxx:8554/unicast http://127.0.0.1:8090/camera1.ffm &

For xxx you need to enter your camera’s IP address.
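
These commands assume an ffserver.conf that defines the camera1.ffm feed and an MJPEG stream called camera1.mjpg; a minimal sketch could look like this (resolution, frame rate and bitrate are example values):

HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 5
MaxBandwidth 10000

<Feed camera1.ffm>
    File /tmp/camera1.ffm
    FileMaxSize 5M
</Feed>

<Stream camera1.mjpg>
    Feed camera1.ffm
    Format mpjpeg
    VideoFrameRate 5
    VideoSize 640x480
    VideoBitRate 2048
    NoAudio
</Stream>

Note that ffserver was removed in FFmpeg 4.0, so this approach needs an older FFmpeg build that still ships it.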

Then in the sitemap you need to use a Webview:

Webview url="http://xxx.xxx.xx.xxx:8090/camera1.mjpg" height=5

Here xxx must be the IP address of the machine running ffserver (e.g. your openHABian box).

And you will get your stream working! :wink:

I have had similar issues displaying video with my foscams. One of the big issues is that the RTSP feeds have a big delay on them, which is not very useful for live video feeds.

The way I solved it is to use motion (which I’m sure you are familiar with) https://motion-project.github.io/ but I changed the motion source code to read directly from my Foscam firmware. This isn’t a Foscam firmware hack; Foscam provides undocumented low-level (socket-level) access to the H.264 video feed (and other features).

So I added access to this video feed to motion, and then serve up the MJPEG feeds from motion (at 640x480 resolution) via web pages. These web pages can be embedded in any openHAB page using the Webview widget in the sitemap. Works on everything I have tried: web, iOS, Classic UI etc. (I haven’t tried Android).

I also use motion for motion detection, motion tracking (with my foscam pan-and-zoom cameras), etc, all the usual motion good stuff.

You could do this with the normal motion interface (which supports RTSP); I didn’t use that interface because of the 20 s lag on the Foscam RTSP feed, but there may not be the same lag on other cameras.
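
For anyone who wants to try the plain RTSP route, a per-camera motion config along these lines is enough to turn the camera feed into an MJPEG stream that a Webview can display (URL, port, resolution and threshold are placeholders, and directive names vary slightly between motion versions):

# camera source: RTSP feed from the camera firmware
netcam_url rtsp://xxx.xxx.xx.xxx:8554/unicast
width 640
height 480
framerate 5

# MJPEG stream served by motion, reachable from other hosts
stream_port 8081
stream_localhost off
stream_maxrate 5

# motion detection tuning and text overlays
threshold 1500
text_left CAMERA 1
text_right %Y-%m-%d %T

The resulting stream can then be embedded with the Webview element mentioned above, e.g. Webview url="http://<motion-host>:8081" height=5.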

This is similar to the ffserver/ffmpeg method, but motion is designed for cameras, can handle multiple cameras, and allows lots of customization: text overlays (I overlay the weather), motion highlighting, motion detection, motion tracking (if you have the right interface - I added my own Foscam one). It is also intended to run as a daemon.

It’s a bit processor intensive: running 5 cameras takes about 60% of one of my (2.4 GHz) server cores, but overall that’s not bad.

I’ve attached some pictures of what it looks like; the screenshot is of my iPad (live 640x480 video) - recordings are in full HD.

The other nice thing is that, because it’s served up in a web page, you can lay it out how you like and trigger actions when you click on an image. Here there are 4 live video feeds; clicking on any of them zooms that feed to full screen, and clicking again goes back to the 4-up display. You need your own web server and a little HTML skill, but not too much, and most people running OH2 have their own web server.

Hope this helps!


@Nicholas_Waterton
Motion is definitely a good way to go, especially if you have multiple cameras, and if they don’t have motion detection available via an API then it is a must to check out. I have stumbled onto people who use ffmpeg and a filter to do motion detection via scripts, but they say the detection is not great and the Motion project has far better motion detection than that hack method.

What may interest you is that the IpCamera binding now has ffmpeg features built in, which allow you to cast your camera feeds to Chromecasts with no scripts and no extra servers, and it should work on all platforms, not just Linux.

My Dafang should arrive soon so I’m going to start watching this thread

Hi all,
My V2 cam is working fine and the integration in OH2 is working too. So far so good.
The motion detection via MQTT is working and I got a picture in PaperUI. Thanks

Now I’m trying to deactivate the motion detection via a rule, but I have found no command that deactivates it.
I have a thing like this:

Type switch : motion "Motion Detection" [ stateTopic="zennix/ipc1/motion/detection", commandTopic="zennix/ipc1/motion/detection/set" ]

and an Item like this:

Switch IPC1_motion_detection "IPC1 Alarm" {channel="mqtt:topic:Mosquitto:IPC1:motion"}

This item is added to a PaperUI control to send ON or OFF to the item, but the motion detection will not deactivate.
What is wrong with my construct?

Thanks / Zennix