Running openHAB in the cloud using a Raspberry Pi 0 W as a local USB Server

Hi Everyone,

I’m currently running openHAB 2.5 on a local Raspberry Pi 3 B+ with few issues to date. My setup is pretty straightforward, and the only home automation devices on my network are Z-Wave and Zigbee. These devices are controlled by a Nortek USB controller.

Given the simplicity of this setup and the prevalence of free cloud computing services, I’m starting to investigate moving my openHAB 2.5 installation to the cloud and using a simple USB server to host my Nortek USB controller.

I know this may seem like a Rube Goldberg way of implementing openHAB, but I think there are some benefits to this approach. The first and most important for me is that it moves the openHAB instance to robust storage media that is not susceptible to wear. While local storage media could be used to alleviate the wear concern, some cloud platforms also offer comprehensive backup services. The next benefit is that the hardware is scalable and fault tolerant. The only negative I can think of is that a constant internet connection between my USB server and the cloud server is required.

The setup I’m currently investigating would look something like this:
*A Google Cloud Micro Compute instance running Linux and OpenHAB 2.5
*A Raspberry Pi 0 W with the Nortek USB Controller running the VirtualHere USB over network service

The Google Cloud instance would be configured to access the Nortek USB Controller using VirtualHere or some other USB-over-Internet application.

I’ve searched the forums and haven’t found any references to this kind of setup. I’m wondering if anyone has tried a setup like this in the past, and whether they were successful, before I embark on this endeavor.

One BIG issue is that openHAB 2 has no authentication or authorization security mechanism. openHAB 3 has the very beginnings of authentication & authorization support.

You do not want to open up your home automation system to be controlled by everybody on the Internet!

I can’t comment on the cloud, but let me ask you this: how much automation do you want to retain when you lose the Internet connection?

Our ISP’s fiber line was recently cut, and we lost Internet connectivity for more than a day. That was painful, but all my home automation was still working, as I intentionally chose to run OH locally and also make sure that all devices can operate without needing the cloud.


Think twice if you really want to make yourself, your home and reliability of your local functionality dependent on your internet connection and ‘the cloud’, which - in Kai’s terms - is “just other people’s computers”.
Even more so given it obviously isn’t a no-brainer and chances are your setup will introduce errors you wouldn’t encounter on a local server.

Move to openHABian instead; it has ZRAM to mitigate SD wear-out, plus an elaborate backup concept.

@Bruce_Osborne I appreciate the concern regarding authentication and security. Seeing that I would install openHAB on AWS or Google Cloud, I would be leveraging their security services, which limit access to the virtual machine. One could make the case that this built-in security compensates for any shortfalls in openHAB. In addition, the USB server housed on-prem provides encrypted traffic and certificate authentication. These two things combined would result in a more secure implementation of openHAB than what I’m currently running.

@yfaway I did consider the possible loss of internet. I’m fortunate to live in an area where my ISP is up 99% of the time. A majority of my automations control my lights at sunset and midnight, and I’m not overly concerned if I miss one of those events. Unfortunately, I have had my Raspberry Pi and SD card die while I was on vacation, preventing any control of my lights, which is less than desirable. While it is easy enough to recover from a backup, it still takes time, effort, and the procurement of hardware. My thought was that if openHAB were running in the cloud, the chances of something going wrong with that instance are slim. The Raspberry Pi 0 W USB server would have to be recreated, but that could be a simpler project than trying to recover openHABian.

@mstormi I have been using openHABian but have still been subject to hardware and SD card failures, even after implementing ZRAM. I completely understand that, if implemented, I will be at the mercy of my ISP and cloud provider. I’ve been fortunate to have an ISP with high reliability, and I haven’t had any issues with AWS or Google Cloud. I agree that this configuration is non-trivial, but it does open up some interesting applications for openHAB, such as the ability to control multiple sites from a single openHAB instance.

You’re the first to report that… are you sure ZRAM was installed correctly and working properly?
And that the SD failure was not just bad luck?

openHABian provides a Tailscale VPN for that purpose.

A good option for basic internet control is https://www.openhab.org/addons/integrations/openhabcloud/
I use sitemaps in combination with android and ios apps for “outside” control.

If you want more, look into VPNs. :slight_smile:

@J_Cat definitely give the cloud a try and let us know how that goes :).

The backup need is still going to be there even if you just run a Pi 0 for the USB stick. At the very least, you would need a Pi 0 on standby in case it dies. Now, I do agree with you that since less runs on the Pi 0, the chance it dies is lower compared to a Pi 3 running OH, but it definitely is not zero.

I’ve been lucky so far. My Pi 3 has been rock solid since I started with OH a few years ago. That said, I understand that it could happen, and as such I do regular backups whenever I make changes either to OH or to the OS image itself. For historical reasons, I am not running openHABian. These are the two commands I use to do the backup/restore:

Backup:
sudo dd bs=4M if=/dev/sdb of=~/image.img

Restore:
sudo dd bs=4M if=~/image.img of=/dev/sdb conv=fsync
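
For what it’s worth, the same flow can be demonstrated (and the image compressed with gzip) without touching a real SD card. The sketch below uses a throwaway file in place of /dev/sdb so it runs without root; all paths here are illustrative, not from the post above:

```shell
# Demo of the dd backup/restore flow on a throwaway file instead of a
# real SD card (illustrative paths; no root needed).
set -e
workdir=$(mktemp -d)

# Stand-in for /dev/sdb: a small "disk" with known content
printf 'openHAB config' > "$workdir/disk"

# Backup: read the disk and compress the image on the fly
dd bs=4M if="$workdir/disk" 2>/dev/null | gzip > "$workdir/image.img.gz"

# Restore: decompress and write back to a fresh target
gunzip -c "$workdir/image.img.gz" | dd bs=4M of="$workdir/restored" conv=fsync 2>/dev/null

# Verify the round trip
cmp "$workdir/disk" "$workdir/restored" && echo "restore OK"
```

Piping through gzip also means the stored image is much smaller than the raw card when most of the filesystem is empty.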

Most of us also run something else on the Pi, so the backup need is already there. And as you’ve seen above, it’s not too bad.

@mstormi I’m fairly confident that ZRAM was set up correctly per the instructions provided and that the SD card did fail. I do agree that Tailscale can properly secure the openHAB environment, and I will leverage it for this experiment.

@NilsOF openhab-cloud does not address the issue that I’m trying to solve. What I’m looking to do is have the openHAB system run in the cloud itself and not locally. My thought is that I could create a somewhat dumb device locally that would only be used for controlling my USB Z-Wave/Zigbee device. By moving the openHAB system to the cloud, I would have a little more resilience with regard to the file system.

@yfaway the chances of failure are not zero, but I do think they are lower given that there would be less activity on the Pi 0. If successful, I’ll document the steps taken to get this type of configuration working. Right now I’ve installed openHABian on the cloud and have it configured. I’m just waiting for the Raspberry Pi 0 W to show up to finish the setup.

I don’t want to interfere with the main topic, but maybe you could have a look at my topic: ConnectorIO Cloud - seeking for alpha testers.
We have our own Linux build based on Yocto (not yet ported to ARM/Raspberry Pi). It can be entirely controlled and reduces load on the SD card to a bare minimum. The long-term plan is really to turn the edge device into a gateway. The operating system and device authorization can be set up in a way that results in one write per 10 minutes (adjustable to one per hour) to store the authorization token.

The whole point of it is reducing load on devices running OH. It is still in a fairly early incarnation. Anyhow, looking forward to hearing your opinion there.


Instead of waiting for my Raspberry Pi 0 W, I tried the setup with my existing Raspberry Pi 3. The good news is I was able to control my Z-Wave and Zigbee lights using this setup. Below are the steps that I took. I used Google Cloud for this, but I suspect that AWS or another cloud service could be used.

Creating the Google Cloud Virtual Machine with openHAB
From the Google Cloud Console I created a new Virtual Machine with the following parameters:
Name: openhab-on-cloud
Machine Configuration: Series N1, Machine Type f1-micro
Boot disk: Ubuntu 18.04 LTS Minimal with a 30 GB hard drive
Firewall: nothing selected

The rest of the settings I pretty much left as is. I did not use a static IP or any other special settings. Given that this is an f1 instance with standard settings, it is covered by the always-free tier of Google Cloud services.

*Note: I was using Ubuntu 18.04 as it had the required USB/IP library.

Configuring the Virtual Machine
The first thing I did with the virtual machine was create a swap file. This was done by executing the following commands.

sudo fallocate -l 1G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
sudo sed -i '$ a /swapfile   none    swap    sw    0   0' /etc/fstab
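
One caveat with the sed append: run the setup a second time and /etc/fstab ends up with a duplicate entry. A guarded variant only appends when the line is missing; the sketch below demonstrates the idea against a temp file standing in for /etc/fstab, so it runs without root:

```shell
# Idempotent fstab append: only add the swapfile entry if it isn't
# already present. A temp file stands in for /etc/fstab in this demo.
fstab=$(mktemp)

add_swap_entry() {
  grep -q '^/swapfile' "$1" || \
    printf '/swapfile   none    swap    sw    0   0\n' >> "$1"
}

add_swap_entry "$fstab"
add_swap_entry "$fstab"   # second run is a no-op

grep -c '^/swapfile' "$fstab"   # prints 1, not 2
```

In the real setup you would point the function at /etc/fstab (with sudo) instead of the temp file.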

With the swap file created, I added this virtual machine to my Tailscale network. Instructions can be found at tailscale download Ubuntu 18.04.

With Tailscale installed, I proceeded to install openHAB (not openHABian) using the instructions provided on the installation page. I decided to use Zulu Java 8 and installed it via the instructions provided by Zulu.

With everything installed and running as a service, I was able to access my openHAB installation via the Tailscale IP address that was assigned to my Google virtual machine.

I did try to do a backup of my openHABian configuration and restore it to my openhab-on-cloud setup, but that resulted in the system hanging. I decided not to debug it and just rebuilt my setup on openhab-on-cloud.

Configuring the Raspberry Pi and Google Cloud for USB/IP
I installed the USB utilities on the Google Virtual Machine using the following command.

sudo apt-get install usbutils

After that, I configured the USB/IP server (Raspberry Pi) and client (Google Cloud) using the instructions found at Setup a Raspberry Pi as a USB-over-IP server. The instructions are very straightforward and work with Ubuntu 18 (they will not work with Ubuntu 20). I was able to identify my Nortek USB Controller and set up a service that made it available. Using my Tailscale IP addresses, I was able to connect my Google Cloud virtual machine to my USB device. These instructions make the device look like it is attached to the virtual machine itself.
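
For readers who skip the linked article, the core usbip sequence looks roughly like this. The bus ID (1-1.3) and the Tailscale IP (100.64.0.10) are placeholders of my own, not values from the article; find yours with `usbip list`:

```
# On the Raspberry Pi (server) -- bus ID and IP are placeholders:
sudo modprobe usbip_host          # kernel module for exporting devices
sudo usbipd -D                    # start the usbip daemon
usbip list -l                     # note the bus ID of the Nortek stick
sudo usbip bind -b 1-1.3          # export that device

# On the Google Cloud VM (client):
sudo modprobe vhci-hcd            # virtual host controller module
usbip list -r 100.64.0.10         # list devices exported by the Pi
sudo usbip attach -r 100.64.0.10 -b 1-1.3
```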

The Google virtual machine did tell me that I had to install two packages along with linux-tools-generic to get everything working. The prompt was straightforward, with the names of the packages; I just forgot to write them down.

You will also need to make sure that the usbip daemon is running on the Raspberry Pi, or else you will see connection errors on the Google virtual machine.
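
To keep the daemon up across reboots, a systemd unit along these lines would work on the Pi. The unit name, binary paths, and bus ID below are my assumptions, not taken from the linked article; adjust them to your install:

```
# /etc/systemd/system/usbipd.service  (sketch; paths and bus ID are assumptions)
[Unit]
Description=usbip host daemon
After=network-online.target

[Service]
ExecStart=/usr/sbin/usbipd
# Rebind the controller shortly after the daemon starts (1-1.3 is an example)
ExecStartPost=/bin/sh -c 'sleep 2; /usr/sbin/usbip bind -b 1-1.3'
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now usbipd.service`.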

Rebuilding my openHAB configuration in the Cloud
With everything configured, I rebuilt my openHAB setup on my Google Cloud virtual machine. I started with the basic setup, like location and time, and then moved on to my Z-Wave and Zigbee bindings. Finding the right ports was straightforward but did require some trial and error. The other settings came directly from the Z-Wave and Zigbee configurations. Rebuilding the Z-Wave network was extremely easy, as that comes over from the USB stick itself. The Zigbee network had to be recreated from scratch, which was not a big deal.

Summary
I did have to rebuild the virtual machine a couple of times to get everything just right, but once I figured everything out I was able to get this configuration working. I did try breaking it a few times by powering the Raspberry Pi on and off and had no issues. I’m going to continue trying to break it to see what could go wrong.

Overall it wasn’t a bad rainy day project. I’ll try setting it up with the Raspberry Pi 0 W when it comes in. Let me know if you have any questions about this setup. I’m also interested in any comments.

Edit
I also installed hwdata on the cloud server. This prevented an error from being displayed when usbip was run.

Well, it was not as robust as it seemed. After a few hours I no longer had control of the local USB device. The virtual machine lost access to the USB server and was reporting that it could not connect to the Tailscale IP address. Looks like I will have something to look at over the holiday break.

Latency between the USB server (your RPi) and the client might be sufficient to destabilize the link over time. Under normal circumstances, serial port communication completes within a millisecond, while with network bridging it is subject to fluid changes. Depending on routing conditions on your ISP’s side (which you do not control), traffic might start going over a different path, resulting in increased delays. The end result might be a 20, 30, or 50 ms slip to ACK a basic serial port frame. It cannot work reliably in the long term. Depending on the kind of communication, with request/reply you have quite good chances of missing the ACK window from time to time.

I was doing development with remote radio devices using a forwarded serial port, and I had to restart socat every two hours or so. The keep-alive flags didn’t help much; it is not a reliable approach. On the bright side, you can still set up watchdog scripts to prevent the link from getting stale.
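
A watchdog along these lines is easy to sketch. The SERVER/BUSID values are placeholders and the `usbip port` output match is an assumption; the check takes the port listing as an argument so the decision logic itself is testable without hardware:

```shell
# Watchdog sketch for a usbip link (run it periodically, e.g. from cron).
# SERVER/BUSID are placeholders; the `usbip port` output format is an
# assumption -- adjust both to your actual setup.
SERVER="${SERVER:-100.64.0.10}"
BUSID="${BUSID:-1-1.3}"

check_link() {
  # $1: current `usbip port` output, passed in so the logic is testable
  if printf '%s\n' "$1" | grep -qF "$SERVER"; then
    echo "link up"
  else
    echo "reattaching"
    # The real script would run here: usbip attach -r "$SERVER" -b "$BUSID"
  fi
}
```

In a real deployment, a cron entry running every minute would call `check_link "$(usbip port)"` and re-attach whenever the import has dropped.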

Best,
Łukasz

Honest question: what’s the benefit of moving a smart home solution into the cloud? As mentioned in many ways here already, openHAB is aimed at and advertised as the Intranet of Things. Moving it into the cloud only makes you dependent on so many levels, and that’s not even counting the cloud plan you may have to pay for additionally.
So, I could live with “because I can” (as we all know, exactly this is our first off-the-hat excuse for many things we do), but is there another motivation behind it?

edit:
if it’s solely down to distrust of SD cards, there are plenty of ways to run openHAB elsewhere (e.g., as mentioned, on an external USB disk (magnetic, SSD, whatever), using a NAS (many run OH in Docker on a Synology), or even an old laptop/computer)…


@splatch it looks like it was a self-inflicted error. One of the changes I made post-checkout inhibited the usbipd service from starting on reboot, which prevented the client from accessing the server. With that issue fixed, I’m having no issues. I also modified the client service to reattach to the USB/IP device after 30 seconds, just in case it drops.

@binderth You are right, there are a number of different ways to address distrust of SD cards. I could have spent some money and bought a NAS or an external USB drive, or implemented any of the other solutions. The implementation I tried is definitely a Rube Goldberg way; it started off as “because I can”, and I had a free cloud resource that was not being used (I originally had it to host my own openHAB-cloud instance). All the tools were available and just needed to be integrated.

After the successful demo, I started to think of use cases and benefits for this type of setup. One thing that came immediately to mind was the ability to control multiple smart homes under one instance of openHAB. For just $10 plus dongle(s) and an internet connection, I can add a second or more smart houses to my single openHAB instance. This will allow me to see the temperature and other sensors, or control devices, from multiple homes on a single page (no need to know multiple IP addresses or reconfigure the openHAB phone client every time I want to see another home).

Now, this could also have been done with a server at home, but that solution may not have the same availability as the cloud. Having access to the always-free Google Cloud services made me look at that as a viable option. If I had to pay for computing power and network access, I might have looked at the home-based solutions, but free is good enough for me. I know that what giveth can be taketh away; if the cost curve inverts, I will look at home solutions again. Either way, it was an interesting rainy day project that polished my Linux skills.

I do just that with multiple openHAB instances, using MQTT for exchanging states and commands. You could use a “free-of-charge” cloud MQTT service or (as I do) VPN all the remote instances and use your internal MQTT server. I basically have multiple sitemaps for different purposes and one “main” openHAB instance, which handles the central logic (aka rules). If I need to trigger something on a remote instance, I send a command via the remote binding. Of course, I am reliant on an internet connection, but as of now even the LTE-based instance in a remote mountain cottage is as stable as it can get.

I do have a connection to openHAB Cloud and it works just fine… :wink:

I wouldn’t be sooo sure. I started with 1 TB of free Google Drive 10 years ago, and now it’s down to 100 GB. Google/Alphabet is not known for stable plans and kills even paid ones (ask Revolv, Nest, …).

That I trust it was! :wink:

How was the response time on the Z-Wave and Zigbee stuff?

Now, am I mistaken, or could the solution considered in this thread be rebuilt remotely? I understand that if the remote Pi Zero with the dongle goes belly up, on-site hardware replacement would be needed, but if the main openHAB server in the cloud goes sideways, repairs/rebuilds could be performed remotely, right?

@Andrew_Rowe The response time was just a hair slower than running everything locally. The best way for me to describe it is that it was not as slow as using Wink, but just not as instant as the local installation. The additional delay is probably on the order of 100 ms. I think the odds of openHAB going sideways on the server are very small. I’m still at the mercy of the local Pi, which is the single point of failure that can’t be rebuilt remotely. Given that the openHAB server is running on Google’s cloud, it’s pretty easy to access it and change things remotely.

:open_mouth: awesome
my Z-Wave can be kind of laggy at times, which is the reason I was asking

cool idea and experiment for sure, even if it doesn’t fit everyone’s idea of the perfect solution… thanks for sharing and for the step-by-step instructions :+1:

@Andrew_Rowe I’m using the Nortek HUSBZB-1 and have never had an issue with lag or response times.