Best hardware platform

Which is what the UPS is for.

Agreed, USB is no better.

And yet we all seem to experience the problem at some point or other. And often it is easy to tell the difference. For example, a file that reverts to the same older version on every restart of the OS is not caused by the corruption of a single block.

And stopping all upgrades and updates, dealing with the writes to fake-hwclock, etc. Agreed, you can run as read only but you have to disable a lot of other important features like OS and OH upgrades. If not, then you have only reduced the number of writes, not eliminated them. But if you reduce enough of them then it’s no big deal, as you have extended the life of the SD card beyond what anyone should need.

And do you have a foolproof way to help people identify such a counterfeit?

From a practical perspective on the forum, it is exceptionally difficult if not impossible to help or guide a user through the process of diagnosing the difference between corruption caused by a worn-out card and corruption caused by power loss. So we have two choices. We can risk wasting time and have them rebuild on the old SD card, or we can have them go buy a new one and stress the need for backup and restore procedures and a move to a network mount, SSD, or HDD.

Sorry, I’m not convinced. Guess I’ll just have to keep spreading my misinformation.

My apologies, that was not directed at you. I got to ranting about what people in general write and should not have done so after hitting reply on your post.

It seems like there is a need for hardware targeted specifically for OpenHAB. I have a Wink Hub 2 which has Zigbee, Zwave, BT and Wifi radios, and yet it is largely useless when it comes to bindings for my stuff, with no way to improve it. If I made a Wink Hub 2-like Linux board with all the radios, would people be interested? My current thinking:

  • Uses multiple Ember EFR32 chips which can be configured for Zigbee, Zwave or BTLE
  • Uses Octavo OSD335x which is basically a BeagleBone Black in SoC form.
  • Would still have USB, GPIOs, I2C, etc…all the stuff a BeagleBone Black has.
  • Would have built-in flash plus SD card…but configured in a way that the OS doesn’t get corrupted. I’ve had this happen on some of my RPi/BBB projects as well.
  • This thing would be small and cute. Probably about the size of a hockey puck would do it.
  • It would not be RPi cheap unfortunately…scale/demand can’t compare to RPi demand and thus economics. The Octavo chip itself is $40; with the PCB, case, three Ember radios, and a WiFi one, I might be able to target $150.

You can’t even get RPi processor chips, so RPi-based is a bust unless I just make a Hat. Would people prefer just a Hat with the radios? That would likely be cheaper, but doesn’t solve the corruption issue.

I am new to HA in general but this is truly the Inter-mess of Things. The industry is still too fragmented but OpenHAB is the first thing I’ve seen that seems to bridge the divides. I just got some NXP-based chips working with custom firmware based on their SDK and am going to try out the Ember ones next, as their chips can become any mesh-net standard - zigbee, zwave, btle, etc.

Someone has already done this, at least they did it using an RPi compute module as the “computer” part. I can’t find the original postings right now. I’m not sure how much it cost though.

One of the challenges is that zwave is proprietary and needs to be licensed by device manufacturers, which drives up cost quite a bit.

On the other hand, building something like this is exactly why the core of OH was split out into a separate project under Eclipse: so that companies get assurances that the IP has been properly transferred and they face no downstream risk of IP violations. So what you propose is right in line with where we want to go.

This would be very hard to do in a turnkey system like you propose without giving up persistence and without preserving logs across reboots.

Dealing with corruption caused by power failures will be challenging as well, see Matt’s posting above.

That’s an idea. It’s akin to an approach that a lot of people take (e.g. OpenSprinkler): you can sell it BYOP or with the RPi included. Lots of people already have RPis lying around.

One challenge will be managing interference. I do know lots of people have issues trying to enable BT and the Razberry zwave hat at the same time. I’ve had problems running the BT in scanning mode with the wifi on RPi0w and RPi3.

Another form factor that would be even more useful to me and those who don’t run on an SBC would be a standalone device that I can access over the network with all of those radios. Then I can put it in the ideal location and host my OH in the basement, for example.

Making something like that turnkey might be a bit of work, but I bet enabling some discovery over the network and scripts to set up socat would be doable.
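
For anyone curious what that would look like, here is a minimal socat sketch, assuming the radio shows up as a serial device on the remote box (the device path, baud rate, TCP port, and hostname are all illustrative, not a tested setup):

```bash
# On the radio device (assumed serial port /dev/ttyUSB0): expose it on TCP 3333
socat TCP-LISTEN:3333,reuseaddr,fork /dev/ttyUSB0,raw,b115200,echo=0

# On the OH server: create a local virtual serial port tunneled to the radio
# box (hostname "radio-puck.local" is an assumption), then point the binding
# at /dev/ttyNET0
socat PTY,link=/dev/ttyNET0,raw TCP:radio-puck.local:3333
```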

Thanks Rich. I found the other thread with the RPi Compute3. I forgot about the RPi compute modules, and they are actually cheaper than the Octavo SoC and more powerful.

I was hoping to keep the IP/proprietary stuff in the Ember chips so there are no violations. Really I would like to just make the Embers like “wired-in USB devices” without USB. I.e. the dongles seem to be Ember+FTDI devices, so I can just skip the FTDI device. I’m not certain of this yet. Also, perhaps being integrated means it then violates Zwave licensing, I don’t know yet. How mature is the Ember coordinator? I hear it works great with zwave but there are lots of issues with zigbee.

I believe there is a way to prevent corruption if the OS was prepared for this hardware. Perhaps partitioned into read-only root/OS, config, and data parts. The config part can be configured for immediate sync/flush, and its once-in-a-blue-moon writes would rarely coincide with power loss. The data part would be the only risk, mostly logs and state data. This could corrupt but should be easily recoverable, or just wiped if you don’t care about history. The config part could be backed up to the cloud. It also helps to pare down the Linux OS to minimal services. I’ve had these same issues on my RPi-converted treadmill, which I can’t wait to control via HAB+Alexa: “Alexa, double time!” haha
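
To make the idea concrete, a minimal sketch of what such an fstab could look like (device names, mount points, and options are assumptions, not a tested layout):

```bash
# /etc/fstab -- illustrative only
/dev/mmcblk0p2  /              ext4   ro,noatime                 0 1  # OS, read-only
/dev/mmcblk0p3  /etc/openhab2  ext4   rw,sync,noatime            0 2  # config, synchronous writes
/dev/mmcblk0p4  /var           ext4   rw,noatime                 0 2  # logs + state, expendable
tmpfs           /tmp           tmpfs  defaults,noatime,size=64m  0 0  # scratch in RAM
```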

I was thinking this puck would be standalone, as you describe. I would still include built-in Ethernet too, but I tend to favor WiFi for most things.

I am now thinking a Hat or a Compute board is the ticket. Maybe I will just wait for Sergio to come out with his board, though I am not seeing mesh radios on there.

I think this thread concentrates all too much on hardware failure (especially regarding SD/writing to disk). And maybe this is correct. What worries me is that this is a rather general matter on all platforms and all kinds of storage systems. And there is really only one solution - Backup.
You cannot have a 100% fail-safe system, and especially not a system like OpenHab running on an RPi from an SD card. So while it might be a good idea to use an external HD for your system, you still need to back up the system.

Now… I doubt many disagree with that. Well, what really makes me wonder - where is the backup option in OpenHab? Heck, we’re building automation systems here, and yet, for backup, there doesn’t seem to be any good and easy automatic solution… (easy - it has to be easy for people to use it).

Why??

I have noticed in openhabian-config there is something called Amanda backup. To be honest, if that’s what it takes to do a backup, I’d rather take my chances, make some manual copies from an SSH client once in a while, or otherwise start all over… This is NOT a motivating backup solution for anyone other than people who already know everything about Linux. I’m not one of those! Until I started off with Openhab several months ago, I knew nothing about Linux. Today I know some, but thinking about it, the first thing to know should have been an easy backup solution for “dummies” like me!

I wonder why there isn’t an option inside PaperUI to configure and run some kind of backup system. A simple script which adds all important files into a zip file to download (see the sketch below). Maybe even an advanced option to do it automatically and save to a network share path, cloud, whatever. And a super advanced option, running it in a cron job.
My bet is this would motivate users to do more backups.
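
The simple version of that script could be as small as this sketch (paths assume a default apt install of OH 2.x, and the destination mount point is made up; adjust to your system):

```bash
#!/bin/bash
# Minimal "zip the important files" backup -- paths and destination are
# assumptions for a default apt install; cache and tmp are excluded.
STAMP=$(date +%Y%m%d-%H%M)
DEST=/mnt/backup   # assumed USB drive or network share mount
zip -r "$DEST/openhab-$STAMP.zip" /etc/openhab2 /var/lib/openhab2 \
    -x "var/lib/openhab2/cache/*" "var/lib/openhab2/tmp/*"

# The "super advanced option" is then a single cron line, e.g.:
#   0 3 * * * /usr/local/bin/oh-backup.sh
```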

But backing up Openhab may be only half of what needs to be done. Reading about the Amanda backup, it also mentioned something like z-wave networking. I admit, I never thought about this until I read it. But one actually has to make a backup of the z-wave network as well in case the controller fails. Well, now things get a bit complicated. And then I wonder again - I have the z-wave binding installed. Why isn’t there an option to backup the z-wave network from the binding as well? This leaves me to search for and install some 3rd-party utility to do this backup… But why? Is it a limitation of the RPi my z-wave controller is connected to? I doubt that, to be honest. My Windows PC can do it. But my controller is connected to my RPi. So it makes no sense to remove the controller from the RPi, place it in my PC to do the backup, and then put it back in my RPi. It does not make the overall backup procedure an easy thing to remember. And it most probably ends up not being done.

My guess is, if the first thought of developers of systems (software in general) was “What to do when things go wrong” (again note, it’s not an “if” question), things like bad SD cards, hardware failures etc. would be short-term discussions in general. (And I would have saved myself lots of time writing this :slight_smile:)

Well, just some thoughts I have been having lately.


I’m no expert nor a lawyer, but my understanding is that if you sell a hardware device that claims to be zwave compatible, it has to be certified (i.e. go through testing) and you have to pay a license. Those are the terms required to use the zwave trademark and name. I don’t think using the Ember chips will get you past that hurdle unless Ember has already paid the dues and done the testing for you, in which case that cost is already included in the cost of the Embers.

I’ve never heard of it so I can’t say.

See Matt’s posting above. If they are on the same SD card then a write to the data partition can corrupt the read-only partition on a power failure because of the way the SD technology works. Using a separate partition won’t avoid this. Only using a separate physical SD card will.

And honestly, if it were me I’d be more concerned about corruption on the data partition than the OH partition. I can rebuild the OS, but if I lose my database that data is gone for good. And if you are looking for a turnkey system, you must find a way to prevent corruption of the database. You can’t just throw it out on every reboot.

Have a look at the Docker Images.

However, be careful paring down too far. One of the more powerful workarounds for integrating a lot of technologies that do not have a binding is through the Exec binding or executeCommandLine. If you pare down the OS too far you render both of these useless. This is actually a pretty big problem with the Docker containers and is a reason why I have a small python service running that I can send an MQTT command to execute command line stuff for me (I run in Docker). The commands I need don’t exist in the container. Heck, the Network binding is neutered in Docker because there is no ping or arping command available. Heaven help you if you want to use some Python script you found that implements the protocol for your alarm system or something like that.
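
A sketch of that kind of relay, done here in shell with mosquitto_sub instead of Python (the topic name is an assumption, and obviously you should only run something like this against a locked-down broker since it executes whatever arrives):

```bash
#!/bin/bash
# Host-side "exec relay": subscribe to a command topic and run each
# message as a shell command on the host, where ping/arping etc. exist.
# SECURITY: anyone who can publish to this topic can run commands --
# restrict the broker with auth/ACLs before using anything like this.
mosquitto_sub -h localhost -t 'host/exec' | while read -r cmd; do
    echo "running: $cmd"
    bash -c "$cmd"
done
```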

I’m intrigued. Bet it would make a great write-up.

You will see Markus and I push this very strongly all over the forum. If you have a good backup AND tested restore procedure then it doesn’t really matter when/if you have an SD card failure. And if you have a good backup and restore procedure, then that really is the solution and you don’t really need to spend any time on reducing writes and the like.

However, when an SD card wears out, you still have the problem that it does so silently. There is no SMART for SD cards to tell you when it is about to fail. Weird stuff just starts to happen. And once you finally notice it you don’t know how long it has been wearing out and you don’t really know the extent of the corruption. Consequently, you don’t know how far you have to go back to get to a good backup.

openHABian has Amanda backup and recovery built in. Since OH 2.2, OH ships with backup and restore scripts in the bin directory. A lot of us use software configuration management tools like git to back up and restore our configurations. Even more people take images of their SD cards using Win32DiskImager or Etcher or whatever.
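
For reference, running those shipped scripts looks roughly like this; the paths are an assumption for an apt-based Linux install (manual installs keep them under the OH install directory instead):

```bash
# Assumed paths for an apt install of OH 2.x
sudo /usr/share/openhab2/runtime/bin/backup
sudo /usr/share/openhab2/runtime/bin/restore /path/to/openhab2-backup.zip
```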

The JSONDB, where all the configurations done through PaperUI or the REST API are stored, gets backed up automatically.

The disk image approach is by far the simplest for non-technical users. Just shutdown the RPi, pull the SD card, make the image on your Windows machine, plug it back in and reboot.
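
For those doing the same thing from a Linux machine rather than Windows, the equivalent is a dd one-liner (the device name below is an assumption; check with lsblk first, because getting it wrong can overwrite the wrong disk):

```bash
# Find the SD card device first -- /dev/sdb here is only an example!
lsblk
# Image the whole card, compressed
sudo dd if=/dev/sdb bs=4M status=progress | gzip > oh-sd-$(date +%F).img.gz
# Restore by reversing the pipe
gunzip -c oh-sd-2018-06-01.img.gz | sudo dd of=/dev/sdb bs=4M
```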

The problem is: backup to what? How? Everyone’s system is different. They all have different configurations.

Because OH runs on just about everything, and the nature and method of the backup would be different for each and every OS that OH can run on. That isn’t to say that it is impossible, but it isn’t easy and no one has taken it upon themselves to implement something. The fact that Benjy has implemented the backup and restore scripts that ship with OH to make them work across all OSs is honestly pretty awesome already.

And then you have the problem of what to back up. A simple script like you mention, and what Benjy wrote, pretty much already does that. However, what about your Mosquitto configuration? What about the contents of your InfluxDB where all that data you have been saving for the past year resides? There is no way that stuff can be backed up from PaperUI because all of that stuff is third party and it may or may not even be there. That is why openHABian uses Amanda, as that backs up the whole system.

This is another thorny problem. Not everyone uses zwave and not every zwave controller supports being backed up and restored.

Because it hasn’t been implemented. OH is an open source project. People donate their time to work on those things they want to work on. There is no way to force anyone to work on anything in particular.

Usually that 3rd-party tool comes from the manufacturer. I can’t say for certain, but I wouldn’t be surprised if the method and API calls necessary to do a backup and restore of the zwave controller vary from vendor to vendor and even potentially from model to model. I don’t think there is a standard way to do it. I know for sure the Aeotec Gen 2 doesn’t even support it. That’s one of the reasons I recently switched off of it.

So this is just one binding. Multiply that by 300+ bindings and it really starts to become an untenable problem. Perhaps someday it will be addressed, but to do so will take extensive additions to the core of Eclipse SmartHome.

So your best bet is to backup the whole damn system which means:

  • cloning the SD card like described above
  • running a tool like Amanda

If all you care about is OH, then you just need to grab /etc/openhab2 and /var/lib/openhab2, which is what Benjy’s scripts that ship with OH do.

It sounds so simple. But how simple does it sound to produce a backup and restore system that:

  • works on Windows, openHABian, other Linux, QNAP, Synology, and Docker
  • backs up not just OH stuff but also the configuration and data stored on external hardware (e.g. the zwave controller)
  • backs up the configuration and data stored by third-party services, including one or more of five separate database servers, not counting the near infinite number of database servers that support JPA or JDBC (JPA and JDBC are standard interfaces implemented by databases, so OH doesn’t even know what the database is if using these)
  • backs up other externalities like that one Python script you found on GitHub that lets you talk to your alarm system using the Exec binding
  • backs up the operating system itself
  • Oh, and that hardware or those external services that you are trying to back up may not even be running on the same machine as OH

It’s a pretty challenging problem.


I have been testing the Telegea Smart Hub for some time now, and while it was a bit of a learning curve to get openHABian loaded onto the RPi Compute Module, once I did it’s been pretty solid.

It’s funny though, because I’m now comfortable with imaging the SD card in my main Raspberry Pi regularly as a backup, and now I’m nervous as to how I’d back up the compute board. I’m sure there is a way though.

But the Smart Hub does have plenty of functionality, so do take a look at it! I was lucky enough to get a review unit to test out, so I’m not sure of the final cost though.

Just my two cents.

I’ve got OH 2.2 running on Ubuntu in a virtual machine together with a couple of others (plex server, openvpn…)
It’s been running great for over a year now. The hardware is a small barebone PC with an i3, I think, so it consumes minimal power. Z-Wave USB passthrough is also stable.

Had some issues installing ESXi as the network adapter was incompatible, but with a custom distro it works really well.

My path looks like this:

  1. Single Windows 7 OS
  2. Single Ubuntu OS
  3. Raspberry Pi2, SD card failed
  4. Raspberry Pi3, SD card failed
  5. ESXi with Ubuntu 16.04 LTS as a VM with a Z-Wave Aeon Gen5 stick.

I have an off-site server with Veeam and a site-to-site VPN tunnel, so it is backing up all VMs twice a day; in case I mess it up I just revert to the backup snapshot.

I also experimented with iSCSI and a datastore on my Synology NAS, but as I only have a 1Gb/s switch, I decided that a local SSD will run the VMs much faster.

Check out raspiBackup as an alternative :kissing_heart:

This is not said to criticize the developers. It’s mainly said to put more focus on this matter. It’s a general comment regarding all kinds of software and computer systems, whether you’re a developer or a user.

When you look at OpenHab in perspective, you’ll notice how much development has been done to make it easy and simple to get started. In short: download the image, copy it to the SD card, insert the SD card, and you’re good to go. Simple, easy, anyone can do this.
But when it comes to backing up the system, you’re facing a major challenge, just as you mentioned.

I took on the challenge with Amanda the other day and sat down and read about it (because my OpenHab is acting up lately). I only got halfway through the docs before I started to think, “It will take me forever to understand this and get it to work in a proper way”. Two seconds after, I found myself searching for new features/bindings in Openhab. I simply gave up on Amanda.
I can’t help wondering: is it me being too stupid to understand things like this, or is it the task itself which is too challenging? Maybe it’s a combination of both. I’m challenged even from not knowing anything about Linux. But does that disqualify me? I knew nothing about Linux/OpenHab until a few months ago. Yet I managed to get “something” to work.

In an ideal world, a backup procedure should be as simple as the procedure to get started with the software itself. Meaning, if I can get the system up and running, I should be able to back up the system just as simply.

I know it sounds easy, and I know in real life it isn’t. But if we keep on saying it’s not easy, it will never get easier. This is why I ask the developers to put some more focus on this matter, just as I ask the users (myself included) to spend some more time on it. Both parties need to face this challenge.

Actually, it IS that easy.
To install openhabian, you download an image to your PC and write it to your SD-card.
To back up, you put your SD-card in your PC and make a copy of the image.

Both can be made with, for example, win32diskimager. That’s how I make my full backups, plus I make regular backups of openhab’s conf folders and the openhabian user’s home folder where I have all my custom scripts.

And there is a backup approach that is just that simple. You had to write an image to an SD card to get started, right? You can use the same tool to make a backup copy of that SD card. The drawback is you have to take the system down to do it.

Anything else is going to be more difficult.

But this is a procedure which requires repeating a manual operation over and over again, unlike installing openhab (software/OS/whatever) which is a one-time operation.

The procedure itself of making an image of the SD card is easy. But the repeated manual operation is, in my opinion, the reason why many users won’t get it done in time.

So, we know at least one good and easy way to do a backup. Now turn the focus to handling the challenge of the “repeated manual operation” part of it, and making it an easy, simple and automatic operation.

I’m not saying it’s an easy challenge. But it should be the way to think when you are developing something in a data and computer environment, just as it should be a feature users are focusing on.

See my reply to pacive.
I do not agree that taking down the system is a drawback. The challenge is the repeated manual operation.
But it depends on how big the system is and how long the backup procedure will take, of course. A huge system with several TB of data would require far too much downtime to do such a backup. A small system, like Openhabian (OpenHab), should be possible to back up in just a few minutes.

Difficult, yes. Impossible, no.

Any other procedure is going to require you to set up a place to back up to, since you must back up everything on the drive to catch everything. And that is the hard part of setting up Amanda. Once set up, Amanda takes a single command to run, or can be set up to run automatically.

In short, automatic backups do not really get any simpler than Amanda, because they all will require you to configure a place to back up to.
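
For what it’s worth, once the configuration exists, “run automatically” really is one cron line; the config name below is an assumption, so substitute whatever name your setup created:

```bash
# In the backup user's crontab: nightly Amanda dump at 01:00.
# "openhab-dir" is an assumed config name -- use your own.
0 1 * * * /usr/sbin/amdump openhab-dir
```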

Well, Amanda is difficult for some but not impossible.

But I will argue that any backup system that:

  • works across operating systems
  • supports restore (without which there is no point in backing up)
  • captures everything from the OS on up
  • doesn’t require an external system to back up to
  • doesn’t require any configuration by the user

is indeed impossible.

Openhab/openhabian itself requires something.
It’s not the requirements themselves which are the major hassle. If the RPi had a second SD card reader, it would be obvious. The RPi doesn’t, so move on to the next option: USB disk, NAS, network path etc…

When moving on to possible requirements, make sure the procedure to include these requirements is as easy and simple as installing the software (openhabian/openhab). Meaning, I should not have to study Linux for decades to include these requirements in order to have a simple, easy and effective backup procedure. That breaks the idea of being hassle-free.

This leads to your next sentence.

Exactly…
So if we can agree that setting up Amanda (or Amanda’s requirements) is the hardest part, then development of Amanda should focus on making that part easier. I’m not asking for a procedure a monkey can deal with, but somewhere in between.

This leads me back to my first suggestion: PaperUI (or in fact some similar UI, like openhabian-config, or perhaps an amanda-config) for setting up and running Amanda and its requirements. In PaperUI there could be an option to make this an automatic cron job by rules or whatever, perhaps even setting up Amanda as well.

I know this sounds simple. But I fail to understand why this would be close to impossible, no matter what environments we’re speaking of. Of course the user has to get the necessary hardware requirements (backup storage/cloud service/whatever), connect it to the RPi (or whatever hardware is being used) and then configure Amanda. Anything else is in the control of the developer to make possible.

Amanda (the backup procedure) will end up creating an image backup file, which the user can start over with using the same procedure as the first starting point: writing the image file to an SD card and inserting the SD card into the RPi. And the system is up and running from the date the image backup file was created.

(This is a short story. I know there will be more questions to answer going through the setup. But I hope you get the point.)

Now, where is the problem?

I know the above procedure will not suit every one of us. Some have even less knowledge. But in principle, it’s a matter of making this as hassle-free as possible, regardless of the user’s knowledge.

And OH would have to support backup to ANY of those options. And most work differently and have lots of options in how they can work. Just setting up and maintaining an automated system to work with all of these potential external devices is a project almost as big as OH itself.

So let’s assume we leave that up to the user to provide a path to a folder that represents some external drive or network shared drive. Now we are back in Amanda territory that apparently you “have to study linux for decades” to figure out how to mount a USB or network file system. We’ve solved exactly nothing.

I’m honestly not sure what could be done to make the setup of backup and restore on openHABian any easier. Assuming a USB drive for backup:

  1. run openhabian-config
  2. Choose 50 Backup and Restore
  3. Choose install and configure
  4. Answer the questions asked (password for backups, etc)
  5. Say OK to back up to Locally attached or NAS storage
  6. Provide the path to the external drive. Apparently the hard step
  7. Answer the rest of the questions

If your main complaint is that it is hard to ssh in to the RPi to do the initial setup then perhaps you should be running on Windows or something more familiar. If the main complaint is step 6, well ALL backup systems will require this of you. If projects that are nearly as big as the OH project do not automate this for you what chance is there that OH would have the people with the skills and willingness to implement it?

From an OH perspective the problem is that Amanda only runs on certain Linux distros and Windows. That leaves out Mac and the NAS systems (QNAP, Synology) which means adding it to PaperUI would make at least that part of OH incompatible with those platforms. Based on the standards of contributing to the project as I understand them, that means it would be not allowed. A solution that works across them all would have to be provided.

So now the problem has become bigger and we can no longer rely on Amanda. In fact, I know of no backup system that supports all the platforms that OH is supported on so we have to figure out multiple backup and restore solutions. Then we have to somehow automate all the dozens of different ways that someone may choose to mount an external file system to backup to on each of those platforms.

Then we have the problem that some of the platforms like Windows don’t allow complete backup of the system without making very deep hooks into the OS to prevent certain things from being written to while the system is running. Now we have a huge amount of code to write that is specific to only one platform. Oh, and that code can’t be written in Java so we have to either have a subsystem that exists outside of OH for Windows backup or we need to rewrite all of OH in another programming language.

Because of this line of reasoning:

  1. No changes will be allowed to the OH UIs that are platform specific
  2. Your home automation system consists of more than OH
  3. OH cannot know exactly what other services you are using in your home automation system
  4. Therefore to backup your full home automation you must backup the full drive that OH is running off of, operating system and all (NOTE: this still won’t catch everything as many if not most OH users run other services on other hardware separate from the OH server).
  5. Every platform works differently so almost every platform will require its own backup system. At a minimum Linux (Amanda), Mac, and Windows (Amanda won’t back up the full drive, just selected files).
  6. Certain platforms like Windows will require code written in some other programming language, I’ve no idea what would work for something like QNAP or Synology though perhaps those won’t matter since they should be running on RAID anyway.

Once you get to 5 we are looking at at least three different subsystems, at least one of which will be as large as OH itself in terms of the amount of code required, to solve a problem for which third-party software, both FOSS and commercial, already exists.

Cool, that is mostly how Amanda on openHABian works. But what about OH on a Mac, Windows, etc.? There is no SD card to copy, and the drive is likely too large to back up that way. What about those who have a NAS and want to back up to that?

This solves the problem nice and easy for your specific configuration and deployment. Yours is not the only way to configure and deploy OH.


Guys, just some quick input on this.
It’s obviously out of scope for openHAB(ian) and therefore not mentioned in the openHABian Amanda Readme, but you can run Amanda on MacOS and almost any UNIX (including Synology and QNAP NAS).
And there’s even a Windows client available too, so as long as you have a UNIX backup host to run Amanda on, you can even use this to back up your Windows openHAB or even a Windows-based gateway or desktop PC.

@rlkoshak: any Amanda setup can back up (and restore) both directories and complete raw devices (such as SD cards or HDDs). It’s even the openHABian default to do both in parallel, and the Readme recommends using this to create an SD clone as your first action after you run your first backup.

@Kim_Andersen, I honestly don’t understand what you feel to be difficult about setting up openHABian Amanda. Rich quoted the steps. It’s as simple as it can get given the range of HW and functionality we want it to cover.
Yes, you need to read a single somewhat lengthy doc, but doing so does not seem to be a problem given the time you have already spent on this thread alone. Yes, it’s easier for UNIX people, but quite a number of Windows people have successfully made it and I’ve spent quite some effort to incorporate their findings into the Readme.
If you have questions or problems, put them up in this Amanda thread and I’ll try to give an answer.
If you feel the Readme is missing anything essential or is incomprehensible at any point, let me know where exactly and why.

I was under the false impression that Amanda could only back up files, not the running OS drive, in Windows. It’s good to know it can do that.