How to: The method I use to back up my Pi- and C2-based openHAB

There are a number of ways to back up an SD card, including Amanda, which is popular. I’m sure all of them are good, but I thought I would share my backup system: not just the software I use, but also the method and the reasons behind it, in case you find it useful. I recently found a program called Paragon Hard Disk Manager (PHDM) which is free for home use. There is also an open-source program called Clonezilla:

https://clonezilla.org/

Edit: It looks like Paragon has split the free features into a separate program, called Paragon Backup and Recovery Free, which is currently found here.

The Paragon program is great for two reasons.

  1. You can restore an image created from a 64 GB card back to a smaller 8 GB SD card just by ticking a resize box in the program. There is no need to use the Linux program “parted” or “gparted” to resize partitions. More on this topic below…
  2. It creates images that save space when most of your SD card is empty. Unlike Win32DiskImager, which creates very large images, PHDM creates a 2 GB image for my openHAB backup even when taking a full image of a 128 GB card. Larger cards also tend to have faster read and write specs.

Clonezilla appears to only do the second of these, so it would require a partition resize before the backup is made to achieve the first.

I wanted to explain why I chose this method, and cover some topics that not all users will be familiar with, since this particular method/system is different.

Some things to consider about your backup method are:

  • Have a hardware backup as well as a software backup: What happens if your openHAB hardware gets stolen or physically breaks? Are your backup files stored on that same device? I like cheap Raspberry Pis and SD cards because I want to always keep a second one around, set up and ready to go: a full hardware backup, not just a software backup. Downtime should be fixable by any family member in under a minute plus booting time.
  • Never back up a system that has been running a long time: Bit rot, soft errors (and many other terms you can google for more info to help you sleep) all work to destroy data over time, as computers do make mistakes which then accumulate. Sure, you can buy ECC RAM and use methods like parity hard drives to error check, creating a ‘better’ server to run openHAB on, but then the cost of having a HARDWARE backup goes up. The point I am trying to make here is that I do not like to make backups of a system that has been running for any length of time. What is the point of backing up if the files already contain corruption?
  • Use a UPS: Dirty shutdowns are when you interrupt the power while the system is running. Never pull the USB plug unless the server has been shut down correctly, and protect your system from this with a UPS. Don’t forget what happens when the UPS runs out of battery and then the power comes back on and fails in and out. This kind of thing can quickly destroy data on flash-based systems: during boot-up the SD card undergoes a lot of WRITES, and you never want the power failing while Linux is booting, as the chance of corruption is very high at that time.
  • Learn 1 system that works for all cases: If you use multiple Raspberry Pis or Odroid clones for multiple Linux projects, this method will work for them all, without the need to re-learn how to use Amanda for openHAB and FooBackup for project X. One method works for all, and the system is very simple.
  • Automatic daily backups should not be relied upon: They can be used in addition to this method to capture small daily config changes, but if they are corrupt and you never use them, you will not find out until the moment you actually need the backup, and then you will get a big shock.

Now for the details of the backup system

I keep 2 to 3 SD cards which are used for:

  1. The MASTER card, which is set up and then backed up once it is working. This card lives in the backup hardware Pi ready to go, but not running, to prevent corruption from accumulating.
  2. The BACKUP card, which is what I use to run my system and which proves the backup file is working.
  3. An optional TEST card. If I need to experiment or trial a new upgrade (newer is not always better), I put the BACKUP card aside and work with the TEST card while I am home; then when I go to work the BACKUP card goes back in until I find some time to continue with the TEST card. Often I find I skip the TEST card, as I can easily make cards from my backup files at any time.

When I need to make a change to my setup, I load the MASTER card, make the change (which I have already tested), and then create a new BACKUP file and card, which is then used. The cycle keeps following this process, and the MASTER card is never used or left running. Never throw old backup files away.

How to create the Image file:

  • Download, install and register Paragon Hard Disk Manager. It is free for home use if you register.
  • Insert your SD card into a USB reader on a Windows-based machine with the above software installed.
  • Follow the wizard step by step to create an image file of the entire SD card.
  • The backup file of my large cards takes up only 1-2 GB on my hard drive, because this software does not store every bit of empty space and compression is used. (A rough command-line alternative is sketched below.)
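If you are on Linux rather than Windows, a rough command-line equivalent is to pipe dd through gzip. This is only a sketch, not the Paragon method: the device name /dev/sdX and the output file name are assumptions, so check your card’s device with lsblk first.

# Read the whole card and compress on the fly; unused space compresses well if it is zeroed/untouched.
sudo dd if=/dev/sdX bs=4M status=progress | gzip > openhab-master.img.gz

# Restore later to a card of the same size or larger:
gunzip -c openhab-master.img.gz | sudo dd of=/dev/sdX bs=4M status=progress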

How to load/write the Image file onto a smaller SD card:

  • Use Paragon Hard Disk Manager as above, but follow the wizard to restore the image.
  • Tick two boxes: one to resize and the other to change the disk’s unique ID. These are at the top right corner of one of the restore steps.
  • If an attempt fails to work, the key is to restore the BOOT (first) partition first, and then restore the second (larger) partition afterwards, running the wizard twice to restore one partition at a time rather than the whole disk. Doing it in two steps prevents the BOOT partition from being resized, and the card should then work. If you don’t use PHDM, you can resize the larger partition before taking the image, but I recommend taking an image first before playing with the Linux programs ‘parted’/‘gparted’ to do that (a sketch of the shrink commands follows below).
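For reference, here is a minimal sketch of shrinking the root partition from the command line before imaging (gparted does the same thing graphically). The device name /dev/sdX, the partition number and the sizes are assumptions; the filesystem must be unmounted, and when shrinking, the filesystem is resized before the partition.

# Check the filesystem, shrink it, then shrink the partition that contains it.
sudo e2fsck -f /dev/sdX2
sudo resize2fs /dev/sdX2 6G              # shrink the ext4 filesystem to 6 GB
sudo parted /dev/sdX resizepart 2 7GiB   # move the partition end to 7 GiB, safely past the filesystem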

Why does a restored image not work on an identical SD card with other programs, yet PHDM works fine?

When SD cards are made in the factory (SSDs and other flash-based storage are the same), the newly made cards go through quality control testing, and any bad sectors that fail these tests are marked. They are marked in a way that stops the failed areas from being used, and the cards are still sold, since only a small area of the flash has a minor issue. Because of this you can, and probably will, have two cards that are meant to be the same, i.e. same brand, size and model, yet software will tell you they are not the same size.
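You can see this for yourself on Linux by comparing the exact byte counts of two supposedly identical cards (the device names here are just examples):

# Print the exact size in bytes; two cards of the same model often differ slightly.
sudo blockdev --getsize64 /dev/sda
sudo blockdev --getsize64 /dev/sdb

# Or list all attached drives with exact sizes in bytes:
lsblk -b -o NAME,SIZE,MODEL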

10 Likes

Thanks for sharing. I am happy that you found a solution that works for you! I am also happy you shared your knowledge.

I am going to look up the Paragon software; I had never heard of it. Sounds like a good tool to back up my wife’s computer.

I like a full backup solution. I am a big proponent of backing up the full disk vs just data files. I have found that if you only back up data files, there comes a day when you need some obscure operating system file for something and don’t have it. I also like that you use the backup card as your primary, as this proves the backup is good, as you stated!

Yes thanks for sharing. Always good to hear about working solutions.

Reminding you and readers, though, that taking a backup that way will require you to take down your openHAB server.

You might not be aware, but Amanda can back up and restore partitions as well (in addition to doing selective directory/file-level backups), and it can do that while your server and openHAB remain alive. Another advantage is its built-in versioning.
I attached a card writer too, so just like you I create BACKUP and TEST SD cards, but I can do that right from my running Pi.

PS: I went the fdisk/gparted route to shrink my / partition so it will fit onto every SD card of 4 GB or more.
Luckily I only had to do that once.

Just a note…
I don’t see Paragon Hard Disk Manager being free. What’s free is their Backup & Recovery program, located in their free download section. But these are two different programs.

I use Win32DiskImager, though this is not good, as it can’t shrink the image size. My old openHAB system was running on a 16 GB SD card, which ended up on a 128 GB SSD when I made the change.
Now when using Win32DiskImager I end up with a 128 GB image file :slight_smile:

But it still works as a backup, though.

Hi, I use “raspiBackup”, google for it. It can start/stop services, shrink the image and restore it. I use a 16 GB SSD for my openHAB, and the image is a tar of around 6 GB that can be restored onto disks/SD cards of > 6 GB. I have tried many restores :wink:
You can also unpack the tar and grab single files if you only need to restore those.
Try it out!
Holger
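For anyone curious, a minimal raspiBackup invocation looks roughly like this. It is a sketch only: the backup type, the compression flag and the target directory are assumptions, and option names can differ between raspiBackup versions, so check its own documentation.

# Create a compressed tar backup of the running Pi into /backup (path is an example).
sudo raspiBackup.sh -t tar -z /backup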

1 Like

Backup & Recovery is now discontinued, as it has been built into PHDM. They combined four programs into one, so you can see what else they do while using the free part of the program, and then pay to unlock the extra features. It is free for a single computer and home use; all you do is register. Give it a go and the size of your backup files will drop greatly.

Sounds great, but it does not suit me and the way I prefer to do backups, which is from a non-running SD card.

Okay, good to know… It’s not that obvious from their docs.

Yes, their website is not the best. I thought the same until I dug further.

Hello,

I want to thank you for your article about how to use Amanda to back up openHAB(ian).

So far I have been able to configure and perform a backup of some directories, but I don’t know how to do it for the entire SD card (backup of the raw SD card). I don’t want to use the AWS S3 solution for this scenario, so I was wondering if you can tell me how to use other local media (USB stick/HDD/NAS) as the location for the backup.

Thank you in advance,
Marian

P.S. I tried adding /dev/mmcblk0 (my internal SD reader) to /etc/amanda/openhab-<config>/disklist, but I don’t know if that is the right way.

Yeah that’s the right way.
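For illustration, a raw-device disklist entry looks roughly like the sketch below; the hostname and the dumptype name are placeholders here (openHABian’s Amanda config defines its own raw-device dumptype, so use whatever yours is called):

# /etc/amanda/openhab-<config>/disklist  (hostname and dumptype below are placeholders)
openhabian   /dev/mmcblk0   comp-amraw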

Thank you very much!

I tried to get hold of the Paragon Hard Disk Manager today but was unsuccessful.
Many companies that offer free versions of their product make it difficult, by repeatedly steering you towards the paid version with large flashy arrows and big green buttons, while the free version has a small gray button or a fine-print link down at the end of the page.

In the case of Paragon, even that does not seem to be the case: I must have spent at least half an hour on their site and could not get to a free version. I had already registered an account and logged in. When I finally did find a link on their own site, it took me to an empty “downloads” page in the account section. I also googled and found version 16 of Paragon Hard Disk Manager and installed it on an 8th-gen Core i5 utility PC with Windows 10. It installed but would not even start. There’s an hour of my life I’ll never get back.

Anyway, here’s what I did instead. It took all morning, but it finally got the job done. Here are the steps that worked (skipping the ones that did not, heh).

First I cloned the entire 128 GB SSD to my NAS with Clonezilla. It did not take long because only 6 GB is actually used.
Then I restored that clone to a different 128 GB SSD.
I then booted a utility PC with a GParted live USB stick, and used that to shrink the cloned SSD down to 14 GB (I figured I may as well make it small enough this time).
Then I ran Clonezilla again to clone the 128 GB SSD to the NAS again, this time containing only small partitions.

Then I restored the smaller image to TWO different 64 GB SSDs using Clonezilla in “Expert” mode with the -icds option, which does let you restore an image from a larger disk to a smaller drive, provided that the partitions have been shrunk.
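For anyone who wants to see that step as a command, Clonezilla’s Expert mode builds an ocs-sr call along these lines. This is a sketch only: the image name and target device are examples, and the exact set of flags your interactive run generates may differ.

# Restore the image "openhab-small-img" onto /dev/sdb, skipping the destination size check (-icds)
# and resizing the filesystem to fit the partition (-r).
sudo /usr/sbin/ocs-sr -e1 auto -e2 -r -j2 -icds -p true restoredisk openhab-small-img sdb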

Then I successfully booted one of the 64 GB SSDs in a Raspberry Pi, and tried raspi-config to enlarge the “SD card”. That didn’t work: it complained that it’s not an SD card. Well, yes, I know that, I just didn’t think it would be that fussy. :slight_smile:
So finally I booted the utility PC with the GParted live USB stick one last time, and resized the partitions on both 64 GB SSDs to 55000 MB (so that they’ll also fit on a 60 GB SSD later without me having to go through all this again).

There you have it. It’s a roundabout, pain-in-the-neck way, but it did get the job done.
I’ll look into Amanda one of these days too.

Thanks for the heads-up. It appears they may have gone back to a standalone free program like they used to offer, which has a different name: Backup and Recovery Free…

I have added Clonezilla to the first post, as I do prefer open-source programs and it looks great.

3 Likes

Hi guys,

Please tell me: is there a way to back up the complete image without disconnecting the disk from the Raspberry Pi and without stopping OH at all?

Thanks!

Along the same lines as your advice about learning one solution, I only use Clonezilla these days.

I have tried lots of other OS-dependent bits of software (meaning that they run within Windows or a Linux desktop), but I ALWAYS had some kind of {time-consuming} issue.

Regardless of the device I’m trying to back up / restore, Clonezilla has consistently worked.

(Other than that one occasion when I ticked the “repair sectors” option, which caused new C2 eMMC cards to not boot.)

The only downside I have found with Clonezilla is that it refuses to restore an image that comes from a larger partition, even if 50% of that image was empty.
There may be a way around this, but as I tend to restore 8 GB images (from my donor / development / master card) onto larger 16 GB cards, this isn’t a problem.

That’s a great bit of advice; next time I’m building a new machine, I’ll have a play with shrinking the partition on the donor card before using Clonezilla.

I believe Clonezilla will use every bit of available space on the destination card/device when restoring.
{Obviously, I’ll check.}

Just for information’s sake,

I can have a new machine up and running, from being unwrapped to being connected to the myopenhab service, in just under 20 minutes.

Importing Velbus hardware assets and building a UI takes slightly longer.

I use a similar method. Originally I installed openHAB on an SD card, but after a few months I switched over to an SSD. I started running out of space, so I resized the partition slightly using a Linux virtual machine so that it is now big enough.

To make backups I just plug the SSD into my computer and do a backup with dd…

dd if=/dev/rdiskX of=pi_image_XX_XX_XXXX.img bs=1m count=4500

The important part is bs=1m and count=4500, which means that I only want to copy the first 4500 MB. This is to avoid copying the blank space. The 4500 is an arbitrary number slightly larger than the combined size of all the partitions… just so I don’t miss the last few bytes or something like that.
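For completeness, the restore direction is simply the same command with if and of swapped (the /dev/rdiskX name above is macOS-style raw-device naming; on Linux it would be something like /dev/sdX, and the image name here is just the example from above):

# Write the truncated image back to a card that is at least 4500 MB in size.
sudo dd if=pi_image_XX_XX_XXXX.img of=/dev/rdiskX bs=1m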

1 Like

Yes, I believe that can be done with dd under Linux via the command line. Amanda can do it too, I believe. However, I strongly recommend against backing up a drive that has been running for any length of time, as you do not want to back up the corruption that creeps in. A $35 computer is not the same as an expensive server, so what may be fine to do on a server is not a good idea on a great-priced $35 computer.

That works well, and since larger SD/flash cards perform much faster, you can benefit from the larger card while reducing the image size so the backups don’t take up as much space; it also fixes the issue of restoring to cards that are supposedly ‘the same size’ yet fail. You can also resize the partition without losing data at any time, but I would take an image of it before doing so in case you make a mistake. Parted and GParted can do this.

1 Like

Why do people keep asking how to reinvent the wheel?
Because they’re too lazy or impatient to first read up on what’s already available.
Apart from that, hijacking a thread is a bad habit; you should have searched the forum first:
How to ask a good question / Help Us Help You - Tutorials & Examples - openHAB Community
You would have found many references to Amanda and posts that answer your question.

Because if it was really that simple, there would not be 18 posts with different opinions.
Of course I have searched a lot, but I have not found an answer to my question.

Hi

I’ve looked at the images that Clonezilla makes, without any adjustments or resizing of partitions, and it looks like either the blank parts are compressed or ignored.

:slight_smile: