Inspired by this thread from @Kim_Andersen and my own need to document these things, here is one possible solution (out of infinitely many) for regular OpenHAB backups to remote storage.
Preliminaries
These instructions assume that you have:
- An OpenHAB 3 installation;
- on a Linux system (I use OpenSUSE);
- with a Bash shell; and
- access to a Nextcloud installation (or if you’re not bothered about cleaning up old backups, any other WebDAV service will do).
Other prerequisites are:
- `curl`
- `xmllint`
- `grep`
- `tail`
- `sed`
Of those, `xmllint` is the only one not likely to be installed on your system already. On OpenSUSE, the incantation is:

```bash
zypper install libxml2-tools
```
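On Debian-based systems the package name is different; to the best of my knowledge it is:

```bash
# Debian / Ubuntu: xmllint ships in the libxml2-utils package
sudo apt install libxml2-utils
```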
sudo
For some reason, `openhab-cli backup` wants to be run as root, so the user running this script should have sudo powers to invoke that command.
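Since the script is meant to run unattended, sudo must not stop to ask for a password. One way to arrange that is a dedicated sudoers rule; a minimal sketch, assuming a (hypothetical) backup user named `openhabbackup` and that `openhab-cli` lives in `/usr/bin` (check with `which openhab-cli`):

```
# /etc/sudoers.d/openhab-backup -- edit with: sudo visudo -f /etc/sudoers.d/openhab-backup
# Lets the (assumed) backup user run only the backup subcommand, password-free.
openhabbackup ALL=(root) NOPASSWD: /usr/bin/openhab-cli backup *
```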
Nextcloud / WebDAV credentials and remote directory
You will need to provide the script with a set of credentials so it can upload the backups and clean up old files (an application password is recommended; see the script comments). You will also need to create a directory on the server where OpenHAB backups, and nothing else, will be stored. Note that anything else you do store in this directory is liable to be deleted by the cleanup routine.
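The backup directory can be created in the Nextcloud web interface, or over WebDAV with the MKCOL method. A sketch using the same stand-in credentials and path as the script below (MKCOL does not create intermediate directories, so each level is created in turn):

```bash
curl -u MYUSERNAME:9522ba3b-d40f-4a16-b03e-5038f199fc5a -X MKCOL \
  "https://EXAMPLE.NET/remote.php/dav/files/MYUSERNAME/OpenHAB"
curl -u MYUSERNAME:9522ba3b-d40f-4a16-b03e-5038f199fc5a -X MKCOL \
  "https://EXAMPLE.NET/remote.php/dav/files/MYUSERNAME/OpenHAB/Backups"
```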
Workflow description
The backup is done via a Bash script, which may be called from a crontab at regular intervals, unattended.
The script:
- Calls `openhab-cli backup` to create a backup file named after the current date and time.
- Uploads the resulting file using `curl`'s WebDAV capabilities.
- Uses `curl` again, plus a bunch of common command line tools, to query the server for existing backups and delete the oldest ones, to conserve space.
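If you have never used `curl` as a WebDAV client, these are the two primitives the script relies on, sketched with the same stand-in host, user name, and path as the template below (`APPLICATION-PASSWORD` is a placeholder):

```bash
# Upload: curl's -T flag issues a WebDAV PUT of the local file to the given URL.
curl -T mybackup.zip -u "MYUSERNAME:APPLICATION-PASSWORD" \
  "https://EXAMPLE.NET/remote.php/dav/files/MYUSERNAME/OpenHAB/Backups/mybackup.zip"

# Delete: a plain WebDAV DELETE request removes a remote file.
curl -X DELETE -u "MYUSERNAME:APPLICATION-PASSWORD" \
  "https://EXAMPLE.NET/remote.php/dav/files/MYUSERNAME/OpenHAB/Backups/mybackup.zip"
```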
Bash script
This is the Bash script that hopefully does the magic. Or rather, a template: you will need to:
- Replace `MYUSERNAME` with your actual user name on the remote Nextcloud / WebDAV server.
- Replace `OpenHAB/Backups` with the actual path to the directory where your backups will be stored on the Nextcloud / WebDAV server.
```bash
#!/bin/bash
###
### VARIABLES
###
# The backups will be saved locally (i.e., on the OpenHAB server) in this directory.
# An attempt will be made to create the directory if it doesn't already exist.
BACKUPDIR=$HOME/backups
# The base name of the backup file.
FNAME=$(date -Isec |cut -c 1-16)
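# (The command above yields names like 2024-05-27T03:15: the ISO 8601 date and
# time, truncated to minutes.)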
# The full local path of the backup file
FPATH=$BACKUPDIR/$FNAME.zip
# The Nextcloud / WebDAV login credentials. Recommended to create an application password.
# NOTE: MYUSERNAME is a stand-in for your actual username. Do a search and replace.
DAVLOGIN=MYUSERNAME:9522ba3b-d40f-4a16-b03e-5038f199fc5a
# Your Nextcloud / WebDAV hostname
DAVHOST=https://EXAMPLE.NET
# The WebDAV root of your server
DAVROOT=$DAVHOST/remote.php/dav
# The full WebDAV URL to the directory where your backups will be saved.
# This directory must exist on the server and be write-accessible prior to
# running this script.
DAVURL=$DAVROOT/files/MYUSERNAME/OpenHAB/Backups
# A Nextcloud WebDAV search expression to clean up old files.
# NOTE: the string `/files/MYUSERNAME/OpenHAB/Backups` is included verbatim in
# this expression because I was too lazy to use a variable. Do a search and replace
# with the actual path used in your installation – unless 1) the actual path used in
# your installation is `OpenHAB/Backups` and 2) you have already done a global search
# and replace for MYUSERNAME.
DAVSEARCHEXP='<?xml version="1.0" encoding="UTF-8"?><d:searchrequest xmlns:d="DAV:" xmlns:oc="http://owncloud.org/ns"><d:basicsearch><d:select><d:prop><oc:fileid/><oc:size/></d:prop></d:select><d:from><d:scope><d:href>/files/MYUSERNAME/OpenHAB/Backups</d:href><d:depth>0</d:depth></d:scope></d:from><d:where><d:like><d:prop><d:getcontenttype/></d:prop><d:literal>application/zip</d:literal></d:like></d:where><d:orderby><d:order><d:prop><d:getlastmodified/></d:prop><d:descending/></d:order></d:orderby></d:basicsearch></d:searchrequest>'
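# The search above asks the server for all application/zip files in the backup
# directory, ordered by last-modified date, newest first. The cleanup code at
# the end of this script keeps the first two results and deletes the rest.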
###
### CODE
###
# Create the local backup directory if it doesn't exist.
# NOTE: The script will fail if "$BACKUPDIR" exists but
# is not a directory or not writable by the user running
# the script.
[ ! -e "$BACKUPDIR" ] && {
mkdir -p "$BACKUPDIR"
}
# Delete old backups from the local filesystem.
# NOTE: the glob must sit outside the quotes, or it won't expand.
rm -f "$BACKUPDIR"/*.zip 2>/dev/null
# Create the backup. Use --full to also include the cache and tmp folders,
# or omit it for a configuration-only backup.
sudo openhab-cli backup --full "$FPATH" >/dev/null 2>&1
# Upload to remote
curl -T "$FPATH" -u "$DAVLOGIN" "$DAVURL/$FNAME.zip"
# Delete all but the most recent two backups from remote (Nextcloud specific)
# If you want to keep more or less files, replace `+3` in the `tail` command
# with another number.
curl -s -H "Content-Type: text/xml" -X SEARCH -u "$DAVLOGIN" "$DAVROOT" --data "$DAVSEARCHEXP" |xmllint --format - |grep "<d:href>" |tail -n +3 |sed -r 's|\s*</?d:href>\s*||g' |while read -r RPATH; do
    curl -s -X DELETE -u "$DAVLOGIN" "$DAVHOST$RPATH"
done
```
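Before handing the script over to cron, give it a manual test run. A minimal check, assuming you saved the script as `$HOME/bin/backup.sh` (the path the crontab line below expects):

```bash
chmod +x "$HOME/bin/backup.sh"
"$HOME/bin/backup.sh"
# Then verify that the .zip file showed up in the remote backup directory.
```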
Crontab
A line such as this will do the job:

```
# Run on Mondays and Thursdays at 07:07
7 7 * * 1,4 $HOME/bin/backup.sh >/dev/null
```
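If you would rather keep a record of failures than discard all output, here is a variant that appends stdout and stderr to a log file (the log path is just my suggestion):

```
# Run on Mondays and Thursdays at 07:07, logging output for later inspection
7 7 * * 1,4 $HOME/bin/backup.sh >>$HOME/backup.log 2>&1
```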
Conclusion
Hope this helps. It may work for you, it may not.