How and what to back up automatically from Openhab running in a Docker container on Ubuntu to Synology

I now have a working Openhab system running in a Docker container on Ubuntu.
I can access the files from Windows with VSC.
The last step I want to implement is to run an Openhab backup from Ubuntu to my Synology NAS every 6 hours. When this is all working, I can move Openhab from Synology to Ubuntu.

I assume I need to follow the steps below.

  1. Create an Openhab backup of conf and userdata. Is there a standard procedure that I can trigger every 6 hours?
  2. Copy the resulting file to Synology. What is the best/simplest way to automate this? Is rsync the best option?
  1. Everything that you mount into the container as a volume needs to be part of the backup. Most people put everything into the same root folder, so just tar that up and you have a backup.

  2. rsync will keep two folders synchronized; changes made in one get copied over to the other. That’s one way to keep one backup. However, if something gets messed up and you need to go back to before the current backup, you are out of luck: your rsync will have backed up the bad stuff, and it doesn’t keep a history. Therefore, even just tarring everything up, slapping a timestamp onto the tar file, and scp’ing that tar file to the NAS would be a better overall backup solution. Or you could do both, to have a kind of “hot swap” backup always there and ready to use.
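A minimal sketch of that timestamped tar + scp approach might look like this (the paths, NAS hostname, and user are assumptions; adjust them to your setup):

```shell
#!/bin/sh
# Sketch: tar up the OH root folder with a timestamp in the file name,
# then copy the archive to the NAS. Paths and host are assumptions.
stamp=$(date +%Y-%m-%d_%H%M)
file="/tmp/openhab-${stamp}.tgz"
tar czf "$file" -C /srv/openhab .
scp "$file" user@nas.local:/volume1/backups/openhab/
rm "$file"
```

Because every run produces a new file name, old archives on the NAS are kept as a history until you prune them.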

I just mount the backup folder from the NAS to my machine and have a cron job that tars everything up and moves the file to that mounted folder.

For example:

echo "Backing up openhab"
file=/srv/backups/openhab/openhab-$(date +%Y-%m-%d_%H%M).tgz
cd /srv/openhab
tar cfz "$file" .

fsize=$(ls -lh "$file" | cut -d ' ' -f 5)
toc=$(tar tfz "$file")
body=${file}'\nBackup size: '${fsize}'\n\nContents:\n'${toc}

# $email and $sendmail are defined earlier in my environment
to='To: '$email'\n'
from='From: '$email'\n'
subject='Subject: openhab Backed Up\n\n'

msg=${to}${from}${subject}${body}
echo -e "$msg" | $sendmail $email

My OH volumes are in /srv/openhab and the backups folder from the NAS is mounted to /srv/backups. The actual backup part of the above is just the first few lines. The rest generates an email report so I know the backup ran.
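To run a script like that every 6 hours, a crontab entry along these lines does the job (the script path is an assumption):

```
0 */6 * * * /home/mynuc/bin/openhab-backup.sh >/dev/null 2>&1
```

`0 */6 * * *` fires at minute 0 of every sixth hour (00:00, 06:00, 12:00, 18:00); add it with `crontab -e`.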

I’ve been using duplicati for two years without any problem. See here for my solution.

Maybe rsnapshot is available for the Synology.
rsnapshot is a script which uses rsync but keeps virtual snapshots (frequent, hourly, daily, monthly), and it’s up to you how many of these snapshots are stored (so maybe the last 10 frequent - for example every 10 minutes - the last 8 hourly, the last 7 daily and the last 6 monthly).
rsnapshot uses hardlinks for these snapshots, so only changed files will take more of your disk space. rsync ensures that only a diff is transferred, so it’s really fast - and in my experience very reliable, too.
You can restore individual files as well as whole directories.
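For reference, that retention scheme would be expressed in rsnapshot.conf roughly like this (the counts are just the ones from the example above; note that rsnapshot requires tabs, not spaces, between fields):

```
retain	frequent	10
retain	hourly	8
retain	daily	7
retain	monthly	6
```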

Thanks rikoshak. I have a long path, but it is working.
Does this always work well? No problems with open files during tar?

echo "Backing up openhab"
file=/run/user/1000/gvfs/afp-volume:host=NAS.local,volume=backup_nuc/openhab-$(date +%Y-%m-%d_%H%M).tgz
cd /home/mynuc/docker/openhab4.1.1
tar cfz "$file" .

How can I get write access to Ubuntu files from Windows 11?
To share, I installed:
sudo apt install nautilus-share
I access the files from Windows with \\\docker
I can open the files but not modify them. How can I solve this?

These are the permissions:

drwxrwxrwx   5 mynuc mynuc 4096 feb  8 11:13 ./
drwxrwxrwx+  4 mynuc mynuc 4096 feb  9 21:03 ../
drwxrwxr-x   2  9001  9001 4096 feb  8 11:13 addons/
drwxr-xr-x  15  9001  9001 4096 feb  8 11:29 conf/
drwxr-xr-x   9  9001  9001 4096 feb  8 11:29 userdata/

These are the settings of the root folder docker:


I’ve never had it fail, but OH isn’t doing much overnight when this runs. If it fails, I’ll know from the email report.

I mainly use this for persistence backup since I use rrd4j. Everything else I’ll restore from git when needed. But I use this script for some other services, and I’m lazy and have disk space, so I never bothered making it more precise.


I’ve never used nautilus-share before, so for the second question :person_shrugging:

If they are writable for world/other, then it should be possible to write to the directories and files, although for security reasons that shouldn’t be done.
Instead of using nautilus-share, a full Samba server might be the better option.

As far as I understand, nautilus-share is a very simple Samba server which offers only limited configuration options.
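For comparison, a minimal writable share with a full Samba server would be a few lines in /etc/samba/smb.conf (the share name, path, and user here are assumptions for your setup):

```
[docker]
   path = /home/mynuc/docker
   read only = no
   valid users = mynuc
```

After editing, restart the service (`sudo systemctl restart smbd`) and set a Samba password for the user with `sudo smbpasswd -a mynuc`.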

Solved by granting additional rights with sudo chmod -R a+rwx
