Running backup script via cron, correct user/permissions

Hello all,
I am trying to implement a better backup solution than I previously had. My earlier attempt at setting up Amanda was a failure (the backups were empty, and it took a huge amount of time to set up in the first place), so I am opting for more familiar and tangible options.

Background:
Raspberry Pi 4 (4 GB) running openHABian 2.5.3, on the same network as an Ubuntu system (referenced by the IP address in the script).
I’ve already created a user on the Ubuntu system (also named openhabian), created an SSH key on my Raspberry Pi, and successfully installed it on the Ubuntu system. I have run this entire script manually (including the rsync portion), with the correct files showing up on the Ubuntu system.

I have already set up a crontab entry, to be run at 4 AM every Monday (I believe):

0 4 * * 1 sh /home/openhabian/backupall
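So I can see where it actually fails when cron runs it, I’ll probably also redirect the job’s output to a log for the first few runs. Same entry, just with logging bolted on (the log filename is only an example I picked):

0 4 * * 1 sh /home/openhabian/backupall >> /home/openhabian/backupall.log 2>&1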

but I am expecting permissions issues:

1. My SSH key belongs to the Raspberry Pi ‘openhabian’ user. Cron will probably not run the script as that user. It isn’t too much trouble to add an SSH key for another user on the Pi, if that is an option…
2. The script also needs to be run with sudo, so that the openhab-cli backup command runs correctly. I am not sure how best to run this.

Any suggestions?

For reference, the script is below, thanks to @tailor, adapted from his Original Post:

backupall

#!/usr/bin/env bash

#####################################################
# Date with Timestamp
#####################################################
echo "+-+-+-+-+-+-+ Set Timestamp  +-+-+-+-+-+-+-+-+"
DATE=$(date +%Y_%m_%d-%H_%M_%S)

#####################################################
# Backup openHAB
#####################################################
echo "+-+-+-+-+-+-+ openHAB Backup +-+-+-+-+-+-+-+-+"
sudo openhab-cli backup

#####################################################
# Backup Grafana
#####################################################
echo "+-+-+-+-+-+-+ Grafana Backup +-+-+-+-+-+-+-+-+"
#Stop Grafana service
sudo systemctl stop grafana-server
#Backing up ini file
mkdir /var/lib/openhab2/backups/tmp_grafana/
mkdir /var/lib/openhab2/backups/tmp_grafana/$DATE/
sudo cp -arv /etc/grafana/grafana.ini /var/lib/openhab2/backups/tmp_grafana/$DATE/grafana.ini
#Backing up a database
sudo cp -arv /var/lib/grafana/grafana.db /var/lib/openhab2/backups/tmp_grafana/$DATE/grafana.db
#Start Grafana service
sudo systemctl start grafana-server
#Create Zip File
cd /var/lib/openhab2/backups/tmp_grafana/$DATE
sudo zip -r /var/lib/openhab2/backups/grafana-backup-$DATE.zip ./*
#delete tmp folder
rm -rf /var/lib/openhab2/backups/tmp_grafana



#####################################################
# Backup Influxdb
#####################################################
echo "+-+-+-+-+-+-+ Influxdb Backup +-+-+-+-+-+-+-+"

#Backing up the metastore
influxd backup /var/lib/openhab2/backups/tmp_influxdb/$DATE/
#Backing up a database
influxd backup -database openhab_db /var/lib/openhab2/backups/tmp_influxdb/$DATE/
#Backing up conf file
cp -arv /etc/influxdb/influxdb.conf /var/lib/openhab2/backups/tmp_influxdb/$DATE/influxdb.conf
#Create Zip File
cd /var/lib/openhab2/backups/tmp_influxdb/$DATE
zip -r /var/lib/openhab2/backups/influx-backup-$DATE.zip ./*
#delete tmp folder
rm -rf /var/lib/openhab2/backups/tmp_influxdb


############################################
# Sync to ... end destination
############################################
echo "+-+-+-+-+-+-+-+ RSync to MServer +-+-+-+-+-+-+-+"
sudo cp -vnpr "/var/lib/openhab2/backups/." "/home/openhabian/backups/openHAB openhabian-nuc"
rsync -a /home/openhabian/backups openhabian@192.168.0.100:/home/openhabian

echo "================== Done ==================="

  1. How did you set up the crontab entry? Did you use sudo, or did you run it as the user openhabian? Whatever user you ran crontab as is the user that the script will run as.

  2. Does the script call sudo, or does the script itself need to be run with sudo? If the former, make sure those commands can be run by the openhabian user via sudo without a password (search for “visudo” on Google for details). If the latter, use sudo to run crontab and the cron job will run as root to begin with.

Whatever user the cron job is running under will need to have the proper SSH keys configured.
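If you want to double check what cron is actually running things as, a throw-away entry that just dumps the user and environment to a file will tell you (the file path here is only an example):

* * * * * id > /tmp/cron-env.txt 2>&1; env >> /tmp/cron-env.txt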


Thanks Rich! I had forgotten about visudo.
I set up the crontab entry as the user openhabian (without sudo), so it seems the script should run as openhabian.
I ran visudo and added the line below:

openhabian   ALL=(ALL)  NOPASSWD: /sbin/shutdown, /usr/bin/openhab-cli, /bin/systemctl, /bin/mkdir, /bin/cp, /usr/bin/zip

Translating this syntax always takes me a little while. For others: this says the user ‘openhabian’, from ALL hosts, may run commands as (ALL) target users, with NOPASSWD applying to the listed commands.
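For anyone else double-checking their own entry, two quick tests from the openhabian account (these are standard sudo options, nothing specific to this script):

sudo -l        # lists the rules sudo will apply for this user (may prompt for your password)
sudo -n /bin/systemctl status grafana-server        # -n fails instead of prompting, so it only succeeds if a NOPASSWD rule matches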

If any of this is incorrect, please let me know.
I am still running into obstacles. Despite modifying the sudoers file, if I run my script (with the sudo prefixes removed), the systemctl commands fail and throw errors.
Oddly enough, rsync gives errors in the script (whether run with sudo or not):

rsync: getcwd(): No such file or directory (2)
rsync error: errors selecting input/output files, dirs (code 3) at util.c(1221) [Receiver=3.1.3]

yet when I run the exact same command outside of the script, it works:

[17:37:41] openhabian@openhab:~/backups$ rsync -av /home/openhabian/backups openhabian@192.168.0.100:/home/openhabian
sending incremental file list
backups/
backups/openHAB/
backups/openHAB/grafana-backup-2020_03_25-15_36_06.zip
backups/openHAB/grafana-backup-2020_03_25-15_36_56.zip
backups/openHAB/grafana-backup-2020_03_25-15_40_40.zip
backups/openHAB/grafana-backup-2020_03_26-16_51_35.zip
backups/openHAB/grafana-backup-2020_03_26-17_04_39.zip
backups/openHAB/grafana-backup-2020_03_26-17_07_46.zip
backups/openHAB/grafana-backup-2020_03_26-17_14_05.zip
backups/openHAB/grafana-backup-2020_03_26-17_20_42.zip
backups/openHAB/grafana-backup-2020_03_26-17_26_05.zip
backups/openHAB/grafana-backup-2020_03_26-17_28_56.zip
backups/openHAB/grafana-backup-2020_03_26-17_36_13.zip
backups/openHAB/grafana-backup-2020_03_26-17_36_23.zip
backups/openHAB/grafana-backup-2020_03_26-17_38_40.zip
backups/openHAB/influx-backup-2020_03_25-15_36_06.zip
backups/openHAB/influx-backup-2020_03_25-15_36_56.zip
backups/openHAB/influx-backup-2020_03_25-15_40_40.zip
backups/openHAB/influx-backup-2020_03_26-17_04_39.zip
backups/openHAB/influx-backup-2020_03_26-17_07_46.zip
backups/openHAB/influx-backup-2020_03_26-17_14_05.zip
backups/openHAB/influx-backup-2020_03_26-17_20_42.zip
backups/openHAB/influx-backup-2020_03_26-17_26_05.zip
backups/openHAB/influx-backup-2020_03_26-17_28_56.zip
backups/openHAB/influx-backup-2020_03_26-17_36_23.zip
backups/openHAB/influx-backup-2020_03_26-17_38_40.zip
backups/openHAB/openhab2-backup-20_03_24-17_20_02.zip
backups/openHAB/openhab2-backup-20_03_24-21_22_56.zip
backups/openHAB/openhab2-backup-20_03_25-15_27_37.zip
backups/openHAB/openhab2-backup-20_03_25-15_36_06.zip
backups/openHAB/openhab2-backup-20_03_25-15_36_56.zip
backups/openHAB/openhab2-backup-20_03_25-15_40_40.zip
backups/openHAB/openhab2-backup-20_03_26-16_51_35.zip
backups/openHAB/openhab2-backup-20_03_26-16_56_43.zip
backups/openHAB/openhab2-backup-20_03_26-17_04_39.zip
backups/openHAB/openhab2-backup-20_03_26-17_07_46.zip
backups/openHAB/openhab2-backup-20_03_26-17_14_05.zip
backups/openHAB/openhab2-backup-20_03_26-17_21_11.zip
backups/openHAB/openhab2-backup-20_03_26-17_26_05.zip
backups/openHAB/openhab2-backup-20_03_26-17_28_57.zip
backups/openHAB/openhab2-backup-20_03_26-17_36_13.zip
backups/openHAB/openhab2-backup-20_03_26-17_36_23.zip
backups/openHAB/openhab2-backup-20_03_26-17_38_40.zip

sent 397,477,640 bytes  received 807 bytes  5,407,870.03 bytes/sec
total size is 397,377,031  speedup is 1.00

This one still has me scratching my head. If I can only get everything except rsync working correctly, I may end up going with one of the more polished solutions (raspiBackup or another attempt at Amanda).

Partial update.
The rsync error was an odd one. Thanks to this source, the culprit in my case was the current working directory from the script’s perspective. I added cd /home/openhabian on the line before the rsync command, and the script worked. This seemed odd at first, as the rsync command shouldn’t care about the working directory when both the source and destination are given as absolute paths. In hindsight it makes sense: earlier in the script I cd into the temporary backup folders and then delete them, so by the time rsync runs, its working directory no longer exists, which is exactly what the getcwd() error complains about. Regardless, that was my fix for rsync.
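For anyone following along, the end of the sync section of my script now looks roughly like this (same paths as in the script above; the exit guard on cd is just an extra precaution I added, not something the original had):

cd /home/openhabian || exit 1
rsync -a /home/openhabian/backups openhabian@192.168.0.100:/home/openhabian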

I am left sorting out the sudo permissions for systemctl (the visudo entry didn’t seem to work), and will continue to chip away at this.
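One combination I still want to test (my assumption, not something I have verified yet): keep the sudo prefixes in the script, since the NOPASSWD rule only applies when sudo is actually invoked, and add a sudoers line (via visudo) that spells out the exact systemctl invocations the script uses:

openhabian ALL=(ALL) NOPASSWD: /bin/systemctl stop grafana-server, /bin/systemctl start grafana-server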

Hi Ben, thanks for this topic. I too have been stung by Amanda backups not containing any data. Have you managed to fully deploy this backup via a cron job? Is there anything I should be aware of, other than what’s in this thread, before I set this up?

Amanda just backs up what it’s told to, which by default includes neither InfluxDB nor Grafana, because these are optional components.
But it’s all documented.
Once installed, you can simply add more lines to /etc/amanda/openhab-dir/disklist.
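A disklist line is just host, directory and dump type, one filesystem per line, so the extra entries would look something like this (copy the hostname and dump type from the lines already present in your disklist — the values below are placeholders):

openhab    /var/lib/grafana     comp-user-tar
openhab    /var/lib/influxdb    comp-user-tar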

So if it didn’t work, why didn’t you ask about this on the forum or open an issue?

In the end, backup is a complex topic, and it’s not a good idea to ignore that, decide to go for an alternative solution, and only then find out about all the bits and pieces you should have thought about in the first place.

Hi Markus I hope you are well,

So if it didn’t work, why didn’t you ask about this on the forum or open an issue?

I went down the Amanda route, which didn’t work out for me initially (out of the box), so with the limited time I’ve had, I have been backing up manually as and when needed.

In the end, backup is a complex topic, and it’s not a good idea to ignore that, decide to go for an alternative solution, and only then find out about all the bits and pieces you should have thought about in the first place.

I totally agree about backup being the (hopefully) silent partner that always has your back. I have been interested in cron for a while, so now that my focus is back on this subject I would like to follow this path and see where it leads. Ben’s footprints look remarkably similar to the ones I want to make, so I’d like to see how he is getting on with this now.

Hi James,
Sorry for the late reply.
I think the backup strategy boils down to: make sure it works before you depend on it.
Amanda was a bit of a black box to me, which is why it failed for me. An explicit script, on the other hand, worked, because there was no part of it I hadn’t written or modified myself. If you go my script route, make sure you micro-manage the first few runs. If you go the Amanda route, make sure you have a way to find and test the backups that are made. Amanda is a bit more flexible about where backups are sent, but I have been completely happy with my inflexible-yet-obvious script route.

Make sure you know what it is you want backed up:
? the full RPi OS
? config files
? items/rules
? HABPanel pages
? InfluxDB
? Grafana
etc.
Think through everything you painstakingly created, then make sure you know what is needed to restore each item. I learned this the hard way when I tried to restore my items and had to painstakingly re-create them. In my case it is a hobby, and everything could be worked manually in the interim. If you depend on it more and can’t afford to be down for long after a failure, have a separate system to do a test restore on (i.e. a separate Pi). Once you have proven the backup works, and then proven the automated/scheduled backup system successfully backs up everything, you can sit back and relax.

If you’re going to tinker with your system, I’d suggest you also get in the habit of manually running your backup script before you potentially break something. That habit has recently helped me!

Cheers.

No worries, thanks for getting back to me Ben. I do take full image backups when I hit milestones. I have started playing with cron and am successfully shipping backups off to my NAS! I will be checking out your script, as it looks very comprehensive.

Cheers

FYI, I had to add this as well to make it work:

cd /home/openhabian

rsync -a /home/openhabian/backupsCRON /nas/automation/backups/openhab-backups
