JSR223 Jython Backup Job Problem

I managed to create a backup job that does a backup and copies it to a share, and finally got it to work, but now I'm unable to delete the local copy of the file. Can anyone spot my issue?

The only relevant entries in the openHAB log are these (they come from my deliberate logging, so they're not an actual error):

2020-02-20 10:04:00.291 [ERROR] [223.jython.Jython Delete Backups Job] - Delete Local Backups Job Running...
2020-02-20 10:04:00.299 [ERROR] [223.jython.Jython Delete Backups Job] - Delete Local Backups Job Username: openhab
2020-02-20 10:04:00.408 [ERROR] [223.jython.Jython Delete Backups Job] - Delete Local Backups Job Permissions Output: 
2020-02-20 10:04:00.470 [ERROR] [223.jython.Jython Delete Backups Job] - Delete Local Backups Job Output: 
2020-02-20 10:04:00.472 [DEBUG] [e.automation.internal.RuleEngineImpl] - The rule '33a5f59b-d92b-4c26-8776-23a7532d5a3f' is executed.

(I know the sudoers file edit is a security issue, but I'm just trying to troubleshoot for now.)

backup_job.py: (works)

"""
This script uses a decorated cron rule that runs an openHAB backup and uploads it to a share.
It requires this setup:
sudo apt-get install ncftp
sudo apt-get install samba-client
sudo adduser openhab sudo
sudo usermod -aG sudo openhab
sudo visudo -f /etc/sudoers.d/openhab_scripts
openhab   ALL=(ALL) NOPASSWD: /usr/share/openhab2/runtime/bin/backup
"""

from core.rules import rule
from core.triggers import when
import subprocess
#from datetime import date
from datetime import datetime, time

@rule("Jython Backup Job", description="Backup Job", tags=["Backup Job tag", "Backup Job"])# description and tags are optional
# Every 10 seconds
#@when("Time cron 0/10 * * * * ?")
# Every minute
#@when("Time cron 0 * * * * ?")
# Every Thursday at 3:00am
@when("Time cron 0 00 03 ? * THU")
#@when("Time cron 0 56 09 ? * THU")
def backup(event):
    backup.log.error("Backup Job Running...")    
    # Get hostname
    hostnameExec = subprocess.Popen(["hostname"], stdout=subprocess.PIPE)
    hostnameOut, hostnameErr = hostnameExec.communicate()
    hostnameOut = hostnameOut.rstrip()
    backup.log.error("Backup Job Hostname: "+hostnameOut)
    # Get todays date & format
    #today = date.today()
    today = datetime.now()
    todayFmt = today.strftime("%Y%m%d_%H%M")
    # Set backup path with hostname & date
    file = "Backup_" + hostnameOut + "_" + todayFmt + ".zip"
    #path = "/etc/openhab2/Backup_" + hostnameOut + "_" + todayFmt + ".zip"
    path = "/var/lib/openhab2/backups/Backup_" + hostnameOut + "_" + todayFmt + ".zip"
    
    # Do backup job - sudo $OPENHAB_RUNTIME/bin/backup /etc/openhab2/Backup_openhab1_20200101.zip
    backup.log.error("Backup Job Cmd: " + "/usr/share/openhab2/runtime/bin/backup " + path)
    backupJobExec = subprocess.Popen(["sudo", "/usr/share/openhab2/runtime/bin/backup", path], stdout=subprocess.PIPE)
    backupJobOut, backupJobErr = backupJobExec.communicate()
    backup.log.error("Backup Job Cmd Output: " + backupJobOut)
    
    # Upload file via smbclient - works via cmdline: smbclient -U openhabian%password //10.0.0.5/Backup -t 600 -c 'lcd /etc/openhab2 ; put "Backup_openhab1_20200219.zip"'
    #uploadCmd = 'lcd /etc/openhab2 ; put "' + file + '"'
    uploadCmd = 'lcd /var/lib/openhab2/backups ; put "' + file + '"'
    backup.log.error("Backup Job FTP Cmd: "+uploadCmd) # lcd /etc/openhab2 ; put "/etc/openhab2/Backup_openhab1_20200220.zip"
    ftpExec = subprocess.Popen(["smbclient", "-U", "openhabian%password", "//10.0.0.5/Backup", "-t", "600", "-c", uploadCmd], stdout=subprocess.PIPE)
    ftpOut, ftpErr = ftpExec.communicate()
    backup.log.error("Backup Job FTP Upload: "+ftpOut)
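One thing worth noting about the logging above (my observation, not from the thread): `Popen` is created with `stdout=subprocess.PIPE` only, so `communicate()` returns `None` for the error stream, and any failure text from `backup` or `smbclient` is lost. A minimal, self-contained sketch of capturing both streams, using a deliberately failing `ls` as a stand-in for the real command:

```python
import subprocess

# Pipe BOTH streams so communicate() actually returns them; without
# stderr=subprocess.PIPE, the second value comes back as None.
proc = subprocess.Popen(
    ["ls", "/nonexistent-path-for-demo"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
out, err = proc.communicate()
# On failure the useful message is usually in err, not out.
print("rc=%d stderr=%r" % (proc.returncode, err.strip()))
```

With that change, a failing backup or upload would at least leave its error message in the log instead of an empty line.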

delete_backup_jobs.py: (Doesn't delete the files)

"""
This script deletes old local backup files that should have been uploaded already to NAS Backup share
Not sure if this requires these:
sudo adduser openhab sudo
sudo usermod -aG sudo openhab
sudo visudo -f /etc/sudoers.d/openhab_scripts
# Allow openhab user to execute reboot and poweroff commands
openhab   ALL=(ALL) NOPASSWD: ALL
"""

from core.rules import rule
from core.triggers import when
import subprocess
#from datetime import date
from datetime import datetime, time

@rule("Jython Delete Backups Job", description="Delete Local Backup Files Job", tags=["Delete Backups Job tag", "Delete Backups Job"])# description and tags are optional
# Every 10 seconds
#@when("Time cron 0/10 * * * * ?")
# Every minute
#@when("Time cron 0 * * * * ?")
# Every Thursday at 10:04am
@when("Time cron 0 04 10 ? * THU")
def backup(event):
    backup.log.error("Delete Local Backups Job Running...")    
    
    # Get username running this script
    import getpass
    username = getpass.getuser()
    backup.log.error("Delete Local Backups Job Username: "+username)
    
    # Set permissions - sudo chmod -R 777 /var/lib/grafana/plugins
    perExec = subprocess.Popen(["sudo", "chmod", "-R", "777", "/var/lib/openhab2/backups"], stdout=subprocess.PIPE)
    perOut, perErr = perExec.communicate()
    backup.log.error("Delete Local Backups Job Permissions Output: " + perOut)

    # del files
    #delExec = subprocess.Popen(["sudo", "rm", "/var/lib/openhab2/backups/*.*"], stdout=subprocess.PIPE)
    delExec = subprocess.Popen(["sudo", "rm", "-fv", "/var/lib/openhab2/backups/*.*"], stdout=subprocess.PIPE)
    #delExec = subprocess.Popen(["rm", "/var/lib/openhab2/backups/*.*", "-f"], stdout=subprocess.PIPE)
    #delExec = subprocess.Popen(["sudo", "rm", "/var/lib/openhab2/backups/*.*", "-f"], stdout=subprocess.PIPE)
    #delExec = subprocess.Popen(["sudo", "rm", "-f", "/var/lib/openhab2/backups/*.*"], stdout=subprocess.PIPE)
    delOut, delErr = delExec.communicate()
    backup.log.error("Delete Local Backups Job Output: " + delOut)
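A likely root cause (my diagnosis, not stated in the thread): `Popen` with an argument list runs the program directly, with no shell in between, so the `*.*` wildcard is never expanded. `rm` is asked to delete a file literally named `*.*`, finds nothing, and `-f` suppresses the error message too. A self-contained demonstration, using a temporary directory in place of `/var/lib/openhab2/backups`:

```python
import glob
import os
import subprocess
import tempfile

# Scratch directory standing in for the real backups folder.
workdir = tempfile.mkdtemp()
for name in ("Backup_a.zip", "Backup_b.zip"):
    open(os.path.join(workdir, name), "w").close()

# With an argument list there is no shell, so "*.*" reaches rm literally
# and matches nothing: the call fails and both files survive.
devnull = open(os.devnull, "w")
rc = subprocess.call(["rm", os.path.join(workdir, "*.*")], stderr=devnull)
devnull.close()

# Expanding the wildcard in Python first hands rm real paths.
targets = glob.glob(os.path.join(workdir, "*.*"))
rc2 = subprocess.call(["rm", "-f"] + targets)
print(sorted(os.listdir(workdir)))  # prints []
```

The same effect explains why the `glob`/`os.remove` version below works: the pattern is expanded in Python before anything touches the filesystem.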

Not sure why the version above in the OP doesn't work, but I did get it working with Python code (as opposed to the Linux rm command via subprocess.Popen()/communicate()).

delete_backup_jobs.py: (Works!)

"""
This script deletes old local backup files that should have been uploaded already to NAS Backup share

Not sure if this requires these:
sudo adduser openhab sudo
sudo usermod -aG sudo openhab
subprocess.Popen wouldn't work, so this uses Python's own file deletion instead
"""

from core.rules import rule
from core.triggers import when
import glob
import os

@rule("Jython Delete Backups Job", description="Delete Local Backup Files Job", tags=["Delete Backups Job tag", "Delete Backups Job"])# description and tags are optional
# Every Thursday at 7:48pm
@when("Time cron 0 48 19 ? * THU")
def backup(event):
    backup.log.error("Delete Local Backups Job: Running...")
    # del files
    filelist=glob.glob("/var/lib/openhab2/backups/*.*")
    for file in filelist:
        os.remove(file)
    backup.log.error("Delete Local Backups Job: Complete.")
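Since the goal is clearing out backups that should already be on the NAS, a variant that only removes files older than a given age may be safer. `delete_old_backups` is a hypothetical helper (not from the thread), based on file modification time:

```python
import glob
import os
import time

def delete_old_backups(directory, max_age_days):
    """Remove *.* files in directory older than max_age_days; return removed paths."""
    cutoff = time.time() - max_age_days * 86400  # 86400 seconds per day
    removed = []
    for path in glob.glob(os.path.join(directory, "*.*")):
        if os.path.getmtime(path) < cutoff:  # last modified before the cutoff
            os.remove(path)
            removed.append(path)
    return removed
```

For example, `delete_old_backups("/var/lib/openhab2/backups", 7)` would keep anything from the past week, so a failed upload still leaves the most recent backups recoverable locally.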

If things are working, then this may not matter, but instead of subprocess, you could be using executeCommandLine.
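For reference, a sketch of what the executeCommandLine route might look like from a Jython rule. This is an assumption to verify against the openHAB docs, not tested code: as I recall, the Exec action in openHAB 2 takes a single command string with `@@` separating arguments, plus an optional timeout in milliseconds, and returns the command's output.

```python
# Sketch only: relies on the openHAB Jython helper libraries' Exec action.
from core.actions import Exec

# executeCommandLine(commandLine, timeoutMs) returns the command's output;
# arguments are separated with "@@" instead of spaces.
result = Exec.executeCommandLine(
    "sudo@@/usr/share/openhab2/runtime/bin/backup@@/var/lib/openhab2/backups/Backup.zip",
    600000,  # 10-minute timeout in milliseconds
)
```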

More options for next time are certainly welcomed, so thanks for that. I'm not great with Linux and have been finding that permissions are my biggest enemy, so I'm thinking I needed a chmod or chown somewhere in there. It's very strange, though, that I could run the openHAB backup as sudo (via my setup and the instructions in the scripts above) but not an rm command. (I'm sure advanced Linux users will have an explanation.)

I wanted to gain experience with both Jython rules and openHAB, which is why I chose to do it this way. But in the end, I think I should have just done it in Linux with a cron job and a shell script…

If you are using openHABian, it's possible that the confirm-deletion option has been turned on at the system level. That prompt, I believe, prints to stdout, so you wouldn't catch it with your code. You can bypass it with rm -f, but beware using force with rm: it will indiscriminately delete anything and everything if you make a typo.

I suggest you stick with the Python functions that you are using now.
