openHAB upgrade from 4.0.1 to 4.1.2 breaks

Intro
1.1. I have a Pi4; it was running Debian 11, last updated in November 2023.

1.2. It was running OH 4.0.1, and I decided to upgrade openHAB for better performance, as 4.0.1
was a bit slow to start and used more CPU than I like (50 to 60%).

1.3. I ran apt update && apt upgrade and it failed with error 127. Here is the apt log:
term.log (4.4 KB)

I think I have a backup; nuked openHAB
2.1. I have a full system backup from November 2023 and I have all configs copied, but I forgot to copy
the userdata folders, like bindings and databases, so I do not have the latest from today.

2.2. Ehh, and I made a stupid move: I purged it with apt purge openhab, as it would not install any
version of openHAB anymore. Weird error, right?

2.3. As I was doing remote maintenance, I was unable to load a new USB drive from the backup and pop it into the Pi4, so I had to fix it by reinstalling.

2.4. Why did this install break, and how can I prevent it next time?

Fresh install
3.1. I did a fresh openHAB install, of 4.1.0 this time, as I did not quite like how 4.1.2 behaved, and 4.1.0 installed fine.
I had to do a fresh setup, install 6 bindings, and copy the config over to the new setup, and it is now back alive.

3.2. Then I needed to re-initialize my device settings, as I lost the database, so it took me 2 hours to fix my mistake of upgrading without a recent userdata backup.

openHAB built-in backup
4.0. If I understand right, I should run the openHAB backup command before upgrading. Does it back up all the necessary userdata, databases, and bindings as well, so I could restore in seconds? A copy of the full userdata folder is quite big while the built-in backup is small, which is why I ask.

4.1. I have noticed on Linux that applications tend to install perfectly one day but fail the next as they evolve daily. I have faced this countless times: I could not get the system working again without a backup, as things broke spectacularly. Thankfully openHAB did not leave any issues behind after the purge.

Prompt for backup before upgrade
4.2. Could openHAB enforce or perform an automatic backup of config and userdata to the home directory, or somewhere external to the folders that get purged, so users do not nuke the latest settings that may not be backed up yet?

4.3. An automated backup job that runs before every upgrade and on a daily/weekly basis could be useful, right? With an embedded command like this:

sudo $OPENHAB_RUNTIME/bin/backup /home/pi/Documents/openhab_userdata_$(date +'%d.%m.%y_%H:%M:%S').zip

4.4. So if I am right, I can call restore like below to copy back the zipped files.

sudo $OPENHAB_RUNTIME/bin/restore /home/pi/Documents/openhab_userdata_9.5.24_21:13:12.zip
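To automate the timestamped naming above, one could wrap the backup in a small script. This is only a sketch: the paths, the retention count, and the function name are my assumptions, and tar stands in for $OPENHAB_RUNTIME/bin/backup so the rotation logic can be shown self-contained.

```shell
#!/bin/sh
# Sketch: create a timestamped archive and keep only the newest N.
backup_userdata() {
    src_dir=$1                          # folder to archive (e.g. /var/lib/openhab) - assumption
    dest_dir=$2                         # where archives land (e.g. /home/pi/Documents) - assumption
    keep=${3:-7}                        # number of archives to retain

    mkdir -p "$dest_dir"
    stamp=$(date +'%Y-%m-%d_%H-%M-%S')  # no ':' so the name also works on FAT/SMB targets
    archive="$dest_dir/openhab_userdata_$stamp.tar.gz"

    # Stand-in for: sudo "$OPENHAB_RUNTIME/bin/backup" "$archive"
    tar -czf "$archive" -C "$src_dir" .

    # Rotation: list newest first, delete everything past $keep
    ls -1t "$dest_dir"/openhab_userdata_*.tar.gz 2>/dev/null \
        | tail -n +$((keep + 1)) \
        | while IFS= read -r old; do rm -f -- "$old"; done

    printf '%s\n' "$archive"
}
```

A cron line or systemd timer could then call it nightly, e.g. `backup_userdata /var/lib/openhab /home/pi/Documents 7`.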

Automatic backup
5.1. One could write a script to be run by systemd or openHAB, but I do not know how to trigger it before upgrading openHAB to make sure things are backed up.
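For the systemd route, a timer plus a oneshot service is the usual pattern. This sketch uses hypothetical unit names and script path, and it only covers the daily schedule, not the trigger-before-upgrade part (that needs an apt hook, as suggested later in the thread).

```ini
# /etc/systemd/system/openhab-backup.service  (hypothetical name)
[Unit]
Description=Timestamped openHAB userdata backup

[Service]
Type=oneshot
ExecStart=/usr/local/bin/openhab-backup.sh

# /etc/systemd/system/openhab-backup.timer
[Unit]
Description=Run the openHAB backup daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Then `sudo systemctl enable --now openhab-backup.timer` would arm it; `Persistent=true` makes systemd catch up on a run missed while the Pi was off.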

5.2. Maybe this is also an idea the OH devs could implement in a new version, as a prompt to do a backup?

5.3. I guess OH users would appreciate even just a prompt to do a backup, with a ready-to-copy command, as it is not obvious in the docs which command to call for a backup; the docs are all about Amanda, which is a bit too complicated, while this simple command is buried deep.

UniFi does automatic backups; OH could too.

I am sharing my experience.
Backup solution ideas are appreciated, especially ones built into OH :smiley:

Cheers
Matej

Perhaps you can do this:
https://www.cyberciti.biz/faq/debian-ubuntu-linux-hook-a-script-command-to-apt-get-upgrade-command/

Put your command in a file under here:
/etc/apt/apt.conf.d/
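A minimal fragment for that folder could look like the following; the file name and script path are my assumptions. apt runs any DPkg::Pre-Invoke commands before handing control to dpkg, so the backup fires before any package, openHAB included, is touched.

```
// /etc/apt/apt.conf.d/05openhab-backup  (hypothetical file name)
// Run the backup script before apt invokes dpkg.
// Drop the "|| true" if a failed backup should abort the upgrade instead.
DPkg::Pre-Invoke { "/usr/local/bin/openhab-backup.sh || true"; };
```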

You can also use rsync and copy any files you need to another location.

I backup OH like this:

I use Bareos for my backups.
I have several servers I back up using Bareos. Probably overkill for what you need?
Lots of options.


Thanks, this sounds great.

And this is also amazing; I will do a combination of both for OH config backups.

Bareos is a bit too much.

For system backup I only trust image backups; I have used Synology Active Backup, Acronis, or dd over SMB so far.

On the Raspberry only dd over SMB works fine; sadly there is no ARM support for the other tools.

I am not an expert, but have you considered using the openHABian image to install openHAB on the Raspberry? As far as I understand, upgrading the OS and/or openHAB on a Raspberry may not be straightforward, and the openHABian image offers invaluable help.

The openhabian-config script that comes with that image has several options to copy the entire SD card to another one. It also includes a periodic sync option.

Before doing anything on my system, I perform an SD sync, so that if anything goes wrong I can go back to a working state by booting the synchronized SD card.

Concerning openHAB upgrades, my limited experience is that it is better to start from a fresh install and transfer configurations. The tool openhab-cli is useful for this purpose, but it only works if you keep the same version of openHAB after the fresh install. After that, you can use the openhabian-config options to upgrade the openHAB version; this may run some scripts to adapt the openHAB configuration to the new version.

The openHABian image comes with Amanda as a backup solution, which is well covered in the docs.

All databases will be backed up, but no bindings. Restoring a backup on a different version of openHAB would reinstall the wrong binding versions.

Use openHABian and all the tools it brings instead of installing the OS yourself and then wasting hours and hours building and maintaining all the secondary but critical maintenance stuff, like software updates and a proper backup strategy from A to Z (i.e. from researching and selecting a tool, to planning, implementing, and operating it, not to mention debugging the stuff that doesn't work right away).

No. It's already all there, so why would you?
You just have to use it. Here's the summary and starting point for you.

Wow lots of great comments, appreciate all of your ideas.

This is really good to know. So far the bindings I used on this system were backward compatible; I understand that bindings need to be the correct version.

I only got annoyed when only the new binding worked on OH4, and it was so different that it broke integration parameters or changed formatting, as the Luxtronik binding did. I still can't forget writing new scripts just to fix parameter names and formatting back.

I see openHABian is becoming a great OS; I use it in a test VM. I agree it is the way to go, though I may need to offload some of the other software I run to a different machine.
The weird thing is that the Pi4 I have at the remote location runs all kinds of stuff aside from OH: it has a desktop installed, rtl_433, remote desktop, Visual Studio Code, and Python scripts that scrape web sources for irrigation weather. I could possibly hack all that into openHABian, but it's probably not intended for that.

That's good practice. I run mine on a USB drive, so if I also installed an SD card with a copy of the system on it, I should have a failsafe hot backup that would boot if the USB didn't, right?

Right now I only dd the USB drive to an image copied over SMB to a NAS. But if the Pi needs restoration, I can't fix it remotely; a second USB or SD card might be great to boot from, but would it switch over by itself?

Fine, no issues there, and I am thankful for these recommendations, but the integrated backup below is the way I will go for now, to avoid changing over the whole system that is running.

Cheers
Matej

Offloading makes sense but that’s unrelated to the OS choice.
You should not be running remote desktop and Visual Studio on your home automation RPi, no matter if openHABian or not. On average it’s harmful to openHAB availability. Any PC is a better place for them.
The rest you can easily install and operate on top of openHABian; none of it conflicts with openHABian.
In the end it’s just Linux you can add anything to.

I didn’t see this fully answered here yet.

  • openhab-cli backup includes: /etc/openhab and most of /var/lib/openhab, excluding the cache and tmp folders

  • openhab-cli backup --full includes: everything above plus the tmp and cache folders

The cache is where add-ons get installed to. /var/lib/openhab/persistence is where the embedded databases (mapdb, rrd4j, sqlite) get saved to. External databases like InfluxDB have their own folders elsewhere and are not included in backups through openhab-cli.

It is rare for add-ons from an older version of OH to be compatible with a newer version, so a backup created with --full should only be restored onto the same version of OH; otherwise, run openhab-cli clean-cache after the restore but before starting OH to purge the contents of cache and tmp.

And even then, there are steps taken during an upgrade that might get missed when you restore a backup from an older OH to a newer version of OH. So caution should be taken even if --full was not used to take the backup.

All I can say here is that it sounds like your system is unhealthy overall, or has an inconsistent or jumbled package management configuration. If you use the OS-provided apt repos, updates and upgrades have undergone significant testing; you shouldn't see them breaking things so often.

This doesn’t help you specifically, but this is what the Docker image does when it detects that the container is going to start with a version of OH different from the version identified in the config files.

openHABian has lots of built in backup options.

Or use sudo openhab-cli restore <filename>

An issue was opened and work was done. However, unlike UniFi, OH has to support a lot of different operating systems and installation methods. Implementing automatic backups in openHAB itself ran into insurmountable technical difficulties as a result and has not been completed.


It feels good to have you join in, Rich (@rlkoshak); the way you explain things is amazing.

Thankfully I fixed openHAB with my intuition and a purge cleanup. Regarding OS health, I didn't mess with it other than installing the mentioned packages, but maybe I didn't install OH 4.0.1 cleanly, as I saw some permission errors in the Karaf console back then and ignored them. Then I didn't update the system for a few months, as it is at a remote location, and over winter I'd better not touch it since there is no way to get there.

I have installed all kinds of apps, like openHAB, PiVPN, rtl_433, netprobe_lite, Tailscale, CasaOS, and AnyDesk, and some of them had the issue that they update their live repo daily and cause problems.

I feel the discussion was worthwhile even though I fixed it on my own, thankfully the same day, as the system is pretty much required to run all the time. I am quite new to Linux and OH even though I have used them for 4 years; I am a young student, so I am gaining experience.

Automation and Linux are so vast that it takes time to learn to do things properly.
It seems I still need to learn how to write better integrations for my MQTT devices; they work fine but use too much CPU, as I parse strings in rules. That's for another thread…

I am thankful to learn from a community of experienced people; indeed, I found good solutions in this thread and learned how to improve my setup for better stability.

I have a question before I close the thread, which connects to the reason I opened it: how to reverse damage fast.

What free apt-based Linux distribution supports rolling back system changes with something like snapshots, on an OS installed on bare hardware? It seems it would have to run on btrfs, like Synology does.

Or is having the host run Docker or a VM the only failsafe way to do remote repair?

The only failsafe remote repair of a remote system is to not touch the system when you aren’t there. No modifications and certainly no upgrades. Even reboots can be risky.

Avoid running pre-release versions of software or, if so, only upgrade rarely when you know it’s been tested and going to work. You should be running stable releases, not versions that change daily in the repos.

When you are there and it’s time for an upgrade, backup everything and have a plan for when something goes wrong. Using Docker or VMs provide ways to make taking a backup and rolling back easier but the idea is the same in all cases. In short, always have a plan to restore from backup when things go wrong.

Usually, that means setting something up ahead of time. openHABian comes with SD card mirroring and Amanda, but they are not the only options. I know of no OS-wide rollback for Linux except taking backups. You want to be careful when using journaling file systems on SD cards, as some increase the number of writes.


I see; not touching a running system while it works is the best way. I agree.

The challenge I have to work around is that I have little time to maintain OH on-site as we usually work outside on the property.

I should have staged the upgrade on a spare Pi at home. Then bring over the changes or swap the USB drive when I arrive at the remote location.

I have a spare old mini PC that I may set up as an on-site backup system; it has OH on Linux and Mosquitto ready to spin up if needed.

It is exciting to finally see a reliable strategy.

Thanks Rich, Greg, and everyone else who participated in the discussion, where I learned great things :smiley:

I confess that I am also one of the few who uses the standard openHAB apt package installed on a regular Linux distro on an RPi. Similar to Matej, I also use the RPi for other things, hence I did not consider the openHABian option. My current setup works great, so no complaints there. There is no performance reason to use more hardware for my use case.

But similar to Matej, I had lots of issues upgrading to 4.1.0, so I eventually started with a fresh setup (it was my first upgrade, so most of the issues were probably due to my ignorance). While perhaps not mainstream, tools to assist in upgrading openHAB and the current configuration, including bindings, on a regular OS would be appreciated.

I don't understand why some do it like this.
openHABian perfectly allows installing just about everything on top, as stated in the docs.
People just wrongly assume it is not an option for them.
That being said, it is also the toolset to handle apt-based upgrades. It relieves users of testing effort and minimizes the risk and effort of updating, exactly the pain both of you have experienced now.
Avoiding wasting all that time (plus other efforts, e.g. backup) is exactly why we built it for you.

As said, you can well switch to openHABian, so there is also no need for another such toolset, let alone people willing to build one. It's reinventing the wheel for no reason and for very few users; would you spend your own spare time on any such thing?


As said, you can well switch to openHABian, so there is also no need for another such toolset, let alone people willing to build one.

It seems that I misunderstood how openHABian can be used. Will try this for my next openHAB upgrade.

It is interesting to hear that upgrade issues shouldn't happen; I assume that is because openHABian is tested better.

Now it has got my spark going; I will test installing apps, a desktop, Tailscale, and AnyDesk on openHABian on a spare Raspberry to see the results.

I assume that you, @ccrause, also wanted a desktop like me, as it gives more options for GUI apps and remote desktop tools like AnyDesk.

There is also openHABian for installation on x86 Linux, which I tried in a VM; it gives a similar management toolset in the CLI. Is it also the recommended way to scale openHABian to big systems?

My background story:

My Pi is on a 4G modem, so no VPN until I got Tailscale and unlimited 4G data a month ago; Tailscale eats too much data for a 20 GB-a-year plan.

I can get remote maintenance done over AnyDesk if anything breaks. That ability was the main reason I didn't go with openHABian: at first, 4 years ago, I didn't know how to install a desktop on any Linux, and then I just kept the system going.

Back then, in the early OH2 days, openHABian was unstable to install, so I revisited it a few months ago, and it has evolved greatly.

So the only viable solution for me at the time was the openhab package, and I didn't upgrade if no issues arose.
When the cloud connector kept going offline, or 4G data was being wasted at the remote location, I upgraded OH to improve it.

An interesting aspect: at home I also have two systems, established 4 years ago as well: OH2 and now OH4, which syncs items from OH2. They run on a beefy mini PC with Linux and the openHAB apt packages.

Some bindings I use on OH2 are unavailable or different on OH4.

A big no-no for me is when new bindings rename channels or make other breaking changes; it essentially prevents upgrading OH without fixing previously working stuff.

Cheers Matej

Installing a remote desktop is what some ex-Windows-conditioned people tend to do, because on Windows it's the only way of maintaining a remote system, but in the Linux world it goes against the Linux and openHAB philosophy; the CLI is your friend here. Anything maintenance-related is CLI-based anyway.
Particularly so on a remote system like yours with no monitor or keyboard attached and limited connection bandwidth. On a low-memory system like an RPi it can even be harmful.
openHAB is designed to run 24/7 without any administration GUI client; to access the OH user web GUI you do not need any remote desktop, only a local browser. OH is just there on the remote IP on port 8080.

Please hold on for a second and ask yourself what it is you think you need a GUI for.

Not really. Any Pi4 or 5 can handle even the largest openHAB systems I know of, and it's cheaper and easier to maintain in case of disaster, so no.
It may make sense in specific situations, like if you already run a multi-VM x86 server in your basement, but on average, and even more so in remote locations, it's not a recommendation.

Well, I see there is no point in overloading the Pi with apps; luckily the desktop and AnyDesk have been lightweight so far.

RD was the way to access the web UIs of other devices on the remote LAN through the Pi, because CGNAT prevents me from running OpenVPN. Now I will consider offloading RD to the mini PC.

The Tailscale setup I have now is all I need to maintain OH configs remotely on the Pi. OpenVPN Cloud Connexa with LAN access may be a great option I need to explore.

Now the Pi at the remote location only has lightweight jobs to do in OH and runs fine.

But I overdrive OH at home with data analysis from InfluxDB. I nuked rrd4j as it wanted to eat my SSD; an SD card would be dead already with rrd4j rewriting aggregations of so much data. InfluxDB stores 8 GB of data a year, storing many items every second.

I run intelligence for many greenhouses, smart heating, power management, and lots of recursive DB-reading tasks using averageSince and sumSince.
I have house power-overload analysis to decide whether to turn devices on or off, like SPAN does, but locally. I have so many rules for data analysis and actions.
There are around 2280 items and 76 devices, some with lots of parameters, like 30 to 60.

I am still gathering ideas on how to manage data analysis, storage, and charts more optimally at home, as a Pi would struggle even with the charts.
It's a control system for my ESP32 MQTT-connected devices and a smart home.

I want to implement automatic MQTT discovery and auto-configuration in OH, so it auto-connects and creates channels, items, and a sitemap by itself. And I have some scripts ready, so…

I think it is OK to finish the thread here. I will mark Rich's (@rlkoshak) advice to not mess with it remotely as the solution, and move on to a new project and a new exciting topic: how to auto-configure MQTT ESP32 devices in openHAB.

Thanks, I feel so welcome here.
Cheers
Matej

Just set up nginx instead. It's even in the openHABian menu.

Tailscale can do that for you, too. It's also part of openHABian.
You're not the first with requirements like these.

So again, why reinvent the wheel.

It's okay to use a PC, too, but there's nothing a Pi4 cannot handle if properly set up. In openHABian, persistence data lives in (Z)RAM, which greatly speeds things up.
You should invest in preaggregation, though. Storing every item state every second just does not make sense; use an everyChange strategy instead. Reduce the number of sensor readings, too, using reasonable intervals.
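To illustrate the preaggregation point, here is a sketch of what that could look like in a .persist file. The item names are made up, while everyChange and the Quartz-cron strategy are standard openHAB persistence syntax.

```
// influxdb.persist — hypothetical item names, for illustration only
Strategies {
    everyHour : "0 0 * * * ?"   // Quartz cron: once per hour
    default = everyChange
}
Items {
    // fast-moving sensors: store only on change, not every second
    Greenhouse_Temperature, House_Power_Total : strategy = everyChange
    // slow-moving totals: an hourly snapshot is plenty
    Energy_Meter_Total : strategy = everyHour
}
```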


I really like what I have just read in your last message, @mstormi. Thanks :smiley: