Login header broken after upgrade to 64-bit Bookworm on RPi4

Platform information:
Hardware: Raspberry Pi 4, 8GB
OS: Linux openHABianPi 5.10.103-v8+ #1529 SMP PREEMPT Tue Mar 8 12:26:46 GMT 2022 aarch64 GNU/Linux
Java Runtime Environment: 17
openHAB version: 4.0.4

FWIW, after upgrading my RPi4 to Bookworm and enabling the 64-bit version, it seems the header displayed at login via SSH is a bit broken.
The kernel info says “Linux 5.10.103-v7l+” while

$ uname -a
Linux openHABianPi 5.10.103-v8+ #1529 SMP PREEMPT Tue Mar 8 12:26:46 GMT 2022 aarch64 GNU/Linux

and

$ uptime
07:03:26 up 27 min,  1 user,  load average: 1.28, 1.17, 0.97

Furthermore, the uptime, CPU usage, CPU load and memory fields are empty. See below.

###############  openhabian  ##################################################
 ###############################################################################
 ##        Ip = 192.168.xx.xx
 ##   Release = Raspbian GNU/Linux 12 (bookworm)
 ##    Kernel = Linux 5.10.103-v7l+
 ##  Platform = Raspberry Pi 4 Model B Rev 1.4
 ##    Uptime =  day(s). ::
 ## CPU Usage = % avg over 4 cpu(s) ( core(s) x  socket(s))
 ##  CPU Load = 
 ##    Memory = Free: 0.00GB (0%), Used: 0.00GB (100%), Total: 0.00GB
 ##      Swap = Free: 0.00GB (0%), Used: 0.00GB (100%), Total: 0.00GB
 ##      Root = Free: 0.00GB (100%), Used: 0.00GB (%), Total: 0.00GB
 ##   Updates = 0 apt updates available.
 ##  Sessions = 1 session(s)
 ## Processes = 136 running processes of 32768 maximum processes
 ###############################################################################

No big deal but I figured it might be good to know.

That’s actually produced by a third-party tool that openHABian installs, called FireMotD. You might look to see if there is something reported in their issue list.

Also, some of that information is cached so it might just be that it hasn’t updated the cache yet.

Remove the file .firemotd-cache, then log out and log in again. You may need to repeat the login step.

Some stuff is stored in /usr/share/firemotd/data/FireMotD.json and updated by crontab.
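A minimal sketch of the sequence, assuming the cache file lives in the login user’s home directory (the exact location may differ on your install):

$ rm ~/.firemotd-cache                        # drop the cached header values
$ cat /usr/share/firemotd/data/FireMotD.json  # optional: see what the crontab job last wrote
$ exit                                        # then SSH back in; repeat the logout/login once if the header is still stale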

I reverted to the Bullseye 32-bit version as I found a few problems (not directly related to openHAB) with Bookworm@64bit. The main reason for upgrading to 64-bit was that I wanted to be able to run InfluxDB v2+.

Right now I’m considering “outsourcing” Influx, Grafana and a few other applications to a separate RPi and running a pure off-the-shelf openHAB installation on the original RPi. There is so much data collection and processing going on in the house that separating the functions is justified. I just need to figure out whether openHAB can use a persistence layer on another machine without running into problems.

There are always some risks, such as latency and network outages, that might creep in. But external databases are designed to be accessed over a network. In fact, even when running on the same machine, the network interface is how OH connects to InfluxDB.
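For example, reachability can be checked from the openHAB host with InfluxDB’s /ping endpoint; influx-pi below is a placeholder host name and 8086 is the default port:

$ curl -i http://influx-pi:8086/ping    # a reachable InfluxDB answers with 204 No Content
$ curl -i http://localhost:8086/ping    # same check against a local instance; OH talks over this interface either way

The InfluxDB persistence add-on then only needs to be pointed at that remote URL instead of http://localhost:8086.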

It’s a pretty standard configuration to run these on another machine.

The main problem I experienced the last time I tried to get openHAB to coexist with databases on other platforms was that I was not able to access data from databases other than openhab_db when I had InfluxDB as my default persistence in openHAB. This was also true when the other databases were on the same InfluxDB instance. Maybe this has improved with newer releases of openHAB?
Right now, I have the openhab_db database and another database (externalData) running on the same InfluxDB instance and I can’t seem to read/write data in externalData from openHAB. My patchy solution is to have Grafana read data from both databases, and when I need data from/to both databases, I have Golang routines doing the lifting for me. Not an ideal solution, but…
Ideally, I would like to access any database from rules and scripts but have not found a way to do that from within openHAB.

Without more details :person_shrugging:. There is nothing in InfluxDB nor in openHAB that would prevent other software from accessing other databases. OH is only going to access openhab_db though, so if that’s what you mean, it doesn’t matter whether it’s on another machine or not. That’s how OH works. It can only access its own database.

OH can only read/write data to its own database. OH isn’t a data analysis platform. It only allows access to its own database.

If your database has a REST API you can use the sendHttpXRequest actions. If not, you can use executeCommandLine to run the query through the database’s command-line client. You’ll then need to parse the results, because all you’ll get back in either case is a String.
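As a rough sketch against an InfluxDB 1.x instance and the externalData database mentioned above (the user, password and measurement name are placeholders):

# The v1 REST API query, i.e. the kind of URL/parameters you would wrap in sendHttpGetRequest:
$ curl -G 'http://localhost:8086/query' -u openhab:secret \
    --data-urlencode 'db=externalData' \
    --data-urlencode 'q=SELECT last("value") FROM "outdoor_temp"'

# The same query through the v1 CLI, i.e. what you would hand to executeCommandLine
# (add -username/-password if authentication is enabled):
$ influx -database 'externalData' -format csv \
    -execute 'SELECT last("value") FROM "outdoor_temp"'

Either way, what comes back is JSON or CSV text that the rule has to parse itself.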

But again, none of this has anything to do with OH running on a different machine from the database service.

openHABian doesn’t support Bookworm yet, as you can (and should have, up front) read in the release docs.

You can install the latest released openHABian 64-bit image, 1.8b, which is Bullseye based.

openHABian doesn’t support Bookworm yet, as you can (and should have, up front) read in the release docs.

I certainly did read your excellent GitHub thread on testing, among other things, Bookworm before doing the upgrade. My intention was to have a go at Bookworm@64bit and see if it worked and, if not, what did not work. I figured I’d give it a try since “not supported” does not necessarily imply “not working” :wink:
Very nice tip on the 1.8b image, thank you!
I basically wanted to avoid running openHAB in 64-bit mode due to the memory consumption issues you mention on GitHub. So my plan was to run the OS in 64-bit mode while running the 32-bit version of openHAB. My only reason for running a 64-bit OS was to make it possible to run InfluxDB 2.7+, which requires a 64-bit OS.
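For reference, a quick way to check which combination is actually in place, assuming a Debian/Raspberry Pi OS base:

$ uname -m                   # aarch64 = 64-bit kernel, armv7l = 32-bit kernel
$ dpkg --print-architecture  # arm64 = 64-bit userland, armhf = 32-bit userland
$ java -version              # shows whether the default JVM is a 64-Bit Server VM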
Right now, I’ll follow your progress on testing Bookworm and stay on Bullseye for the time being.
Once again, thank you for your dedicated work!

Yes, that is certainly true. Maybe my use case deviates a bit from the mainstream of openHAB users. I have a number of sensors in the house that collect data via different channels from different devices, e.g. the heat pump, floor heating, energy optimizer, etc. My thought was not to mix that data with openHAB’s data in openhab_db, for example to avoid possible future problems when upgrading openHAB.

I would like to be able to use data in any database as an Item in openHAB, but unfortunately I’m a Golang person, not fluent in Java, so there will be no useful bindings from this keyboard :slight_smile:

As you suggest, using REST API calls to InfluxDB, I can at least make this data available to rules/scripts in openHAB.

FWIW, openHABian v1.9 is now released and is Bookworm based.

Thank you, Markus. Will give it a test ride asap!
Best /Brus-Per