openHABian shared or dedicated

Hello everyone,

I am completely new to OpenHAB but would love to give it a try.

I am currently running two RPi 4s with 8 GB each. I purchased them last summer with 8 GB as I was using them as media and VPN servers.

I saw that even for the newest openHABian version the 32-bit image is recommended.
Is this really still an issue, or is that info outdated? Would I run into problems with 64-bit openHABian, or would it be a bigger problem running 32-bit with 8 GB?

Is there a fix/workaround so I can spare my SD card the heavy write load despite my 8 GB of RAM on a 32-bit system?

Thank you all for your help.

I would like to expand my question. From what I found, openHABian does not come with a desktop environment. Okay, one could probably be installed via SSH and accessed via VNC.
But would it generally make more sense to start from a “normal” Raspberry Pi OS and then install openHAB via the terminal? Is openHABian just meant to be a simpler way of getting the server up and running, or does it have real benefits over classic Raspberry Pi OS?

The reason I would like to get it right is that my Raspberry Pi is supposed to not just serve as the brains of my home automation (which is less demanding resource-wise) but also handle all my IT routing etc.
Meaning, it serves as media server, NAS, and VPN server (granting access to all my NAS drives while travelling, plus internet routing to watch the news etc. without geolocation restrictions). Some of these features show quite a heavy resource load, especially when e.g. 4K movies are involved.

Given my needs, I find it far more flexible to work with a desktop environment in place.

Your thoughts on this would be very much appreciated.

Thank you all :slight_smile:

openHAB is recommended to be run on a dedicated system in order to minimize HW & SW dependencies.
Most people don’t want to endanger stability of their home just to save a couple of bucks.
If you do then feel free to go ahead, but then openHABian is not the right tool for you.


You can install openHABian on an existing Linux system; read this page about halfway down: openHABian | openHAB — look for the line “Other Linux Systems (add openHABian just like any other software)”.
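For reference, the manual-install route described on that page looks roughly like the following on a Debian-based system (a sketch only; the repository branch and paths may have changed, so check the current docs before running it):

```shell
# become root and make sure git is available
sudo -i
apt-get update && apt-get install -y git

# clone the openHABian scripts and link the menu tool into the PATH
git clone https://github.com/openhab/openhabian.git /opt/openhabian
ln -s /opt/openhabian/openhabian-setup.sh /usr/local/bin/openhabian-config

# launch the interactive configuration menu to install openHAB and related tools
openhabian-config
```

The same `openhabian-config` menu is what a flashed openHABian image runs for you automatically.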

Although for a working system I would dedicate a Pi to openHAB, as mentioned above.

ZRAM is installed as part of openHABian, which will protect your SD card.
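If you want to verify whether ZRAM is actually active on a given system, a couple of standard commands are enough (a sketch; output will differ per setup, and `/etc/ztab` only exists if a zram-config-style tool is installed):

```shell
# list zram block devices with their compression algorithm and sizes
zramctl

# show active swap areas; zram swap shows up as /dev/zram0 or similar
swapon --show

# inspect the zram table config, if present on this system
cat /etc/ztab 2>/dev/null
```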

From what I understand, ZRAM will not work on a 32-bit openHABian OS, meaning I would have to live with unnecessarily high SD card wear. Or am I mistaken?

Why exactly the need for a dedicated server?
Does openHAB consume so many system resources that it will cause issues when combined with other tasks running in the background?
I have been using two RPi 4s with 8 GB since summer: one as a VPN server and brain at home, while I use the other when travelling abroad as a media center. So the second one basically acts as a Fire TV replacement with many more features (checking my e-mails, using Office etc.).

I have had no performance issues on either end so far, and I have never experienced an SD card problem either.
Is openHAB not “just” a server running as a service with multiple local files? So basically a binary plus a bunch of files running as a network service,
just like tvheadend, a VPN server etc.

P.S.: Just so nobody takes offense: I am truly asking because I would like to understand the approach. I understand if OpenHABian is supposed to help beginners set it all up. But if running a working Raspberry Pi, there must be some serious downsides if you need to buy and maintain a second system just to use OpenHAB reliably.
Waste, after all, is not just a question of money. I could not care less about the extra 50 bucks. But I do care about wastefulness regarding manufacturing, electronic waste, power waste, waste of time maintaining two systems etc.

ZRAM works with 32-bit on the Pi 3; not sure about the Pi 4 8 GB as I don’t have one. openHAB will run fine alongside other things; I run a test setup on a Pi 4 4 GB with the full install of Raspberry Pi OS. It also runs Home Assistant and some other stuff. But I would run a real system on its own device.

It’s not a need. Read properly:

There is no single “that’s why” but there have been numerous “clever” guys before you that ignored this and ended up with some sort of dependency they never expected they would hit in the beginning.
But it’s just generalized advice you don’t have to take.
Just please don’t come back here asking for help should you run into issues.

Hmmm. But dependencies should be installed automatically if the installation process is correct, should they not?

openHABian, I would assume, is built on an existing distro. So in your GitHub repo you manually added the required dependencies.
If these same dependencies were installed during the installation of openHAB (maybe by an automated script), would that not result in exactly the same setup as openHABian? Just with the desired distro (e.g. Raspberry Pi OS full) already running in the background?

P.S.: “Need” means need, not “requirement”. You basically are saying that OpenHAB will sooner or later run into dependency problems when not using the dedicated distro.
This to me equals a need.

I love these discussions, they’re useless.
I did not write “package dependencies”, did I? It’s dependencies in general.
Peripherals, electrical, physical even, drivers, firmware, kernel parameters, yes also package dependencies, libraries, networking, npm and py modules, yadda yadda etc et al, you name it.


Well, I thought this was the right place to ask questions.
I accept opinions and I accept experience. But I prefer real information and proper discussion.

So if OpenHAB is prone to problems because of dependencies, then that to me sounds like an issue that should be handled in the OpenHAB package.
So during installation the necessary packages and libraries should be installed. Whenever a new dependency is required, it should be installed automatically.

As a comparison, let’s look at Kodi vs. LibreElec. Both work equally well. Kodi installs the core components. Whenever an add-on requires dependencies to be installed, Kodi does so automatically.
The same principle should be applicable here.

So to me, right now, it sounds like there is a problem in OpenHAB that you fixed locally in OpenHABian.

If I am mistaken, I will happily be convinced by details. But right now I do not understand why a customized OpenHABian based on Raspberry OS would show a different behaviour to a properly installed OpenHAB on Raspberry OS.
I would understand if you had concerns when using Mint, Ubuntu etc. But using the same base image as OpenHABian?

It is just suggested to run openHAB on dedicated hardware, since usually one wants a reliable smart home system.
Nothing more. What if you install openHAB on a RasPi that is already fully loaded with Pi-hole, VPN, Kodi…? Would you like openHAB rules to get delayed because the RasPi doesn’t have enough resources? Would you like the RasPi to thermally throttle so that e.g. the openHAB UI becomes unresponsive, or it takes several seconds until Alexa turns on a light?

Another example: if you install openHAB on a RasPi that already runs Z-Way… best of luck getting Z-Wave running, because the Z-Wave controller is already in use.

What if you install openHAB on a system that already uses port 8080?

And so on. openHAB cannot assure that it runs on any combination of hardware/software. In order to make it easier for users: openHABian.

There is some merit to some of the points. Sluggish behavior should not occur, since openHAB should not be resource hungry. So turning a light on or off etc. should still be easily possible even under maximum VPN load.
After all, most single commands are “basic”.

Port issues could and might occur. 8080 is a very common port. Is this not adjustable in openHAB?
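It is adjustable. On a Debian-style package install, the HTTP(S) ports can be overridden via environment variables and a service restart (a sketch; on openHAB 2.x the file and service were named `openhab2`, and the port values here are arbitrary examples):

```shell
# append port overrides to the service environment file
# (edit by hand instead if the variables already exist there)
echo 'OPENHAB_HTTP_PORT=8090'  | sudo tee -a /etc/default/openhab
echo 'OPENHAB_HTTPS_PORT=8444' | sudo tee -a /etc/default/openhab

# restart the service so the new ports take effect
sudo systemctl restart openhab
```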

Of course the Pi should not be running a different home automation environment. That would of course not be smart.

So from your experience, is it feasible to install openHAB and then add a desktop on top of it? Or will this by itself already cause problems?
A GUI and access via a screen would be much preferred for numerous other things.

Another issue I am having is that my Raspberry Pi is behind an IPv6-only DS-Lite ISP, while my phone provider is IPv4-only.

So VPN via WireGuard does not work; I have been using ZeroTier, so I would need to install ZeroTier etc.
All of this is more convenient when you have a direct connection with a desktop, because manual changes to network rules and routing tables quickly result in loss of SSH if you make a typo.
At least that was my experience the last time I worked with headless servers on a NAS.
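For what it’s worth, installing ZeroTier headlessly is short enough that a desktop isn’t strictly required for that part (a sketch using ZeroTier’s official install script; the network ID below is a placeholder for your own):

```shell
# install the zerotier-one service via the official install script
curl -s https://install.zerotier.com | sudo bash

# join your ZeroTier network (replace the placeholder ID with yours)
sudo zerotier-cli join 0123456789abcdef

# confirm the node is online and authorized
sudo zerotier-cli status
sudo zerotier-cli listnetworks
```

The node still has to be authorized in the ZeroTier web console before traffic flows.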

It wasn’t, so I moved your post / subthread.

If you want to run dedicated openHAB, install openHABian and get on with it. You have a Pi 4 8 GB like me, so run the 64-bit version that’s unsupported, and when you find a bug, submit a pull request to fix it.

If you want to run RPi OS with blah blah blah, go ahead; no one is stopping you.

If you want to run Docker and want an easy way to install it all, use something like

You don’t sound like a beginner, and good for you. openHABian is what I use because it just works. Not all the time, but that’s because I didn’t RTFM.

Currently it does not work on the 32-bit kernel. I try about once a week.

That is a very good example; it is like saying openHAB vs. openHABian. One is an application, the other is a minimal OS with some scripts to help you get started.

The issue when running a 64-bit OS on a shared system is DRM for media.
Kodi will not function properly on a 64-bit OS because the Widevine DRM library is missing.

The Raspberry Pi has many small issues that need to be circumvented by different means.
That’s why I am less keen on a dedicated OS; it may limit what I can achieve.
E.g. right now, one new annoying bug I found is that the RPi cannot handle a BT keyboard and mouse while at the same time using Wi-Fi.
The BT keyboard will randomly repeat keystrokes. This is a known issue and no one cares; everyone just starts using USB dongles and the like.
But using certain external hardware means being able to adjust settings and install a bunch of stuff.
The more restrictive the OS and the fewer packages shipped, the more hassle it gets.

Is it possible to easily transfer ALL of my openHAB data from one system to another? So if I were to run into an issue on a shared system that I cannot solve, can I easily switch to openHABian without having to redo all the work?

From what I have read so far, this does not seem to be true for the upgrade from 2.0 to 3.0, which worries me a bit. After all, smart automation is a lot of work and has maaaaany functions that I would not like to have to redo every time there is an update or I want to change the base OS.

Please stay on topic or open a new thread.

How to ask a good question / Help Us Help You - Tutorials & Examples - openHAB Community

I would say that it is vital to the decision making, and hence this topic, to know whether transferring from dedicated to shared and vice versa is possible and easy.

If it is not, this will influence the decision making significantly.

It is possible but the question remains whether it is something that is smart to do.

there are restore and backup scripts in /runtime/bin
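On a package-based install those scripts are also wrapped by the `openhab-cli` tool, so a migration from a shared box to a dedicated one is roughly the following (a sketch; the backup path is an example, and both systems should run the same openHAB version):

```shell
# on the old system: create a zip containing config and userdata
sudo openhab-cli backup /tmp/openhab-backup.zip

# copy the zip to the new system, then there:
sudo systemctl stop openhab
sudo openhab-cli restore /tmp/openhab-backup.zip
sudo systemctl start openhab
```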

Hmmm, why? Because of dependency issues?

With a backup I would expect it to back up all relevant files and information.
Meaning, if I installed an add-on and restored a backup, the add-on would automatically be installed on the new system as well. If this is not the case, then I would say that it is not possible.

If it is possible, then going from shared to dedicated should solve all problems encountered on the shared OS. At least according to the discussions here.

The backup and restore scripts included with openHAB are intended to be as platform agnostic as openHAB is. Restoring a backup should also install the add-ons and the settings for those add-ons. Of course, you’ll run into issues that you may be able to correct yourself depending on what hardware you’re swapping to and what bindings you have installed, such as changing from a Linux port “/dev/tty*” to a Windows port “COM*”.