OH on Synology

Hi all,

I’ve recently got myself a nice shiny new Synology DS918+ which I use for backups etc. It’s only now that I’ve installed InfluxDB persistence on my OH install on my Pi that I’ve begun to wonder if it would actually make more sense to take the opportunity to move everything across, install OH on the Synology and run it from there.

It makes sense to me as it’s a nice beefy machine, lots of space, disk redundancy etc - all the yummy things that a decent NAS provides.

It also means I can have my persistence DBs in the same place.

However, when I’ve trawled the community it seems there are almost as many people saying ‘don’t do it’ as ‘do it’, so I’m in two minds.

I have no problem spending the time moving my rules over as I’m in a good place with the system: the basics are all in and I’m about to embark on the expansion, so perhaps it’s a good time to restore, tidy and refresh everything.

I understand the arguments saying that OH should be on its own standalone device, so it’s not affected or brought down by other things using the same hardware. But for me my NAS is always on, not massively used or over-provisioned, and far more powerful than the Pi will ever be…not to mention the ease of backups / snapshots.

What are the other reasons to / not to move to a Synology for OH?

If there are none, is it ok to run 2 instances of OH on my network while I move things across?

I have no experience working with Synology but based on what I’ve seen over the years on this forum reasons not to run OH there include:

  • updates seem to be pretty behind
  • it seems to be really challenging to get set up and configured correctly, especially if you have to use a USB device with it like a Zwave controller
  • there are not as many users running on Synology so the support system here on the forum is pretty shallow.

But note that you could move your persistence servers to the Synology and keep OH on its own machine if that makes sense, which would kind of split the difference.

And some users have been successful getting OH to work well for them on Synology.

It is OH to run more than one instance of OH on your network, but only one can have access to physical devices like USB dongles at a time.

OH! :smiley:
You mean OK to run? …:wink:

You can also run the Docker container instead of the Synology package on that NAS which would resolve that. :wink:
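For anyone who hasn’t seen it, a minimal sketch of what running the container looks like. The volume paths under /volume1/docker and the Z-Wave device node are assumptions for illustration, not a definitive setup — adjust for your NAS:

```shell
# Hypothetical sketch: run the openHAB image on a Synology, with
# conf/userdata bind-mounted on the NAS so they survive container
# rebuilds, and the (assumed) Z-Wave stick passed through.
docker run -d --name openhab \
  --net=host \
  --device=/dev/ttyACM0 \
  -v /volume1/docker/openhab/conf:/openhab/conf \
  -v /volume1/docker/openhab/userdata:/openhab/userdata \
  -v /volume1/docker/openhab/addons:/openhab/addons \
  -e USER_ID=1000 -e GROUP_ID=1000 \
  --restart=always \
  openhab/openhab:2.4.0
```

With `--restart=always` the container comes back up on its own after a DSM reboot.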

I also have a Synology DS918+ but prefer to run openHAB on a separate NUC so it’s still running when DSM is updated or the NAS is down for some other reason. Before that I had a DS415+ which, like everyone else’s, failed on me due to a hardware issue, but luckily my openHAB instance was still up.


I run on Synology using docker for OH, influx and grafana. Works well.


I did this switch from the RPi to a Synology (in my case a 916+) about a year ago and have never regretted it.

As has been mentioned already, I would also vote for running OH and all the other things like InfluxDB in Docker containers on the Synology. I never tried the native package.

Benefits for me:

  • easy monitoring due to docker setup
  • easy update/rollback/separate test install due to docker setup
  • great backup possibilities
  • powerful hardware (which runs anyway 24/7 for me)
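The update/rollback point in particular falls out of the bind-mounted setup: the container is disposable, the data isn’t. A hedged sketch of the flow, assuming the container was started with volumes on the NAS as described above:

```shell
# Hypothetical update flow -- conf/userdata live outside the
# container, so the container itself can be thrown away.
docker pull openhab/openhab:2.4.0     # fetch the new image
docker stop openhab && docker rm openhab
# ...then recreate with the same 'docker run' options as before,
# or, if everything is scripted in a compose file:
#   docker-compose pull && docker-compose up -d
# Rollback is the same dance with the previous image tag.
```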

I think in the end it’s still up to your personal preference which setup works best for you


Thanks all. So I suppose that since I can have multiple instances of OH on my network, I could spin up OH in a Docker container on my Synology and give it a try: move some rules over etc. and see how it goes / experience the pitfalls before I move over totally.

So, I guess all I have to do is make sure that when I transfer a rule across to the Synology I comment out / remove the same rule from the Pi?

I get the problem with DSM updates. I guess you could get round that by making sure OH has a mapdb-type persistence set up for restoreOnStartup, and by making sure any DSM updates happen either in the middle of the night or manually, so I get to choose when to update and can make sure it doesn’t interfere with OH rules etc.?
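For reference, a sketch of what that restoreOnStartup setup might look like in a `services/mapdb.persist` file. The mapdb add-on only ever stores the most recent state of each item, which is exactly what a restore after a reboot needs (the strategy names below are the standard openHAB ones; adjust to taste):

```
// Sketch of services/mapdb.persist: save every item on change and
// restore the last saved state when openHAB starts back up,
// e.g. after a DSM update reboots the NAS.
Strategies {
    default = everyChange
}

Items {
    * : strategy = everyChange, restoreOnStartup
}
```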

…and there is a part of me that would prefer to keep my current learning focused on OH and related tech rather than have to learn something new like Docker…

Oh, the options…! Anyone else have their real-life experience tuppence to add?

I’ve run it on my Syno without Docker for almost 3 years now.

The update issue is resolvable by manually editing the spk file, or by blocking access via its hosts file, then downloading the install zip manually and allowing it to install that zip. This has worked perfectly well for many upgrades. Getting Z-Wave working isn’t particularly hard either once you’ve set it up once.

Edit the spk with a text editor. Find this portion:

echo "Set instance variables..." >>$LOG

Just mangle the download_path url (put an extra m in com or whatever)

Then, download whatever version of openhab you want and rename it to openhab-2.4.0.zip and put it in your public share on the syno then install.
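Recapping those steps as a sketch. The download URL is deliberately left as a placeholder (fetch whichever openHAB release zip you actually want), and `/volume1/public` is the usual Syno public share path — an assumption, since yours may differ:

```shell
# Hypothetical recap of the manual-install workaround above:
# fetch the distribution zip yourself...
wget -O openhab-2.4.0.zip "<openhab-release-zip-url>"
# ...give it the exact filename the (mangled) spk installer expects,
# and drop it into the public share:
cp openhab-2.4.0.zip /volume1/public/
# Now install the edited .spk via Package Center's "Manual Install";
# with the download_path mangled, it falls back to the local zip.
```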

Then run InfluxDB/Grafana in Docker. That’s easy, as there are plenty of InfluxDB+Grafana Docker images and you basically don’t have to learn anything to get them going.

What’s the difference / benefit / advantage between docker and just straight on the synology in real life?

I had a lot of trouble keeping it running using docker. Even pretty plain vanilla installs would lock up after 6 hours for no good reason so I gave up on using it in a docker.

Passing USB through (for Z-Wave) is a good bit harder as well. My personal experience is that running it straight on the Syno is preferable.

Cool, so did you just create an OH shared folder and then give it a file size limit and leave the rest for the syno to sort out and install it into there?

It gives you a couple of options on where to install. The most sensible option is to put it in public/.

You’ll see when you do the install.

I do all my DSM updates manually. And I think this is not a Syno issue at all; it’s the same for all setups, even on the Pi. If there is an OS update that requires a system restart, you will of course have a short downtime and a restart of the OH instance running on that OS.

So why did you choose Docker over installing straight on the Syno? Did you find any differences or reasons why you chose Docker (in terms of functionality in OH rather than the benefits of Docker as you state above)?

Same for me. You will make services (OH, NAS) depend on each other when you co-install them.
You can never be sure that e.g. a NAS update won’t harm your OH install. Or that you can flawlessly run OH while your NAS needs to recover its RAID volume. I’ve seen that take more than 24 hours.
If you want to run stuff beyond OH (say InfluxDB, Grafana) you will need at least Docker images of these, too.
But my #1 reason to go for a separate box is: you can easily spend another $35 for a cold standby RPi to have it available should your server fail.

As mentioned above I never tried the native package. I picked docker because

  • I try to avoid 3rd-party packages on my Syno altogether. I had some trouble with them back in the days of DSM 4 that caused instabilities
  • I read some negative feedback about the native package here. This might not be valid any more, or maybe never was, but it was my feeling at the time that Docker was the better option
  • I really like the idea of docker and wanted to learn how to use it. Of course there is a learning curve to get familiar with the concept. But once I had everything scripted with compose files deployment became just fun.

Just my personal experience :wink:

There are some other Docker benefits of course:

  • There is a very clear separation between image and data. That makes it easy to decide what needs to be backed up or moved when migrating applications (openHAB) to another system.
  • Docker works the same on any OS/distro. So if you later on decide to run openHAB on another machine with another OS or NAS brand it will work the same and migration will be very easy.
  • There are more Docker images than there are Synology packages available. So in the end you’ll probably end up with Docker anyway. I wouldn’t be surprised if the Synology packages are deprecated in favour of Docker one day.
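That image/data separation means a backup is little more than archiving the bind-mounted directories while the container is stopped. A toy, self-contained sketch (the paths are made up for the demo — on a real Syno they would be something like /volume1/docker/openhab):

```shell
# Illustrative backup of a bind-mounted openHAB data directory.
# A throwaway dir stands in for the real NAS path in this demo.
OH_DATA=./openhab-demo-data
mkdir -p "$OH_DATA/conf" "$OH_DATA/userdata"
echo "sitemap demo" > "$OH_DATA/conf/demo.sitemap"

# In real life, stop the container first (docker stop openhab) so
# userdata is quiescent, then archive everything worth keeping:
tar -czf "openhab-backup-$(date +%F).tar.gz" -C "$OH_DATA" conf userdata

ls openhab-backup-*.tar.gz
```

Restoring on any other machine (another NAS brand, a NUC, whatever) is just unpacking the archive and pointing a fresh container at it.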

That is exactly the path Home Assistant has taken, to the extreme. They have their own OS that is basically just a hypervisor for their Docker images to run on the Pi. Their other supported installation mode fetches and installs Docker images. They have deprecated running directly on the OS.

I am liking the concept of OH on syno/docker, it seems to make sense for me.

So, probably a silly question…on a Syno in Docker, how do you access the OH landscape? Currently I’m running openHABian and accessing OH via a web browser on :8080, SSH (PuTTY) and Visual Studio Code.

I guess the same would apply to it in a Docker container as it does on the Pi?
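Pretty much, yes — a sketch of what access looks like, assuming the container runs with host networking and bind-mounted volumes as in the earlier examples (the console path is assumed from the official image layout):

```shell
# With --net=host the UIs are reachable exactly as on the Pi:
#   http://<nas-ip>:8080
# The Karaf console can be opened from inside the container:
docker exec -it openhab /openhab/runtime/bin/client
# And because conf/ is a bind mount on the NAS, you can expose it as
# an SMB share and point VS Code at it, much like the openHABian
# samba share you use today.
```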

Also, I’ve got syno quick connect set up for remote access, could my OH set up in a docker be accessed remotely, (in both a good and bad way - good I can tinker / bad it’s less secure)?

And what about OH cloud services, I use them a fair bit for IFTTT webhooks…

So many questions now!

Lots to learn about docker me thinks…

Hmmm. I thought IFTTT and adding new items to myopenhab were turned off. That could present a challenge to moving OH. @digitaldan would know for sure though.