Synology DiskStation

Hey Serge,

Wanted to remind you that my steps will NOT work unless your file structure is the same as OH 2.3 with the public folder install option (which I think is option 3).

Keep in mind: my steps are NOT to “upgrade” what is there already (OH 2.3). It’s a brand-new clean install of OH 2.4 without any of your file and folder structures being upgraded.

The reason we renamed the OH 2.5 M1 zip file to the OH 2.4 zip name is to use the OH 2.5 M1 bindings instead of the OH 2.4 bindings at the very end of the steps. There is NO full OH upgrade to 2.5 M1 on Synology yet.
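The rename trick above can be sketched like this (sandboxed here with a stand-in file; on the NAS the zip sits in the shared folder the installer reads from). The installer looks for a file named openhab-2.4.0.zip, so a milestone zip renamed to that gets unpacked instead:

```shell
# Sandbox stand-in for the shared folder on the DiskStation.
WORK=$(mktemp -d)
# Stand-in for the real milestone download:
: > "$WORK/openhab-2.5.0.M1.zip"
# Rename it to the filename the 2.4 installer expects:
mv "$WORK/openhab-2.5.0.M1.zip" "$WORK/openhab-2.4.0.zip"
ls "$WORK"
```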

I’m recommending you follow the directions step by step (making backup copies of your OH stuff to another machine before starting - very important).

Best, Jay

Hi Jay,

I finally got it working.

In my case (no ‘public’ folder) the folder to use in step 7 was: volume1/@appstore/openHAB.

There is probably an easier way to do it but since it is a root folder I used ssh to copy the zip and spk files to that directory.

With the “user home service” enabled, the configuration folders are also mirrored to the homes/openhab folder, so the kar file can be copied to either ‘addons’ folder.

Thanks again for your help with this.

Serge.


Hi

Did any of you try updating to 2.5M2 ?

Kennet

Nope. Not yet. Actually starting to wonder about migrating from Syno to a RPi, but the backup scripts don’t work due to file location differences…

not yet - too many red flags for me. that and m1 works so well for me. it runs until i shut it down to make a change, never because of crashes…

I wanted to try installing 2.5 M2, but I cannot figure out how openHAB decides which bundles to download.
First I thought: why not just modify the URL that is used to download openHAB in the .spk file? I assumed openHAB 2.5 M2 would then somehow know to download only 2.5 M2 bundles, so if I just downloaded M2 via the spk it would run 2.5 M2. But reading the post about how to install 2.5 M1, it clearly looks like this would not work.

I tried to figure out how this works from the Developer Appendix documentation, but there are still some things I cannot figure out.
I understand that each bundle has a pom file that describes its requirements (i.e. bindings, Java modules etc.) and that Karaf is used to install bundles/modules at runtime based on the pom (I guess). And I guess Karaf is somehow given a URL to the Artifactory instance hosted by JFrog, but how does that tie into 2.5 M2 only downloading 2.5 M2 bindings?
When running 2.5 M2 on Windows this works without changing anything, so why is it different when it runs on a Synology DiskStation?
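From what I can tell, the distro zip ships feature descriptors pinned to its own version (which is why an M2 zip asks for M2 bundles), and Karaf’s pax-url-mvn layer resolves those mvn: URLs against the repositories listed in a config file — in openHAB 2.x typically userdata/etc/org.ops4j.pax.url.mvn.cfg (the exact path is an assumption here). A stand-in file shows what the relevant property looks like; the milestone repo URL matches the one that appears in the stack traces later in this thread:

```shell
# Stand-in for userdata/etc/org.ops4j.pax.url.mvn.cfg (path is an assumption):
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
org.ops4j.pax.url.mvn.repositories = \
    https://openhab.jfrog.io/openhab/online-repo-milestone/2.5/@id=openhab
EOF
# This is the repository list Karaf resolves mvn: URLs against:
grep -c 'online-repo-milestone' "$CFG"
```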

I realize this might be a bit complicated to explain but maybe you guys can give me some pointers.
Thanks.

2.5 for syno? any news or solution?

I have had 2.5 M4 working for a week now and it has been running without major issues.
What I did was:

  1. Downloaded the 2.4 spk (Link)
  2. Extract the installer.sh file from the spk with 7-Zip
  3. Modify it to point to the milestone version.
    3.a. Edit: DOWNLOAD_PATH="https://openhab.jfrog.io/openhab/libs-milestone-local/org/openhab/distro/openhab/2.5.0.M4"
    3.b. Edit: DOWNLOAD_FILE1="openhab-2.5.0.M4.zip"
  4. Then I put the installer.sh back into the spk with 7-Zip.
  5. Install the spk. ( I used the public folder and Zwave / tmpfs options )
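Steps 2–4 can also be done with command-line tools instead of 7-Zip (an .spk is a tar archive, so tar plus sed work too). The installer.sh contents below are a stand-in; only the two variable names and the new values come from the post:

```shell
WORK=$(mktemp -d)
# Stand-in installer.sh with the two lines the post says to change:
cat > "$WORK/installer.sh" <<'EOF'
DOWNLOAD_PATH="https://example.org/old/path"
DOWNLOAD_FILE1="openhab-2.4.0.zip"
EOF
# Point both variables at the 2.5.0.M4 milestone on the JFrog repo:
sed -i \
  -e 's|^DOWNLOAD_PATH=.*|DOWNLOAD_PATH="https://openhab.jfrog.io/openhab/libs-milestone-local/org/openhab/distro/openhab/2.5.0.M4"|' \
  -e 's|^DOWNLOAD_FILE1=.*|DOWNLOAD_FILE1="openhab-2.5.0.M4.zip"|' \
  "$WORK/installer.sh"
grep '^DOWNLOAD' "$WORK/installer.sh"
```

On the real spk you would first extract installer.sh from the archive, run the sed, and put it back before installing.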

After I did the install I did some permission changes to get the serial ports working but I am a bit unsure how much of that was actually required.

The only issue I have is with RFLink. When the Synology is restarted it will not go back online, but if I unplug it and wait a bit before reconnecting then it works again. I have not investigated this so much so it might just be my system.

an easier way than all that :stuck_out_tongue: is to edit the /etc/hosts file on your syno like:
127.0.0.1 bintray.com

then rename whatever openHAB release you want to openhab-2.4.0.zip in your public folder and that’s it
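The hosts trick above, sandboxed with a stand-in file (on the NAS you would edit /etc/hosts itself, as root). Blackholing bintray.com makes the installer’s download fail, so it falls back to the renamed zip in the public folder:

```shell
# Stand-in for /etc/hosts on the DiskStation:
HOSTS=$(mktemp)
# Add the blackhole entry only if it is not already there:
grep -q 'bintray\.com' "$HOSTS" || echo '127.0.0.1 bintray.com' >> "$HOSTS"
cat "$HOSTS"
```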

@Kennet Thank you very much. I followed your instructions and installed openhab-2.5.0.M6 on my DS715, which is running DSM 6.2.1-23824 Update 1.
It is working fine with no issues observed.
Why did I need to update from 2.4 to 2.5 M6? Because one of my Z-Wave Fibaro FGR-222 units is defective, and the Z-Wave Fibaro FGR-223 I bought to replace it is not in the openHAB 2.4 database, so I needed a 2.5 Mx update. I chose the latest one, 2.5 M6, which was just a few hours old.

@waspie can you share more details on how to do it?

edit the hosts file?
ssh to your synology and vi /etc/hosts
or if you have nano installed from the syno community use nano

thank you for the quick reply. can you share more details on how to install the whole openHAB 2.5 M6 on Synology? thanks

Already running 2.4.002 on Syno for quite some time. In order to upgrade to 2.5 or 2.5 Mx, can it be done by

  1. backup (of course)
  2. adapt hosts file
  3. running "sudo /volume1/@appstore/openHAB/runtime/bin/update 2.5.0"

???

Thx

Backup all of your data (/volume#/public/openHAB ← the whole folder)
When you install the new version it uninstalls everything and removes the openhab user from the system. It is as if openhab was never installed - so backup EVERYTHING.
download the M6 or whatever build from openHAB, rename it to openhab-2.4.0.zip, and upload it to the public folder on your syno. nano or vi the /etc/hosts file on your syno and add
127.0.0.1 bintray.com
then get yourself the openHAB spk and install as normal. it’ll use the zip since it can’t connect to bintray.
then restore your conf files and very selectively restore stuff out of the userdata folder. I wouldn’t copy the userdata folder from the backup straight over; I’ve run into problems with that. I grab the persistence folder, the openhabcloud stuff (and uuid file), the habpanel.config file from userdata/config/org/openhab (i think that’s the path), and the jsondb folder. then let 'er rip.
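That selective restore could look roughly like this (sandboxed with a fake backup layout; the paths mirror the ones named above and may differ on your install). Only the listed pieces are copied, never the whole userdata folder:

```shell
BACKUP=$(mktemp -d); NEW=$(mktemp -d)
# Fake a backup layout with the pieces worth keeping:
mkdir -p "$BACKUP/conf" "$BACKUP/userdata/persistence" "$BACKUP/userdata/jsondb" \
         "$BACKUP/userdata/config/org/openhab"
: > "$BACKUP/userdata/uuid"
# Selective restore into the fresh install:
mkdir -p "$NEW/userdata/config/org"
cp -a "$BACKUP/conf" "$NEW/conf"
cp -a "$BACKUP/userdata/persistence" "$NEW/userdata/persistence"
cp -a "$BACKUP/userdata/jsondb" "$NEW/userdata/jsondb"
cp -a "$BACKUP/userdata/uuid" "$NEW/userdata/uuid"
# covers the openhabcloud and habpanel config files mentioned above:
cp -a "$BACKUP/userdata/config/org/openhab" "$NEW/userdata/config/org/openhab"
ls "$NEW/userdata"
```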

I can’t answer #3, I have no idea.

First, thank you to all of you for the useful information shared in this topic.
I’ve learned a lot by reading through this thread and attempting different methods of installation.

I had OH2.5 M1 running on my DS718+ for a while but I can’t manage to upgrade/install OH2.5 M5 or M6 successfully.

Platform information:

  • Hardware: Synology DS718+/ INTEL Celeron J3455/2048MB
  • OS: DSM 6.2.2-24922 Update 3
  • Java Runtime Environment: Java8 upgraded to version 1.8.0_231
  • openHAB version: 2.5 M5 or M6 (the outcome is the same)

Issues:

  • Package installation failed.
  • Package installation completed but major runtime errors.

1. Package installation failed:

When trying to install either the original “openHAB-2.4.0.002-syno-noarch-0.001.spk” release or the version with the installer script modified to use the M5 or M6 zip file, the installation fails with the following DSM message:

And here is the installation log:

#### S T A R T  -  o p e n H A B  S P K ####
2019-12-05:13:31:49

Set instance variables...
  public:    false
  smarthome: false
  home:      true
  TMPFS:     false
  Z-Wave:    false
  port:    8080
  port:    8443
  tmp:    /volume1/@tmp
  share:  /var/services/homes
  oh:     /var/services/homes/openhab
  backup: -backup-201912
done
User that exec the Installation
root
Start preinst...
  Found java executable in PATH
  Java version 1.8.0_231
  Version is more than 1.8
  The User Home service is enabled. UH_SERVICE=yes
  The shared folder '/var/services/homes' exists.
  Get new version
  Processing openhab-2.5.0.M6.zip
  ERROR:
  There was a problem downloading openhab-2.5.0.M6.zip from the download link:
  'https://openhab.jfrog.io/openhab/libs-milestone-local/org/openhab/distro/openhab/2.5.0.M6/openhab-2.5.0.M6.zip'
  Alternatively, download this file manually and place it in the '/var/services/homes' shared folder and start installation again.

2. Package installation completed but major runtime errors:

If I proceed as instructed in the above log and place the zip file manually in the shared folder, the package script runs without a glitch and DSM reports the installation as completed. However, I can’t access the UI, and the start_debug.sh script returns the following errors:

MySynology:/volume1/@appstore/openHAB$ ./start_debug.sh
Launching the openHAB runtime...
Listening for transport dt_socket at address: 5005
java.io.FileNotFoundException: /volume1/homes/openhab/userdata/tmp/karaf.pid (Permission denied)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at java.io.FileOutputStream.<init>(FileOutputStream.java:101)
at org.apache.karaf.main.InstanceHelper.writePid(InstanceHelper.java:127)
at org.apache.karaf.main.Main.launch(Main.java:243)
at org.apache.karaf.main.Main.main(Main.java:178)
java.lang.RuntimeException: Exception instantiating lock class org.apache.karaf.main.lock.SimpleFileLock

Karaf can't startup, make sure the log file can be accessed and written by the user starting Karaf : /volume1/homes/openhab/userdata/tmp/lock (Permission denied)
at org.apache.karaf.main.Main.createLock(Main.java:520)
at org.apache.karaf.main.Main.doMonitor(Main.java:388)
at org.apache.karaf.main.Main.access$100(Main.java:74)
at org.apache.karaf.main.Main$3.run(Main.java:377)

Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.karaf.main.Main.createLock(Main.java:516)
... 3 more

Caused by: java.lang.RuntimeException: Karaf can't startup, make sure the log file can be accessed and written by the user starting Karaf : /volume1/homes/openhab/userdata/tmp/lock (Permission denied)
at org.apache.karaf.main.lock.SimpleFileLock.<init>(SimpleFileLock.java:55)
... 8 more

Caused by: java.io.FileNotFoundException: /volume1/homes/openhab/userdata/tmp/lock (Permission denied)
at java.io.RandomAccessFile.open0(Native Method)
at java.io.RandomAccessFile.open(RandomAccessFile.java:316)
at java.io.RandomAccessFile.<init>(RandomAccessFile.java:243)
at org.apache.karaf.main.lock.SimpleFileLock.<init>(SimpleFileLock.java:53)
... 8 more

The permissions on the share folder seem to be OK and I am running the latest Java8 syno package with the Oracle 1.8.0_231 upgrade.

Anyone encountered a similar problem so far?

permission denied.

use file station to set ownership and permissions of that folder to openhab and read/write
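The same fix over ssh could look like this (sandboxed with a stand-in directory; on the NAS you would run it as root against /volume1/homes/openhab/userdata with the openhab user as owner). Karaf needs to create karaf.pid and lock under userdata/tmp, which is exactly what the stack trace shows failing:

```shell
# Stand-in for /volume1/homes/openhab/userdata:
OH_DATA=$(mktemp -d)
mkdir -p "$OH_DATA/tmp"
# On the NAS this would be: chown -R openhab /volume1/homes/openhab/userdata
chown -R "$(id -un)" "$OH_DATA"
chmod -R u+rwX "$OH_DATA"
# Simulate the files Karaf has to be able to create:
touch "$OH_DATA/tmp/karaf.pid" "$OH_DATA/tmp/lock"
ls "$OH_DATA/tmp"
```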

Thanks for your reply.

I’ve changed the ownership and permissions, restarted OH and still not working…

Well, what’s the error now?

the spk seems to be too complicated, and not only to install: backup and upgrade are all unclear (and messy?!).
I gave up on it after wasting too much time and moved to docker now. it works in a few minutes on Synology and everything is pretty clear. No errors. No issues. Easy.

Best
Softy

Here is the new log:

Launching the openHAB runtime...

                          __  _____    ____      
  ____  ____  ___  ____  / / / /   |  / __ )     
 / __ \/ __ \/ _ \/ __ \/ /_/ / /| | / __  | 
/ /_/ / /_/ /  __/ / / / __  / ___ |/ /_/ /      
\____/ .___/\___/_/ /_/_/ /_/_/  |_/_____/     
    /_/                        2.5.0.M6
                               Milestone Build   

Hit '<tab>' for a list of available commands
and '[cmd] --help' for help on a specific command.
Hit '<ctrl-d>' or type 'system:shutdown' or 'logout' to shutdown openHAB.

openhab> org.apache.karaf.features.internal.util.MultiException: Error:                                                                                                                                              
	Error downloading mvn:org.openhab.ui.bundles/org.openhab.ui.dashboard/2.5.0.M6
	at org.apache.karaf.features.internal.download.impl.MavenDownloadManager$MavenDownloader.<init>(MavenDownloadManager.java:91)
	at org.apache.karaf.features.internal.download.impl.MavenDownloadManager.createDownloader(MavenDownloadManager.java:72)
	at org.apache.karaf.features.internal.region.Subsystem.downloadBundles(Subsystem.java:457)
	at org.apache.karaf.features.internal.region.Subsystem.downloadBundles(Subsystem.java:452)
	at org.apache.karaf.features.internal.region.SubsystemResolver.resolve(SubsystemResolver.java:224)
	at org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:393)
	at org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:1062)
	at org.apache.karaf.features.internal.service.FeaturesServiceImpl.lambda$doProvisionInThread$13(FeaturesServiceImpl.java:998)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
	Suppressed: java.io.IOException: Error downloading mvn:org.openhab.ui.bundles/org.openhab.ui.dashboard/2.5.0.M6
		at org.apache.karaf.features.internal.download.impl.AbstractRetryableDownloadTask.run(AbstractRetryableDownloadTask.java:77)
		at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
		at java.util.concurrent.FutureTask.run(FutureTask.java:266)
		at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
		at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
		... 3 more
	Caused by: java.io.IOException: Error resolving artifact org.openhab.ui.bundles:org.openhab.ui.dashboard:jar:2.5.0.M6: [Could not transfer artifact org.openhab.ui.bundles:org.openhab.ui.dashboard:jar:2.5.0.M6 from/to openhab (https://openhab.jfrog.io/openhab/online-repo-milestone/2.5/): openhab.jfrog.io]
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.configureIOException(AetherBasedResolver.java:803)
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:774)
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:657)
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:598)
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:565)
		at org.apache.karaf.features.internal.download.impl.MavenDownloadTask.download(MavenDownloadTask.java:52)
		at org.apache.karaf.features.internal.download.impl.AbstractRetryableDownloadTask.run(AbstractRetryableDownloadTask.java:60)
		... 7 more
		Suppressed: shaded.org.eclipse.aether.transfer.ArtifactTransferException: Could not transfer artifact org.openhab.ui.bundles:org.openhab.ui.dashboard:jar:2.5.0.M6 from/to openhab (https://openhab.jfrog.io/openhab/online-repo-milestone/2.5/): openhab.jfrog.io
			at shaded.org.eclipse.aether.connector.basic.ArtifactTransportListener.transferFailed(ArtifactTransportListener.java:52)
			at shaded.org.eclipse.aether.connector.basic.BasicRepositoryConnector$TaskRunner.run(BasicRepositoryConnector.java:368)
			at shaded.org.eclipse.aether.util.concurrency.RunnableErrorForwarder$1.run(RunnableErrorForwarder.java:75)
			at shaded.org.eclipse.aether.connector.basic.BasicRepositoryConnector$DirectExecutor.execute(BasicRepositoryConnector.java:642)
			at shaded.org.eclipse.aether.connector.basic.BasicRepositoryConnector.get(BasicRepositoryConnector.java:262)
			at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.performDownloads(DefaultArtifactResolver.java:489)
			at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:390)
			at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:215)
			at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:192)
			at shaded.org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveArtifact(DefaultRepositorySystem.java:247)
			at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:767)
			... 12 more
		Caused by: shaded.org.apache.maven.wagon.TransferFailedException: openhab.jfrog.io
			at shaded.org.apache.maven.wagon.shared.http.AbstractHttpClientWagon.fillInputData(AbstractHttpClientWagon.java:1184)
			at shaded.org.apache.maven.wagon.shared.http.AbstractHttpClientWagon.fillInputData(AbstractHttpClientWagon.java:1072)
			at shaded.org.apache.maven.wagon.StreamWagon.getInputStream(StreamWagon.java:126)
			at shaded.org.apache.maven.wagon.StreamWagon.getIfNewer(StreamWagon.java:88)
			at shaded.org.apache.maven.wagon.StreamWagon.get(StreamWagon.java:61)
			at shaded.org.eclipse.aether.transport.wagon.WagonTransporter$GetTaskRunner.run(WagonTransporter.java:567)
			at shaded.org.eclipse.aether.transport.wagon.WagonTransporter.execute(WagonTransporter.java:435)
			at shaded.org.eclipse.aether.transport.wagon.WagonTransporter.get(WagonTransporter.java:412)
			at shaded.org.eclipse.aether.connector.basic.BasicRepositoryConnector$GetTaskRunner.runTask(BasicRepositoryConnector.java:456)
			at shaded.org.eclipse.aether.connector.basic.BasicRepositoryConnector$TaskRunner.run(BasicRepositoryConnector.java:363)
			... 21 more
		Caused by: java.net.UnknownHostException: openhab.jfrog.io
			at java.net.InetAddress.getAllByName0(InetAddress.java:1281)
			at java.net.InetAddress.getAllByName(InetAddress.java:1193)
			at java.net.InetAddress.getAllByName(InetAddress.java:1127)
			at shaded.org.apache.http.impl.conn.SystemDefaultDnsResolver.resolve(SystemDefaultDnsResolver.java:45)
			at shaded.org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:112)
			at shaded.org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
			at shaded.org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:394)
			at shaded.org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
			at shaded.org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
			at shaded.org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
			at shaded.org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
			at shaded.org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
			at shaded.org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
			at org.ops4j.pax.url.mvn.internal.wagon.ConfigurableHttpWagon.execute(ConfigurableHttpWagon.java:162)
			at shaded.org.apache.maven.wagon.shared.http.AbstractHttpClientWagon.fillInputData(AbstractHttpClientWagon.java:1095)
			... 30 more
	Caused by: shaded.org.eclipse.aether.resolution.ArtifactResolutionException: Error resolving artifact org.openhab.ui.bundles:org.openhab.ui.dashboard:jar:2.5.0.M6
		at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:413)
		at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:215)
		at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:192)
		at shaded.org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveArtifact(DefaultRepositorySystem.java:247)
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:767)
		... 12 more