Instructions for openHAB on FreeNAS?

I am not using anything USB on the FreeNAS, so I can’t help with anything about that.
Also, nothing special on jail properties.

I have seen the same problem just now, with 2.5.4. Please share if you have any hints. For what it is worth, I may bypass the issue by having the Aeotec Z-Stick and the RFXCOM run on a remote Pi with USB over TCP.

For those who followed @bob2b’s suggestion on how to create the devfs ruleset, you may have noticed that it will disappear if you restart the jail. It will come back if you reboot FreeNAS, as his script runs on startup. However, here is a way to make it apply when your jail starts, meaning you can restart the jail without having to reboot FreeNAS. Restarting the jail takes seconds, so this is an attractive option.

  1. Remove the script from Tasks Init/Shutdown scripts (if you followed @bob2b)
  2. Call that script in the OpenHAB jail’s Jail Properties exec_prestart
  3. Apply the ruleset in Jail Properties exec_poststart using something like devfs -m /mnt/YOUR_DATASET/iocage/jails/YOUR_JAIL/root/dev rule -s 99 applyset (a sketch of the equivalent iocage CLI commands follows below)
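If you prefer the command line over the GUI, the same two jail properties can be set with iocage on the FreeNAS host. This is only a sketch: the script path is hypothetical, and YOUR_DATASET, YOUR_JAIL and the ruleset number must match your own setup.

# point exec_prestart at the script that (re)creates the devfs ruleset (hypothetical path)
iocage set exec_prestart="/mnt/YOUR_DATASET/scripts/create-devfs-ruleset.sh" YOUR_JAIL
# have exec_poststart apply that ruleset to the jail's /dev after it starts
iocage set exec_poststart="/sbin/devfs -m /mnt/YOUR_DATASET/iocage/jails/YOUR_JAIL/root/dev rule -s 99 applyset" YOUR_JAIL

After that, a plain restart of the jail should be enough for the serial devices to reappear.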

This is driving me nuts. I side-tracked and installed Raspbian as a VM on FreeNAS instead, only to realize that USB passthrough is not possible there at all. LOL. So back to the jail approach… Going to dig deeper into the logs now. “Glad” to hear someone else is experiencing the same issue.

I built nrjavaserial from source - no change.
Rebuilt OpenJDK8 from source - no change.

Full log until crash:
crash.log (103.9 KB)
Obviously, this sticks out:

20:30:43.022 [ERROR] [ing.xml.internal.ThingTypeXmlProvider] - Could not register ThingType: zwave:aeotec_zw141_03_000
java.lang.IllegalArgumentException: ID segment ' switch_dimmer' contains invalid characters. Each segment of the ID must match the pattern [A-Za-z0-9_-]*.
        at org.eclipse.smarthome.core.common.AbstractUID.validateSegment(AbstractUID.java:97) ~[bundleFile:?]
        at org.eclipse.smarthome.core.common.AbstractUID.<init>(AbstractUID.java:75) ~[bundleFile:?]
        at org.eclipse.smarthome.core.common.AbstractUID.<init>(AbstractUID.java:49) ~[bundleFile:?]
        at org.eclipse.smarthome.core.thing.UID.<init>(UID.java:48) ~[bundleFile:?]
        at org.eclipse.smarthome.core.thing.type.ChannelTypeUID.<init>(ChannelTypeUID.java:40) ~[bundleFile:?]
        at org.eclipse.smarthome.core.thing.xml.internal.ChannelXmlResult.toChannelDefinition(ChannelXmlResult.java:135) ~[bundleFile:?]
        at org.eclipse.smarthome.core.thing.xml.internal.ThingTypeXmlResult.toChannelDefinitions(ThingTypeXmlResult.java:98) ~[bundleFile:?]
        at org.eclipse.smarthome.core.thing.xml.internal.ThingTypeXmlResult.getBuilder(ThingTypeXmlResult.java:148) ~[bundleFile:?]
        at org.eclipse.smarthome.core.thing.xml.internal.ThingTypeXmlResult.toThingType(ThingTypeXmlResult.java:156) ~[bundleFile:?]
        at org.eclipse.smarthome.core.thing.xml.internal.ThingTypeXmlProvider.addingFinished(ThingTypeXmlProvider.java:148) [bundleFile:?]
        at org.eclipse.smarthome.config.xml.osgi.XmlDocumentBundleTracker.addingFinished(XmlDocumentBundleTracker.java:265) [bundleFile:?]
        at org.eclipse.smarthome.config.xml.osgi.XmlDocumentBundleTracker.parseDocuments(XmlDocumentBundleTracker.java:424) [bundleFile:?]
        at org.eclipse.smarthome.config.xml.osgi.XmlDocumentBundleTracker.processBundle(XmlDocumentBundleTracker.java:398) [bundleFile:?]
        at org.eclipse.smarthome.config.xml.osgi.XmlDocumentBundleTracker.access$6(XmlDocumentBundleTracker.java:393) [bundleFile:?]
        at org.eclipse.smarthome.config.xml.osgi.XmlDocumentBundleTracker$2.run(XmlDocumentBundleTracker.java:363) [bundleFile:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_252]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_252]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_252]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_252]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_252]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_252]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]

But then it goes on to add a bunch of devices before it dies… so not sure what to make of it.

Some people seem to have gotten this to work, though?! Could you please share what versions of FreeNAS, FreeBSD, OpenJDK, openHAB, etc. you are using?

This is an unrelated issue; see this thread. Basically, the 2.5.4 version of the ZWave binding is currently broken because of a whitespace formatting error in some of its XML files.

The workaround is not to install the ZWave binding automatically (so remove it from addons.cfg, or however you installed it) and instead to manually install the current 2.5.5 ZWave snapshot of the binding into the /usr/local/libexec/openhab2/addons directory. This will get you the current version:

fetch https://openhab.jfrog.io/openhab/libs-snapshot-local/org/openhab/addons/bundles/org.openhab.binding.zwave/2.5.5-SNAPSHOT/org.openhab.binding.zwave-2.5.5-20200504.110626-5.jar
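If you are on the FreeBSD port/package layout quoted above, something like this should drop the snapshot straight into the addons folder (a sketch; adjust the path and the owner to whatever user runs openHAB on your system):

# fetch the snapshot directly into the addons directory and hand it to the openhab user
cd /usr/local/libexec/openhab2/addons
fetch https://openhab.jfrog.io/openhab/libs-snapshot-local/org/openhab/addons/bundles/org.openhab.binding.zwave/2.5.5-SNAPSHOT/org.openhab.binding.zwave-2.5.5-20200504.110626-5.jar
chown openhab:openhab org.openhab.binding.zwave-2.5.5-20200504.110626-5.jar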

You will also need to use Karaf to manually install

feature:install openhab-transport-serial

at which point the ZWave binding should silently start working. You can see if it is installed in Karaf using list -s | grep zwave.

We seem to be following each other at the moment, @MrRusch. :slight_smile: I have just run into this problem. Have you resolved the issue with the lock files? I understand it is somehow related to nrjavaserial.

Sadly no. From what I understand, OH has compiled a version of nrjavaserial without the liblockdev library, precisely to resolve issues like these. But the JAR in the distribution package has “OH” in its name, suggesting that it is the recompiled version; and grabbing an older one and replacing it makes no difference either.

I’ve also tried adjusting the devfs rule to pass through the USB device with open permissions, and for the openhab user specifically:

/sbin/devfs rule -s ${NUMBER} add path 'cua*' mode 777 group 235 unhide
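For context, a full ruleset script along these lines is what the approach above boils down to (only a sketch: the ruleset number, the included base rulesets and the extra unhide lines are assumptions to adapt to your own jail):

#!/bin/sh
# sketch: build a custom devfs ruleset on the FreeNAS host for the openHAB jail
NUMBER=99   # must match the ruleset number applied to the jail
# start from the standard hide-all / unhide-basic / unhide-login rulesets
/sbin/devfs rule -s ${NUMBER} add include 1
/sbin/devfs rule -s ${NUMBER} add include 2
/sbin/devfs rule -s ${NUMBER} add include 3
# expose the USB serial devices to the jail
/sbin/devfs rule -s ${NUMBER} add path 'cua*' mode 777 group 235 unhide
/sbin/devfs rule -s ${NUMBER} add path 'ugen*' unhide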

Same error, same error, same error…

Ironically, the closest to working I get is when not adding the openhab user to the dialer group. That way the JRE never crashes, but it’s stuck in a loop trying to recreate the lock file, so the ZWave binding goes offline/online once every minute. It actually stays up! But I mean, no…

I have a feeling the issue is related to how OpenJDK utilizes the nrjavaserial package. Something may have changed in later versions of OpenJDK 8. Perhaps reverting a few versions would give some more information.


Thank you for your thoughts, @MrRusch. I have also tried using a different version of nrjavaserial, substituting it in the /runtime tree, but when I checked in Karaf it was still showing 3.15.0.OH2. I suspect the whole of openhab-core needs to be recompiled with the newer version; have you tried that?

There are more issues related to the lock files, it seems, and they are known to the openhab-core developers. See this and this, and even as far back as here. There is an open request to recompile nrjavaserial 3.20.0 for FreeBSD, see here. Unless you get to it before me, I can have a look next weekend.

I think I will report it as a bug in openhab-core next, as it seems that no one is able to run this on FreeBSD at the moment.

Other than that, all that remains is to run OH 2.5 in bhyve on openHABian/Debian, with the ZWave stick on a remote Pi with USB over TCP using ser2net. Hopefully the new nrjavaserial will support RFC2217 ports.
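In case it is useful to anyone trying the same split, the Pi side of that can be a couple of lines in /etc/ser2net.conf (classic ser2net syntax; the TCP ports, device nodes and baud rates below are only examples, not my actual config):

# expose the serial dongles over TCP in raw mode; adjust ports/devices/baud rates
3333:raw:0:/dev/ttyACM0:115200 8DATABITS NONE 1STOPBIT
3334:raw:0:/dev/ttyUSB0:38400 8DATABITS NONE 1STOPBIT

For RFC2217 instead of raw TCP, ser2net’s telnet mode with the remctl option is, as far as I know, the relevant setting.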

How I wish the ZWave binding was as easy to connect as RFXCOM; that was a breeze. :slight_smile:

I have not.

I might give this a try if the instructions referenced there are detailed; otherwise I’m not sure I can manage.

EDIT: OK, so this was either not so hard - or way beyond me. :rofl: But I think I managed to build it. Now what hmm…
nrjavaserial-5.0.0.jar.txt (711.8 KB)
32bit: libNRJavaSerial.so.txt (50.9 KB)
64bit: libNRJavaSerial.so.txt (63.6 KB)

Great! Hopefully we’ll get some attention from the pros.

Might as well just run OH on the RPi then no? :laughing:

If you have a GitHub account, you should fork the repo, do what you did, and then create a PR for the built binaries; see this fresh comment.

As I am sure is the case for you, we are both trying to move away from having OH on the Pi. It is a bit slow, and I would like it to be snappier in how it responds to events and commands. It is hard to manage: no snapshots. Above all, it is a weak point in what is increasingly an essential piece of home infrastructure. However, having a Pi perform the easy task of being merely a USB hub over TCP, with a spare one in a cupboard just in case, is more than OK for me—and a necessity, since I cannot get signal for ZWave and RFXCOM from the server enclosure, and so I need the dongles in the attic.

I have reported the issue in the openhab-core repo on GitHub.

About this… I don’t think you can simply replace the JAR file. In the console, find the 3.15.0.OH2 bundle with the “bundle:list” command. Then stop it using its ID, e.g. “bundle:stop <ID>”. Then install the replacement JAR using “bundle:install file://<PATH_TO_JAR>” and start it with “bundle:start <ID>”.
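Put together, a Karaf session for the swap would look something like this (the ID and the path are placeholders, not real values):

# find the currently installed nrjavaserial bundle and note its ID
bundle:list -s | grep -i nrjavaserial
# stop the old bundle, install the replacement, start the new one
bundle:stop <ID>
bundle:install file:///path/to/nrjavaserial-5.0.0.jar
bundle:start <NEW_ID_PRINTED_BY_INSTALL>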

I just tried this with my newly built JAR, but the same error occurs. I think I need to build it from source with the liblockdev library disabled. I remember seeing instructions somewhere.


Wishing you success @MrRusch!

@MrRusch, I have just recompiled the native code for nrjavaserial and submitted a PR to them. Interestingly, my executables had different filenames from yours. Did yours not end up as:

src/main/c/resources/native/freebsd/x86_32/libNRJavaSerial.so
src/main/c/resources/native/freebsd/x86_64/libNRJavaSerial.so

somehow? In any case, this is what I have PRed. Hope it helps them help us, and in the meantime, if you have any success or otherwise with getting the ZWave stick not to crash the OH runtime, please share.

PS. Never mind—you just somehow ended up with a .txt extension; I suppose the files must be the same.

Upload restrictions on file extensions.

The recompiled version has just been merged into master. Even if there is a release of 5.0.0 soon, I am unsure whether it will help. In the meantime, I am trying to rebuild openhab-core@2.5.0 having updated the only two references to nrjavaserial that I can see (in its two pom.xml files) to 5.0.0. If I succeed, I will report back. Unfortunately, I could not follow the older OH 1.8 instructions from Serial port access from FreeBSD, as the new openhab-core repo does not seem to have nrjavaserial.jar anywhere in it. I hope it just pulls the right one from GitHub, somehow.
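For anyone wanting to try the same, the rebuild itself is nothing exotic; this is only a rough sketch, assuming the 2.5.0 tag exists by that name and a standard Maven toolchain (the two pom.xml locations are the ones mentioned above, which I have not listed here):

# check out openhab-core at 2.5.0 and rebuild with the bumped nrjavaserial version
git clone https://github.com/openhab/openhab-core.git
cd openhab-core
git checkout 2.5.0
# edit the two pom.xml references to nrjavaserial so they point at 5.0.0, then:
mvn clean install -DskipTests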

I think what needs to happen is that the changes from the original NeuronRobotics repo need to be pulled into the OH fork: GitHub - openhab/nrjavaserial: A Java Serial Port system. This is a fork of the RXTX project that uses in jar loading of the native code.
After managing any conflicts there, we rebuild the OH-specific version without liblockdev and the other changes. Not sure how much work that means…


I just compared openhab/nrjavaserial with the current head of NeuronRobotics/nrjavaserial, and the two branches merge cleanly. The last commit in openhab/nrjavaserial is from 2017, by the way.
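For anyone wanting to reproduce the comparison, it is just a matter of adding the original repo as a second remote (a sketch; I am assuming the default branch is still master on both sides):

git clone https://github.com/openhab/nrjavaserial.git
cd nrjavaserial
git remote add upstream https://github.com/NeuronRobotics/nrjavaserial.git
git fetch upstream
# list the commits the OH fork is missing, then try the merge locally
git log --oneline HEAD..upstream/master
git merge upstream/master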

Either all of the OH modifications have already been merged into the NeuronRobotics original, or (more likely) the OH-specific work was done in another repo—but I cannot find it, and I cannot see any openHAB-labelled PRs in the NeuronRobotics original, either.

If you have hunted down the location of the OH-specific changes, I can compare.

In any case, my understanding is that the newer nrjavaserial introduces a native way of dealing with locking, hence I am keen to test it. My build succeeded yesterday and I have been asking for some help to figure out how to test it. I will have an answer this evening.

I have done a quick test using nrjavaserial-5.0.0. It still crashes the OH runtime, but now with a different error; see the RFC2217 issue and the other GitHub issues I linked for more info. I suspect, as you did @MrRusch, that we need those OH-specific patches to make that work.

Well, there’s some information in the repo’s README under “Some of the features we have added”. But if you managed to merge the latest code from the original into the OH fork, they should all still be included?