OH2 port conflict (Karaf?)

I seem to have a port conflict on my Synology with the latest OH2. I suspect it’s an issue with the Karaf debug port, which I believe defaults to 5005 (a port my Synology is already using), although the error isn’t very explicit in this respect:

Launching the openHAB runtime...
./runtime/karaf/bin/karaf: ./runtime/karaf/bin/setenv: line 84: arch: not found
ERROR: transport error 202: bind failed: Address already in use
ERROR: JDWP Transport dt_socket failed to initialize, TRANSPORT_INIT(510)
JDWP exit error AGENT_ERROR_TRANSPORT_INIT(197): No transports initialized [../../../src/share/back/debugInit.c:750]
FATAL ERROR in native method: JDWP No transports initialized, jvmtiError=AGENT_ERROR_TRANSPORT_INIT(197)
Aborted (core dumped)

I’ve tried changing the debug port, but so far without success (at least I can’t get past this error). Does anyone know how this is done?
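Before picking an alternative debug port, it helps to confirm which ports are already listening. A minimal sketch - the `is_port_listed` helper name and the sample line are made up for illustration; on the NAS you would pipe real output from `netstat -tln` (BusyBox builds usually support those flags):

```shell
#!/bin/sh
# is_port_listed: read `netstat -tln`-style output on stdin and check
# whether the given TCP port appears as a listener.
is_port_listed() {
  grep -q "[:.]$1 "
}

# Real usage on the NAS would be:  netstat -tln | is_port_listed 5005
# Here we feed a fabricated sample line just to demonstrate:
printf 'tcp  0  0 0.0.0.0:5005  0.0.0.0:*  LISTEN\n' | is_port_listed 5005 \
  && echo "5005 in use"
```

If the port prints as "in use", pick a different `address=` value for the debug options.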


Here it says you can set

export DEFAULT_JAVA_DEBUG_OPTS='-Xdebug -Xnoagent -Djava.compiler=NONE -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=5005'

So I would hope that you can define a port other than 5005 with it.

Thanks Kai,
I found this, and tried it already, but no success…


I should add that I tried hacking around in the scripts and added the -Xrunjdwp part directly to the startup command. This also didn’t work, although it had a different effect (from memory, it gave an error about multiple options being set).

Hm, sorry, then I do not have a clue…
@davy, @maggu2810 any hints from you?

Normally karaf will only use port 5005 when it’s started in debug mode (using start_debug.sh)…

Yes, but the question is how to change this default, since port 5005 already seems to be in use on the Synology…

Thanks - the reason I was running in debug mode (other than wanting to debug :smile:) is that if I run the normal script I get the following error:

./runtime/karaf/bin/karaf: ./runtime/karaf/bin/setenv: line 84: arch: not found

Maybe this is unrelated, but I do want to run in debug mode, so I’d like to solve this - otherwise I’ll need to look at setting up another environment, I guess.

Interesting, I would expect the same error to happen in debug mode too though…
So the command arch doesn’t work on your synology. What OS is it running and which JDK?
Does the JDK support the G1 garbage collector? You can check using java -XX:+UseG1GC -version
In setenv, you can comment out lines 84-88, which set the JVM options; that should get rid of the arch error, and hopefully the normal start will then work.
For the debug mode, try the following:

export JAVA_DEBUG_OPTS='-Xdebug -Xnoagent -Djava.compiler=NONE -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=5006'
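Rather than commenting the setenv lines out entirely, another option would be to fall back to `uname -m` when `arch` is missing, as it is on BusyBox-based DSM. A minimal sketch - the `ARCH` variable name is illustrative; the real setenv may structure this differently:

```shell
#!/bin/sh
# Fall back to `uname -m` when the `arch` command is unavailable
# (BusyBox on DSM 5.2 does not ship it). `arch` and `uname -m` report
# the same machine hardware name on Linux.
ARCH="$(arch 2>/dev/null || uname -m)"
echo "arch: $ARCH"
```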

Didn’t you just say it would only happen in debug though?

It’s Synology’s system, which is BusyBox 1.16.1

Java is:

java version "1.7.0_80"
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
Java HotSpot(TM) Server VM (build 24.80-b11, mixed mode)

The export config that you pasted looks similar to what I’ve tried before, although I used another port since 5006 is also in use… Anyway, it doesn’t help - sorry.

The port conflict yes. The error with the arch command should happen in both cases.

The export config that you pasted looks similar

Did you use export JAVA_DEBUG_OPTS or export DEFAULT_JAVA_DEBUG_OPTS? Small difference, but only the first one will work.
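The difference matters because of how the launch script resolves the two variables: an explicit JAVA_DEBUG_OPTS wins, and DEFAULT_JAVA_DEBUG_OPTS is only used as a fallback (and is typically set inside the script itself, which is why exporting it from the shell has no effect). A rough paraphrase of that logic, not the literal Karaf script:

```shell
#!/bin/sh
# Sketch of the fallback: the script defines its own default, then only
# uses it when the user has not exported JAVA_DEBUG_OPTS.
DEFAULT_JAVA_DEBUG_OPTS='-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=5005'
if [ -z "$JAVA_DEBUG_OPTS" ]; then
  JAVA_DEBUG_OPTS="$DEFAULT_JAVA_DEBUG_OPTS"
fi
echo "$JAVA_DEBUG_OPTS"
```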

java version “1.7.0_80”

Given that OH2 is compiled against Java 8, that could be an issue :wink:

It does - it’s in the first error message I posted as well…

Ah - is that another change then? I thought 1.7 was the dependency, not 1.8. For sure, the previous version, which I loaded a few weeks back, worked fine under 1.7 :frowning:

Commenting out the lines you suggested has worked to some degree - the runtime has started and I have the console, but there’s no web interface (I’m getting an error there which I’ll need to track down). However, ‘exit’ doesn’t work, so I’m stuck in the console - something isn’t right, but maybe that’s associated with running 1.7.

Having killed off the console, I confirmed that the export you posted does work (by which I mean I don’t get the error - it still doesn’t actually work yet). When I ran it earlier I still had port 5006 set, which is also in use on the Synology.

I guess I now need to work out how to upgrade the system to 1.8 and see if that fixes things…

@kai - can you confirm that Java 8 is indeed now required? I thought the previous discussions concluded that 7 was going to be the baseline, and I can’t find 8 stated in the docs anywhere (yet). This might be a problem for Synology, as the docs say only 7 is currently supported :frowning:

Same problem here.
I am running OH2 in a jail on FreeNAS (FreeBSD)

@chris: Does the command “uname -m” work on your system? This is the more common command for getting the machine hardware name.

If a Java 1.8 compiler is used with a 1.7 runtime, this can throw errors, regardless of whether source/target 1.7 is set. See e.g. https://gist.github.com/AlainODea/1375759b8720a3f9f094
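One quick way to check what a class was compiled for is the class-file major version stored in bytes 6-7 of the header: 51 means Java 7, 52 means Java 8, and a 52 class fails on a 1.7 runtime with UnsupportedClassVersionError. A sketch - here we fabricate a minimal 8-byte header just to demonstrate the check; in practice you would point od at a real .class file from the OH2 distribution:

```shell
#!/bin/sh
# Read the class-file major version (byte at offset 7, after the
# 0xCAFEBABE magic and the 2-byte minor version). 0x34 octal \064 = 52.
printf '\312\376\272\276\000\000\000\064' > /tmp/demo.class
MAJOR=$(od -An -tu1 -j7 -N1 /tmp/demo.class | tr -d ' ')
echo "major version: $MAJOR"
```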

It just returns x86_64.

It’s running DSM 5.2 (Synology) and it’s BusyBox v1.16.1 (2015-10-28 13:23:20 CST)

Hmmm - this might be of interest @Kai

2016-01-16 21:10:59.442 [WARN ] [url.mvn.internal.AetherBasedResolver] - Error resolving artifact org.openhab.core:org.openhab.io.rest.docs:jar:2.0.0-SNAPSHOT: Could not transfer artifact org.openhab.core:org.openhab.io.rest.docs:jar:2.0.0-SNAPSHOT from/to oh-snapshot-repo (http://oss.jfrog.org/libs-snapshot/): Failed to transfer file: http://oss.jfrog.org/libs-snapshot/org/openhab/core/org.openhab.io.rest.docs/2.0.0-SNAPSHOT/org.openhab.io.rest.docs-2.0.0-SNAPSHOT.jar. Return code is: 503, ReasonPhrase: Service Unavailable: Back-end server is at capacity.

It looks like whatever is serving the snapshot repository is broken (or maxed out)!

@chris “x86_64” is fine; it is the same as what the (IMHO) newer arch (part of coreutils) returns.