Blockly rules very slow on the first run after upgrade to OH 4.0.1

GraalVM itself is a fully compatible replacement for OpenJDK/OracleJDK. So there should not be a compatibility problem IF there is a version available. But I guess GraalVM is not available on every platform you mention.

The JS Scripting add-on provides GraalVM JS.

That is not a supported (expected) way of executing Graal.js. It is possible, but without GraalVM as the underlying optimizing compiler, you lose performance (exactly what you experience). There might be ways around that limitation, see Run GraalVM JavaScript on a Stock JDK - but again, that support might change over time.

unless GraalVM can support Java 17

We are on the latest Java version(s) available. Published is Java 20, our main line (to be published soonish) is Java 21, and we are working on 22 already. LTS branches offer support for Java 17. See

we also need to be careful about licensing.

Agreed, but I don’t think there is a difference between using Graal.js or the whole GraalVM.

Given the way OH is built and the fact that GraalVM JS is an option, not a requirement (though as an option it needs to run everywhere we want OH to run and RPis are our biggest install base), based on what I can tell so far, it would require quite some compromises to OH overall. It seems like we would need to throw out all the other rules languages we currently support or live with the performance. And it’s not clear to me that runtime development of rules would be possible. All the descriptions talk about compiling the app into a binary which seems to preclude that as an option.

It would be interesting to see what we can do to make it better. I’m not sure whether/how those command line arguments passed to the JRE are handled in an OSGi context. Clearly something is done to enable the JIT when the add-on is installed since it works. The module-path and upgrade-module-path have to be there somehow. But I don’t know if those other arguments are handled (-XX:+UnlockExperimentalVMOptions -XX:+EnableJVMCI -XX:+UseJVMCICompiler). Maybe those are required too in order to work at all in which case they have to be there too.
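For reference, here is a sketch of how those arguments could be supplied on a typical Linux install, assuming openHAB’s standard `EXTRA_JAVA_OPTS` mechanism in `/etc/default/openhab`. The jar names follow the “Run GraalVM JavaScript on a Stock JDK” article linked above, but the download location (`/opt/graal`) is an assumption:

```shell
# /etc/default/openhab -- sketch only; /opt/graal and the exact jar set are assumptions.
# --module-path / --upgrade-module-path make the Graal compiler visible to JVMCI.
EXTRA_JAVA_OPTS="-XX:+UnlockExperimentalVMOptions \
  -XX:+EnableJVMCI \
  -XX:+UseJVMCICompiler \
  --module-path=/opt/graal/graal-sdk.jar:/opt/graal/truffle-api.jar \
  --upgrade-module-path=/opt/graal/compiler.jar:/opt/graal/compiler-management.jar"
```

Whether these survive an OSGi launch unchanged is exactly the open question above; they are plain JVM flags, so in principle the launcher should pass them through to the JVM.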

I might experiment a bit but overall I’m quickly approaching the limits of my knowledge.

It seems like we would need to throw out all the other rules languages

I don’t see why this would be necessary. You are using OpenJDK (?) now. You use GraalVM in the future. Everything else stays the same. All other (Java-based) tools work just as before.

All the descriptions talk about compiling the app into a binary

You are referring to GraalVM Native Image, which is a component of GraalVM, but not one you would be using. You’d be using the “normal” Java JIT compiler (“java.exe”).

what we can do to make it better

Without GraalVM: very little. Whatever you do won’t fix the root problem that the JavaScript code is not compiled. Nashorn does that by generating bytecode; GraalVM has its own trick for doing that (called partial evaluation). The other options (as listed in the article I linked in my previous post) will work on any 11+ Java VM and are required to enable GraalVM as (one of the) optimizing compiler(s). It will still be necessary to download and provide GraalVM’s optimizing compiler, which might not be available on every platform (or not be suitable, due to size limitations, etc.).

Looping in @Kai as he might have more insight into whether this might be an option.

I don’t. I would guess that @digitaldan is rather our specialist here.


Nice to see, that the discussion about this topic has started and experts are involved.

Personally, as I have said several times now, I have no problem if my notification rules take 1 sec (on 3.4.4) or now 30 sec on 4.0.2 to inform me that it has started raining outside or that bike charging has finished. The total time really doesn’t matter to me. But I assume other users do need their rules to run fast.

The funny thing is that with the flag (“Do Not Use Globals”) the DSL and JavaScript rules run fast, but Blockly no longer runs. Either way I go with the flag, roughly 50% of my ~70 rules are impacted.

Thanks to all the contributors working to find a suitable solution for an upcoming OH release that combines running Blockly rules with fast DSL and JavaScript rules…

The “Do Not Use Globals” flag is only used by JS Scripting. So it’s only going to impact the JS Scripting add-on and stuff that uses JS Scripting. As mentioned above, Blockly uses JS Scripting.

Rules DSL is not impacted at all, nor are any of the other rules languages.

Thanks for the clarification…

In the past, under OH 3.4.4 with Java 11, nothing had to be set.
DSL, JavaScript and Blockly rules ran fast without any user attention…

Now under 4.x?

JS Scripting needs to be installed as an add-on.
DSL rules run normally,
and you can set a flag for JS Scripting to make it run fast, but then Blockly no longer runs.

Welcome to the new world…

This is getting inappropriate.
You should be glad that after the removal of Nashorn from Java we can still offer both solutions. And to give you the choice, you need to install the appropriate add-on.


Hi :wave:,

I didn’t actually realize there was a performance difference from including GraalVM as a dependency, so that is something new to think about. @florian-h05 might find this fact interesting as well. Thanks @wirthi for letting us know! I want to give this some more thought and run a few tests on my home system, and maybe on a resource-constrained one like a Raspberry Pi 3, of which I have a few running around the house. I think running GraalVM as an alternative JVM is very much an option; after all, I think flexibility and choice in how people can run openHAB is a strength of ours. I’m out of town for a few days, so this may have to wait until next week. I’m also interested in this as I am still pursuing running matter.js in GraalVM as a possible Matter implementation, and optimized performance might be desirable.


This is totally misrepresenting the situation. You make it sound like the helper library is only there to slow everything down.

I’ve already outlined how this works. I’ll just post an example of the difference between using the helper library and not using it (which is the only way to “run fast”).

Here is a simple rule I run at sundown to reset a flag in the Item metadata that gets set when someone manually changes a light during the day.
With the helper library:

console.debug('Looping through the lights');
items.TOD_Lights_ON_WEATHER.members.forEach( light => {
  console.debug('Current override for ' + light.name + ' is ' + light.getMetadata('LightsOverride').value);
  light.replaceMetadata('LightsOverride', 'false');
});

Notice half the rule is just logging. Without the logging it’s a one liner

items.TOD_Lights_ON_WEATHER.members.forEach( light => light.replaceMetadata('LightsOverride', 'false') );

Without the helper library:

if(typeof(require) === "function") Object.assign(this, require('@runtime')); // makes code compatible with Nashorn or GraalVM JS Scripting
var logger = Java.type("org.slf4j.LoggerFactory").getLogger("org.openhab.model.script.Rules.rules_tools.Debounce");

// Get Metadata query stuff
this.FrameworkUtil = (this.FrameworkUtil === undefined) ? Java.type("org.osgi.framework.FrameworkUtil") : this.FrameworkUtil;
this.ScriptHandler = Java.type("org.openhab.core.automation.module.script.rulesupport.shared.ScriptedHandler");
this._bundle = (this._bundle === undefined) ? FrameworkUtil.getBundle(ScriptHandler.class) : this._bundle;
this.bundle_context = (this.bundle_context === undefined) ? this._bundle.getBundleContext() : this.bundle_context;
this.MetadataRegistry_Ref = (this.MetadataRegistry_Ref === undefined) ? bundle_context.getServiceReference("org.openhab.core.items.MetadataRegistry") : this.MetadataRegistry_Ref;
this.MetadataRegistry = (this.MetadataRegistry === undefined) ? bundle_context.getService(MetadataRegistry_Ref) : this.MetadataRegistry;
this.Metadata = (this.Metadata === undefined) ? Java.type("org.openhab.core.items.Metadata") : this.Metadata;
this.MetadataKey = (this.MetadataKey === undefined) ? Java.type("org.openhab.core.items.MetadataKey") : this.MetadataKey;

// Look up a metadata value (or one of its configuration keys) for an Item by name
var getValue = function(itemName, namespace, key) {
  var md = MetadataRegistry.get(new MetadataKey(namespace, itemName));
  if(md === null || md === undefined) {
    return null;
  } else if(key === undefined) {
    return md.value;
  } else {
    return md.configuration[key];
  }
}

// Add or update a metadata value for an Item by name
var setValue = function(itemName, namespace, value) {
  var key = new MetadataKey(namespace, itemName);
  var newMetadata = new Metadata(key, value, null);
  (MetadataRegistry.get(key) === null) ? MetadataRegistry.add(newMetadata) : MetadataRegistry.update(newMetadata);
}

itemRegistry.getItem("TOD_Lights_ON_WEATHER").getMembers()
            .stream().forEach(function(light) {
              logger.debug('Current override for ' + light.getName() + ' is ' + getValue(light.getName(), 'LightsOverride'));
              setValue(light.getName(), 'LightsOverride', 'false');
            });

Note, I didn’t exaggerate above. That’s close to the bare minimum required to implement the same rule in the same way.

Great to have an expert here :slight_smile:

Our GraalJS-based JavaScript Scripting add-on is working the way you think - we just include the required Maven artifacts.

I have actually already tried out GraalVM on both my dev and prod systems, and it is running fine on my production system (Debian x86_64). However I hadn’t realized that it makes such a large performance difference …

Unfortunately GraalVM as a JDK has no support for 32-bit ARM, which I guess is one of the main architectures openHAB is running on. At least older openHABian installations are 32-bit only, and I guess many people install 32-bit only because the openHABian release page, where you download the image, warns about increased memory usage with 64-bit.

So the best option would be to enable GraalVM as the optimizing compiler.
I will see if I can do a few performance tests.

@digitaldan Thanks for pinging me, otherwise I would have missed that very interesting conversation here.

Would it be feasible/reasonable to offer GraalVM where it’s supported and offer OpenJDK on 32-bit ARM?

This is more of a distribution/openHABian/documentation question than a technical question I think. I suppose it depends on the extent of the performance improvement from running on GraalVM.

Rich, just to show you one example. So really no rocket science behind it.
I’m running a rule for when my network camera is no longer reachable via a ping command.
Then I get a mail and a WhatsApp message.
Of those and other rules I have around 70.
In 3.4.4 I got the WhatsApp message in 1 sec; under 4.0.2 or 4.0.3 it takes 30 sec.
Also, a lot of my rules combine JS Scripting, Blockly and DSL in one rule…

DSL Part

val mailActions = getActions("mail","mail:smtp:123456") 
mailActions.sendMail("", "### ALARM ### ALARM ###", "Die Netzwerkkamera außen ist offline gegangen.")

JS Scripting Part

var HttpUtil = Java.type("")
var urlmessage = encodeURI("Die Kamera im Außenbereich ist OFFLINE gegangen.")
HttpUtil.executeUrl("GET", "" + urlmessage , 2000)

But the experts are here now and are talking positively about an upcoming solution.

FYI I have created [jsscripting] Use GraalVM's optimised compiler · Issue #15600 · openhab/openhab-addons · GitHub to discuss the technical implementation of integrating the GraalVM optimised compiler (if possible).


I’m not 100% sure what you are seeing is related.

Everything the experts are talking about here has to do only with the first run of the rule. Once it runs that first time it should run fast from that point onward. The part that is slow is loading the helper library and that only happens on that first run.

Note, every time you open and save the rule, you are forcing that first run again, so by doing that you are making the problem worse.
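The effect described above can be illustrated with a small sketch in plain JavaScript (no openHAB APIs assumed): an expensive one-time “library load” sits behind a cache, so only the first call pays the cost and every later call is fast.

```javascript
// Sketch of the first-run behaviour: the expensive "helper library load"
// happens once, then every subsequent run hits the cache.
const cache = new Map();
let initCount = 0; // counts how often the expensive load actually ran

function loadHelperLibrary() {
  if (!cache.has('helper')) {
    initCount++;                      // slow path: executed on the first run only
    cache.set('helper', { ready: true });
  }
  return cache.get('helper');         // fast path: every subsequent run
}

// Simulate three rule runs: only the first pays the loading cost.
loadHelperLibrary();
loadHelperLibrary();
loadHelperLibrary();
console.log(initCount); // 1
```

Editing, disabling, or reloading the rule is the analogue of clearing this cache, which is why the next run is slow again.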

If it is slow all the time, whatever is being discussed here isn’t going to fix it. As discussed in this and other threads, those reported cases of this behavior have been caused by overloaded machines.


To make this very interesting conversation clearer, I added “on the first run” to the title of the topic.

In my case it is exactly what Rich posted.

Once the library is loaded all my rules are as fast as with the latest OH3 version.


Rich, you are right…

BUT the fact that it runs faster from the 2nd run on only holds if the rule was not disabled between the 1st and 2nd run.
If a rule is, for example, normally enabled during the day and disabled during the night, and is enabled again the next morning, it shows exactly this effect: it is slow again.
This was not the case in 3.4.4…

BTW: In my case the rules are not opened, changed and saved. I only press the “play button” to run them for testing.

Why would one do that? Limit the rule so it does not run during night time by using a “But only if” condition instead.
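As a sketch of what such a condition could express: a “But only if” script condition in the UI evaluates to a boolean, gating the rule by time of day instead of disabling it. The helper below is plain JavaScript (no openHAB APIs assumed), and the hour boundaries are illustrative assumptions:

```javascript
// Plain JavaScript sketch of a time-of-day gate a "But only if" script
// condition could express; the 08:00-22:00 window is an assumption.
function isBetweenHours(date, startHour, endHour) {
  const h = date.getHours();
  return h >= startHour && h < endHour;
}

// Allow the rule only between 08:00 and 22:00:
console.log(isBetweenHours(new Date(2024, 0, 1, 12, 30), 8, 22)); // true
console.log(isBetweenHours(new Date(2024, 0, 1, 23, 15), 8, 22)); // false
```

Because the rule stays enabled, the helper library stays loaded and the slow first run is not retriggered every morning.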



Isn’t that what I said?

Note, every time you open and save the rule, you are forcing that first run again, so by doing that you are making the problem worse.

I admit I didn’t expressly say enable/disable but it’s the same thing.

You keep saying that. We know! And it’s irrelevant. You are not using 3.4.4. You are not using the same language as 3.4.4 any more. It’s a wholly new underlying language and technology. Why? Because the old 3.4.4 implementation was awkward to use, based on a ten-year-old version of JS which is not actively supported by anyone any more.

And we’ve already identified why and have people looking into that.

I’ll state the problem again, with more details.

The first time a rule is run after it is created, modified, enabled, or loaded (on system restart) it takes extra time for the helper library to be injected into the rule.

If your rules are slow and do not fit that description, open a new thread because nothing in this thread is going to be relevant.

If that does describe your rules, some work is being done to look into whether we can make this better for some users in some cases (if you are on a 32-bit Raspberry Pi OS you might be out of luck).

I also agree with @hmerk that this feels like an anti-pattern, though more details are necessary. Usually rule conditions are used for something like this.
