Hey there,
I have a problem using persistence in rules.
When I use the “changedSince” or “minimumSince” function, my system freezes after a few hours.
The rule gets triggered every minute.
If I disable the rule, everything runs fine.
System info:
openHAB 3.0.0 (Raspbian installation)
RRD4J (default setup)
Rule Script:
//val lastChange1 = ClimaSensBadFenster_Fenster.changedSince(now.minusMinutes(11))
val lastChange2 = ClimaSensBadFenster_Fenster.changedSince(now.minusMinutes(10))
if (ClimaSensBadFenster_Fenster.state == OPEN) {
    //if (lastChange2 == false && lastChange1 == true) sendCommand(EchoDot_Befehl, "kündige an Badfenster schließen")
}
Clearly there is something wrong and it shouldn’t do this. But the description of the rule sounds like an antipattern: why run a rule every minute to query the database? Perhaps there is a better way.
Perhaps someone can spot it. I don’t see anything wrong, and without something in the logs to indicate the cause, there is nothing to go on.
But like the old joke goes: “Doctor, it hurts when I move my arm like this!” The doctor replies, “Then stop moving it like that.” Since the overall approach seems off, you can fix both problems in one go. Or you can wait.
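One event-driven alternative, as a sketch only (it reuses the item names from the posted rule; the 10-minute delay and the timer variable are my assumptions): react to the window changing and arm a timer, instead of polling persistence from a cron rule every minute.

```
// Sketch: announce once the window has been open for 10 minutes,
// without querying persistence on a schedule.
var Timer windowTimer = null

rule "Announce open bathroom window"
when
    Item ClimaSensBadFenster_Fenster changed
then
    if (ClimaSensBadFenster_Fenster.state == OPEN) {
        windowTimer = createTimer(now.plusMinutes(10), [ |
            EchoDot_Befehl.sendCommand("kündige an Badfenster schließen")
            windowTimer = null
        ])
    } else {
        // Window closed in time: cancel the pending announcement
        windowTimer?.cancel
        windowTimer = null
    }
end
```

This way the rule only runs on actual state changes, and the database is never touched.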
Regular quotes do not preserve whitespace, and that whitespace is important.
The OutOfMemoryError is the root problem. Once you see that error, any subsequent error isn’t all that meaningful: just about anything OH does will fail, because it doesn’t have any memory to work in. But that doesn’t mean the thing reporting the error is the cause of the OutOfMemoryError. There are some threads with other users seeing 100% CPU and out-of-memory errors with Rules DSL rules created in the UI, so you might be hitting that, and the persistence problem may be a red herring (a misleading clue).
Do you see the “failed to execute rule” errors before the OutOfMemoryErrors?
The OutOfMemoryError appears several times before the failed rule executions, which explains why the system slows down before the freeze.
Until now I have always looked at my memory and swap usage, which is fine all the time.
Is there a way to log java heap space usage?
What could be a solution for this?
I could just increase the Java heap space, but I don’t think that would solve the main issue.
Could it help to set the variables to null after use?
Or should I call System.gc() in the rule to trigger the garbage collector?
There were some issues with DSL rules created in the UI causing the system to hang. Mine would do it every day or so; disabling the two DSL rules I had resolved it.
There was something noted in 3.0.1 that may have addressed that issue, and this thread was a good reminder for me to turn those rules back on and check!
@rlkoshak I have found a post where you described how to log the heap usage:
How can I do this with the exec binding in openHAB?
You run the command “shell:info” in the openHAB console, but the user “openhab” does not know this command …
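As an alternative that avoids the console entirely, here is a hedged sketch of logging the JVM heap from inside a rule. Runtime is plain Java, so this should work in Rules DSL; the rule name, logger name, and one-minute cron schedule are my own choices.

```
rule "Log heap usage"
when
    Time cron "0 * * * * ?"   // every minute, at second 0
then
    val rt = Runtime::getRuntime
    // Used heap = total allocated minus currently free, in MB
    val usedMB = (rt.totalMemory - rt.freeMemory) / 1024 / 1024
    val maxMB = rt.maxMemory / 1024 / 1024
    logInfo("heap", "Used " + usedMB + " MB of " + maxMB + " MB max")
end
```

Watching this log over a few hours should show whether heap usage climbs steadily toward the maximum before the freeze.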
I have done a little bit of testing and found something interesting.
It looks like the persistence service is not the problem.
I changed the rule to just get the time, and my system still freezes after some time:
var time1 = now.minusMinutes(1);
var time2 = now.minusMinutes(2);
var time3 = now.minusMinutes(3);
var time4 = now.minusMinutes(4);
var time5 = now.minusMinutes(5);
Is there any other way to get the time, or to “free” it afterwards?
I have also thought about making the GC more aggressive, but found nothing about how to do this.
In the end, it’s neither the persistence service nor the time variables.
It’s just the DSL rule running, like @jace wrote …
Here is the link to the GitHub pull request:
I’m now using a workaround so that I only trigger the DSL script if the window is open.
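One way to read that workaround, as a hypothetical sketch only (the rule name and the early-return guard are my reconstruction, using the item names from the original post): keep the trigger, but bail out immediately unless the window is actually open, so the DSL body almost never runs.

```
rule "Badfenster reminder"
when
    Time cron "0 * * * * ?"
then
    // Exit immediately unless the window is open, so the rest of the
    // DSL script (and the persistence query) rarely executes
    if (ClimaSensBadFenster_Fenster.state != OPEN) return;

    if (ClimaSensBadFenster_Fenster.changedSince(now.minusMinutes(10)) == false) {
        EchoDot_Befehl.sendCommand("kündige an Badfenster schließen")
    }
end
```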
I would be careful about assuming this is the same issue as the PR. We are definitely having a memory issue on some lower-powered systems, and the PR only tries to contain it by reintroducing a thread pool for rules, which can be turned on/off depending on the system. If you want to test, there is a jar available in the PR to see whether it solves your problem or not.