I’m trying to achieve the following: I want to compare different calculations from my persistence service (InfluxDB). openhab-js offers a nice API for calculating time-weighted averages.
In Home Assistant there are two properties that do something similar: average_linear and average_step. You can also query average_timeless.
Is there something similar for openHAB? And which interpolation is currently used for averageSince, averageBetween and averageUntil? According to this implementation, I assume it is step interpolation, right?
I saw that there are internalMean[since|until|between] and internalMedian[since|until|between] in openHAB core, but I guess they are not exposed to the JS API?
OH only provides one type of average calculation, and based on the code it looks like OH implements something close to what HA calls a step average.
Time-weighted averages take into account not only the values of a variable, but also how long the variable stays at each value. For instance, if you are measuring the temperature in a room, you weight each reading by the amount of time until it changes. A brief example: say the room is at 18 °C for 13 hours a day, 21 °C for 7 hours, and 16.5 °C for 4 hours.

1. Multiply each value by its duration: 18 °C × 13 h, 21 °C × 7 h, and 16.5 °C × 4 h (234, 147, and 66, respectively).
2. Sum the values you obtained: in this case, 447 °C·h.
3. Add together the time weights to get the total weight: 13 h + 7 h + 4 h = 24 h.
4. Divide the sum from step 2 by the total weight from step 3: 447 °C·h / 24 h = 18.625 °C.
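The same arithmetic in JavaScript (just the worked example above, not persistence code):

var samples = [
  { value: 18, hours: 13 },
  { value: 21, hours: 7 },
  { value: 16.5, hours: 4 }
];
var weightedSum = samples.reduce((sum, s) => sum + s.value * s.hours, 0); // 447 °C·h
var totalHours = samples.reduce((sum, s) => sum + s.hours, 0); // 24 h
console.log(weightedSum / totalHours); // 18.625 °C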
So as I understand it, it uses weights rather than interpolation, but I’m not sure it matches any of the average types HA describes.
Other types of averages could of course be added, I suspect, but we also run the risk of adding a whole lot of confusion if we have too many choices. I’d be surprised if average OH users (pun intended) knew the differences between these and which is appropriate in which circumstances. So that needs to be weighed against the complexity of adding a bunch of new average functions.
Of course, using any of the “getAllStates*” actions gives you the raw data to calculate the average however you wish.
I agree, with getAllStates you could implement any algorithm you can imagine. I tried that, but it is terribly slow compared to the code executed in Java behind the JS wrappers.
The use case is this: one calculates an integral, i.e. the area under the curve, by taking the average and multiplying it by the time span. Whether a linear or stepwise approach fits best (which approximation is closest to the truth) then depends on the persistence strategy and filters.
More specifically: I measure the power of my heat pump and want to calculate the energy consumption. The heat pump runs intermittently, so there are always 0 W values in the recording, but I persist those only every 15 minutes (all values above 0 W are persisted every 5 seconds). The step variant is suitable here, but there are use cases where linear interpolation makes more sense.
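To illustrate with the built-in function (a minimal sketch; the Item name HeatPump_Power is hypothetical, and depending on your openhab-js version the API lives under item.persistence or item.history and may return a plain number instead of a PersistedState):

var start = time.toZDT().minusHours(24);
var avg = items.getItem('HeatPump_Power').persistence.averageSince(start);
if (avg !== null) {
  // time-weighted average power (W) * elapsed time (h) = energy (Wh)
  console.log('Consumption: ' + avg.numericState * 24 + ' Wh');
}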
The Median functions are available in JS scripting, just like the Average functions. The Average functions correspond to the step averages in Home Assistant. There are no Mean functions at all (not even internally). While they could be added, they were considered less useful. But if you are interested, open an issue in the core repo, and I may look into it when I have time.
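For example (the Item name Temperature is hypothetical; medianBetween and medianUntil work analogously):

var median = items.getItem('Temperature').persistence.medianSince(time.toZDT().minusDays(1));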
I doubt JS adds that much overhead. I suspect it’s the persistence itself that makes it slower.
But you can use Rules DSL, or in JS skip the wrappers entirely by using var Persistence = Java.type("org.openhab.core.persistence.extensions.PersistenceExtensions");. That imports the raw Java class that implements the persistence actions. Of course, that means anything you get from calls to that class will also be raw Java, so you’ll have to treat it accordingly. For example, you can’t use JS methods to iterate and map/reduce a java.util.List.
Here is my script. The JavaScript approach (LOCF weighting, i.e. last observation carried forward), which follows the same calculation approach, takes almost 15 times as long.
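In essence the loop does something like this (a simplified sketch, not the full script; the Item name HeatPump_Power is hypothetical):

var begin = time.toZDT().minusHours(24);
var states = items.getItem('HeatPump_Power').persistence.getAllStatesBetween(begin, time.toZDT());
var weightedSum = 0;
var totalMs = 0;
for (var i = 0; i < states.length - 1; i++) {
  // LOCF: each value holds until the timestamp of the next sample
  var dt = states[i + 1].timestamp.toInstant().toEpochMilli() - states[i].timestamp.toInstant().toEpochMilli();
  weightedSum += states[i].numericState * dt;
  totalMs += dt;
}
var average = totalMs > 0 ? weightedSum / totalMs : null;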
All I can say is I still doubt it’s the JS wrappers causing problems. But you can prove it one way or the other by using the raw Java.
var PersistenceExtensions = Java.type("org.openhab.core.persistence.extensions.PersistenceExtensions");
...
var states = PersistenceExtensions.getAllStatesBetween(items.getItem(item).rawItem, midnight, now);
...
states.forEach((hi) => {
  // hi is the Java HistoricItem, not a JS PersistedState
  ...
});
I think you can use a JS-Joda ZonedDateTime here. If not, you’ll need to import and use a java.time.ZonedDateTime.
I’m 100% sure, as you can see from the “time tracking” in my script. The loop consumes 14 seconds. If I slice the states array down to 5 elements, the loop takes less than 0.1 seconds.
I found out that access to state.timestamp consumes most of the time, so handling JS-Joda ZonedDateTime objects is pretty slow.
OK, but I’ve also provided code above that lets you use the raw Java for everything inside JS. If you use that Java class, you won’t be calling state.timestamp at all. You’ll be calling hi.getInstant(), which returns a java.time.Instant. That completely bypasses all the JS wrappers and everything else that can cause such slowdowns.
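For example, the LOCF loop could look like this on the raw Java objects (a sketch building on the PersistenceExtensions snippet above; parsing the State’s string with parseFloat is just one simple way to get at the number):

var prevMillis = null;
var prevValue = null;
var weightedSum = 0;
var totalMs = 0;
states.forEach((hi) => {
  var millis = hi.getInstant().toEpochMilli(); // raw java.time.Instant, no JS-Joda wrapper
  var value = parseFloat(hi.getState().toString());
  if (prevMillis !== null) {
    // LOCF: the previous value holds until this sample's timestamp
    weightedSum += prevValue * (millis - prevMillis);
    totalMs += millis - prevMillis;
  }
  prevMillis = millis;
  prevValue = value;
});
var average = totalMs > 0 ? weightedSum / totalMs : null;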
openhab-js just wraps the functions provided by OH core. Any new calculations should be implemented there. It’s definitely worth an issue as @Mherwege suggested. It’s not hard to implement by any means.
As for the slowness, I’m not seeing the same slowdown, but I’m running on a relatively fast machine so I wouldn’t necessarily notice. That too is worth an issue on the openhab-js repo to see if anything can be done. Historically, messing around with date-times tends to be expensive, so there may not be anything that can be done there, but where performance is an issue you can use the raw Java.
I have not filed issues yet, but I implemented a pretty accurate calculation for power consumption in a JS rule. This could be a nice addition to openHAB core and the rule APIs (JS and DSL). Input is an Item with a series of power readings; the rule calculates the Riemann sum with the midpoint strategy.
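The core idea looks something like this (a simplified sketch, not my full rule; each sample is weighted from the midpoint towards its previous neighbor to the midpoint towards its next neighbor):

// states: an array of { millis, value } pairs, e.g. built from getAllStatesBetween
function riemannMidpoint(states) {
  var sum = 0;
  for (var i = 0; i < states.length; i++) {
    var left = (i === 0) ? states[0].millis : (states[i - 1].millis + states[i].millis) / 2;
    var right = (i === states.length - 1) ? states[states.length - 1].millis : (states[i].millis + states[i + 1].millis) / 2;
    sum += states[i].value * (right - left); // W * ms
  }
  return sum / 3600000; // convert W·ms to Wh
}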