PersistenceExtensions.deltaSince behaves differently than expected when the delta window is larger than the existing value series

I created some rules today to monitor my gas consumption.
I use Homematic HM-ES-TX-WM and the corresponding dongle attached to my gas meter to read gas consumption.
HM-ES-TX-WM delivers a cumulative counter of gas consumed, in m³.
I had the idea to create a rule that computes rolling consumption values for 1h, 24h, and 30d.
The cumulative counter is initialized with 0 before receiving the actual values from the gas meter.

Here is the InfluxDB query result:

InfluxDB shell version: 1.8.10
> select * from GasCounterCum
name: GasCounterCum
time                     item          value
----                     ----          -----
2022-11-14T13:59:00.395Z GasCounterCum 0
2022-11-14T14:00:00.412Z GasCounterCum 5460.14013672
2022-11-14T14:00:00.94Z  GasCounterCum 5460.14013672
2022-11-14T14:05:00.943Z GasCounterCum 5460.14013672
2022-11-14T14:10:00.954Z GasCounterCum 5460.14013672
2022-11-14T14:15:00.964Z GasCounterCum 5460.14013672
..........
2022-11-14T17:06:00.346Z GasCounterCum 5460.18017578
2022-11-14T17:08:00.352Z GasCounterCum 5460.24023438
2022-11-14T17:10:00.939Z GasCounterCum 5460.24023438
2022-11-14T17:11:00.352Z GasCounterCum 5460.29980469
2022-11-14T17:14:00.529Z GasCounterCum 5460.33984375
2022-11-14T17:15:00.952Z GasCounterCum 5460.33984375
2022-11-14T17:16:00.356Z GasCounterCum 5460.37011719
2022-11-14T17:20:00.942Z GasCounterCum 5460.37011719
2022-11-14T17:25:00.941Z GasCounterCum 5460.37011719
2022-11-14T17:30:00.941Z GasCounterCum 5460.37011719
2022-11-14T17:35:00.951Z GasCounterCum 5460.37011719
2022-11-14T17:40:00.941Z GasCounterCum 5460.37011719

I would expect that using PersistenceExtensions.deltaSince with an interval reaching back beyond the timestamp of the 0 value (2022-11-14T13:59:00.395Z) would return a value different from 0.

This instruction gives me back a value of 0:

last24h     = PersistenceExtensions.deltaSince(ir.getItem("GasCounterCum"), ZonedDateTime.now().minusHours(24)).floatValue()

As long as the rolling window does not reach back beyond the initial 0 value (e.g. the 1h rolling window), deltaSince gives the right values back.

The 1h values come out correctly, while the 24h and 30d values are zero.
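
For reference, this is roughly what the rule looks like (a minimal sketch in Rules DSL; the trigger and variable names are placeholders, and it assumes the same imports as the instruction above):

rule "Rolling gas consumption"
when
    Time cron "0 0/5 * * * ?" // placeholder trigger: recalculate every 5 minutes
then
    val gas = ir.getItem("GasCounterCum")
    // each call returns the consumption over its window; when no persisted value
    // exists before the start of the window, no meaningful delta comes back
    val last1h  = PersistenceExtensions.deltaSince(gas, ZonedDateTime.now().minusHours(1))
    val last24h = PersistenceExtensions.deltaSince(gas, ZonedDateTime.now().minusHours(24))
    val last30d = PersistenceExtensions.deltaSince(gas, ZonedDateTime.now().minusDays(30))
    logInfo("gas", "1h=" + last1h + " 24h=" + last24h + " 30d=" + last30d)
end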

Is there an option which I missed here?

Internally the calculation is done by:

  • get value for start time
  • subtract that value from the current value

If your start time is before the first datapoint, then no value can be found and as a result the delta can’t be calculated.
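
Expressed in rule terms, it is roughly this (a sketch of the logic, not the actual implementation; names are placeholders):

// state persisted at (or immediately before) the start time
val start = PersistenceExtensions.historicState(ir.getItem("GasCounterCum"), ZonedDateTime.now().minusHours(24))
var Number delta = null
if (start !== null) {
    // subtract the starting value from the current value
    delta = (ir.getItem("GasCounterCum").state as Number) - (start.state as Number)
}
// if nothing was persisted before the start time, 'start' is null and no delta can be calculated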

If there are any historic values for the given calculation time span, returning “None” is definitely not logical.

If a data series like the one above exists, then it would be logical to behave as follows:

If the start time is before the first data point, then the first data point should be looked up and taken as the starting value (a rule-level sketch of this idea follows the rationale below).

Here is my rationale:

  • If a value exists before the start time, then that value is used as the starting (minimum) value of the calculation.
  • If no value exists before the start time, then the next/first value is used as the starting value for the delta calculation.
  • If there is just one value in the whole time series, the logical result would be 0.
  • If no values exist for that item, the result should be None.
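
For an ever-increasing counter like mine, this fallback can already be approximated at rule level (a sketch; names are placeholders, and minimumSince() only stands in for “first data point in the window” because the counter never decreases):

val gas   = ir.getItem("GasCounterCum")
val since = ZonedDateTime.now().minusHours(24)
val start = PersistenceExtensions.historicState(gas, since) // value at or before 'since'
val first = PersistenceExtensions.minimumSince(gas, since)  // earliest value in the window, for a counter that only increases
var Number delta = null
if (start !== null) {
    delta = (gas.state as Number) - (start.state as Number)  // current behaviour
} else if (first !== null) {
    delta = (gas.state as Number) - (first.state as Number)  // proposed fallback
}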

The alternative approach.
The function is deltaSince - “since” is between “now” and “then”.
There is no interest in any values in between “now” and “then”; this is not about a “span”, it is about two moments in time.
If “now” or “then” values are undefined, the delta cannot be calculated.

In normal circumstances there will never actually be a database record for the exact instant of “then”.
So we must search backwards in time for the next oldest record, to see what the value was immediately preceding “then”. Then we make the assumption that value is still current at instant “then” (otherwise we’d have recorded some new value).

If there is no record preceding “then”, the value is undefined.


To solve your actual issue, you might consider using minimumSince(), which does operate over a span.
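
Something along these lines (untested sketch, names are placeholders):

// consumption over the last 24 h for an ever-increasing meter reading:
// current value minus the smallest value seen in the window
val min24h = PersistenceExtensions.minimumSince(ir.getItem("GasCounterCum"), ZonedDateTime.now().minusHours(24))
if (min24h !== null) {
    val last24h = (ir.getItem("GasCounterCum").state as Number) - (min24h.state as Number)
    logInfo("gas", "Consumption last 24h: " + last24h)
}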


There’s nothing to stop you putting in a suggestion for changed behaviour.
I’d object, but it’s not my decision in any way :smiley:

There’s nothing special about deltaSince(), so any change in policy for determining value at past time “then” should also be extended to functions minimumSince(), historicState(), deltaBetween(), etc.
That’s a fairly extensive change and will be a breaking change for some users - worse, a silently breaking change in many cases.

In the case of historicState(), the user could at least see whether the returned timestamp was before or after the target “then”.
But in the case of say averageSince() the user will not be able to determine if “old” or “new” behaviour is at work.

You should refine the suggestion by explicitly defining the policy for what to do if the target “then” falls between a record “earlier” and a record “later” - this will be the case for every real-world use.
Should we always use “earlier”? What happens if “earlier” is two months earlier, and “later” is one second later? If we always use “earlier” when available (which it usually is), what is the point of considering “later” at all?
There are arguments either way, before we even begin to come up with stuff like weighted averaging to determine “then” value.

Remember this is a home automation system, not a general purpose database or data analysis tool. We really don’t need all the possible fancy stuff.
The tool already exists to deal with your “meter reading” type data: minimumSince().

I suppose what is missing from persistence extensions is a method to seek out the oldest record - firstUpdate(), perhaps?

If the “now” value is undefined, then there are no values at all, so returning “Undefined” is OK.
If the “then” value is undefined but the “now” value is defined, that implicitly means the delta is the “now” value.
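
In rule terms, the interpretation I mean would be roughly this (a sketch of the behaviour I would expect, not what the extension does today; names are placeholders):

val gas   = ir.getItem("GasCounterCum")
val delta = PersistenceExtensions.deltaSince(gas, ZonedDateTime.now().minusHours(24))
// if there is no "then" value, take the "now" value itself as the delta
val Number result = if (delta !== null) delta else (gas.state as Number)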

It is like the statement:

Way back in time (e.g. 1 billion years ago) there were no people on earth, and now there are 8 000 000 000. So deltaSince(1 billion years) should return “Undefined”?

That is the kind of behavior I am pointing to as not logical.

minimumSince should be OK if it traverses from now back down the timeline, finding the minimum value, and stops when there are no more values. The result would be better than returning Undefined.

historicState just returns the value if one exists, or Undefined if no value exists.

deltaBetween: same as minimumSince, starting from the last value and going down the timeline as long as values exist. The result would be better than returning Undefined.

I cannot imagine that this behavior would break any existing code, as Undefined is always a pain in the butt and has to be specially handled anyway.

I will issue a change request. Could you point me to the github project please?

Have you tried minimumSince() yet, to solve your unique problem with an incomplete dataset, before asking for a system redesign?

Sure, it’s called openhab.

Well, yes. We have no idea what the value of measurement X was that long ago. It is not possible to calculate a delta. You must take the general case, not your unique case. Taking your human population example, your proposal would return a delta of 8 billion since last year (because your dataset only goes back a month or two)?

Another specific example: other users will be persisting, say, “average temperature”. There really is a delta since 1 billion years ago, but we cannot know what it is because we don’t have the data. The delta since last year is a very poor substitute.

Yes. “Good”, in my opinion. Because I think requesting data from before the dataset started is a special case?

Poor is better than nothing, as long as I am not able to find my oldest value and its timestamp (e.g. via a firstUpdate()).

Anyway, I made a workaround by putting a very, very old value into my reading values.

Btw, is persistence part of the core or the add-ons?

I agree with @rossko57. If we have an older value, it’s safe to assume that an update would have occurred if the value was different at the requested time. If we don’t have a value, it makes no sense to calculate a delta simply because we don’t know what would be the correct value.

I do not get your argumentation, and I give up.
I will work around these problems by setting initial values way back into the past.

Or use minimumSince(), which will give the answer you desire in the case of an ever-increasing “meter reading”.

In the more general case of arbitrary values, there is no safe way to guess what the value would be at a time before the dataset begins.