HABApp persistence data

What could be a good/better way of implementing the equivalent of the following rules code in HABApp?


energy_usage_60_minutes = accumulated_energy.deltaSince(now.minusMinutes(60))

So far I have come up with the following, but I'm not really happy with it, and I must also handle the possibility that "None" is returned from the get_persistence_data query.


_energy_usage_60_minutes = (
    self.accumulated_energy.value -
    self.accumulated_energy.get_persistence_data(
        persistence="rrd4j",
        end_time=datetime.now() - timedelta(minutes=60)
    ).max()
)
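To deal with the missing-data case, one option is a small None-safe helper around the subtraction. This is only a sketch; the name `delta_or_none` is mine, not part of HABApp:

```python
from typing import Optional

def delta_or_none(current: Optional[float], historic: Optional[float]) -> Optional[float]:
    """Return current - historic, or None when either value is missing.

    This mirrors deltaSince(), which also returns null when no stored
    value exists for the requested time.
    """
    if current is None or historic is None:
        return None
    return current - historic
```

The rule can then check for None once, instead of letting a TypeError surface inside the subtraction.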

How often do you want to run this? The only other idea that I have is to use an AggregationItem

In “rules”, I run it a lot, without any consideration of “load”. The data I calculate is things like energy usage for the last month, 24 h, week, hour etc. for utility grid import, warm water, heat production etc. (for the purpose of visualizing the data in a plot/graph; I'm using the openHAB plot and not Grafana).

It seems that in HABApp the .get_persistence_data query can be load-intensive, so as you seem to imply, I need to take care to limit its usage. Maybe it's better to keep this functionality in “rules”, as that implementation is compact and clear?

I guess the AggregationItem is considerably less load-intensive? One downside among others is that I would need to wait a full 30 days before the month-average value is considered valid again.

If you want to run this every second it might be the wrong way to do things.
It’s load intensive on the openHAB side because you query all the values of the last hour.

Imho it would be more elegant to use the aggregation item and use a startup rule to populate the values from persistence.

Or just leave this part of your rules in openHAB, if you have a working solution there is no need to migrate it.

Thanks for the suggestion, I will give it a go with the AggregationItem and an initialization routine. It kind of feels good to keep the functionality in one place.

I'm struggling a bit to understand the AggregationItem and (in the next step) how I should feed the get_persistence_data result into the AggregationItem.

I've assigned a NumberItem as the source of the AggregationItem. Is it OK to use a NumberItem as the source?

        self.total_energy = NumberItem.get_item("vvb_energy_total")
        self.ag_item_total_24h = AggregationItem.get_create_item('ag_item_total_24h')
        self.ag_item_total_24h.aggregation_func = max

If I let the NumberItem populate the AggregationItem, the result (.value / get_value()) is shown as ‘deque([58976.2, 58976.2, x, and further …])’. I see no change if I alter aggregation_func from max to min, for instance. I expected the output to be a single float, so what am I doing wrong?

print(f'Agr item value: {self.ag_item_total_24h.value}')
print(f'Agr item get_value: {self.ag_item_total_24h.get_value()}')
print(f'Agr item: {self.ag_item_total_24h}')

Agr item value: deque([10223054.5, 10223069.0, 10223069.0, 10223096.4, 10223096.4, 10223096.4, 10223114.2, 10223125.1, 10223125. ...
Agr item get_value: deque([10223054.5, 10223069.0, 10223069.0, 10223096.4, 10223096.4, 10223096.4, 10223114.2, 10223125.1, 10223 ...
Agr item: <AggregationItem name: ag_item_total_24h, value: deque([10223054.5, 10223069.0, 10223069.0, 10223096.4, 10223096.4, 10 ...

Instead of this

self.ag_item_total_24h.aggregation_func = max

try this:
self.ag_item_total_24h.aggregation_func(max)


It seems that the docs are still showing how to use an early version of the item.
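To illustrate why the assignment form fails (assuming the current item configures the function via a method call, as the reply suggests), here is a toy stand-in, not the real HABApp class:

```python
from collections import deque

class Aggregation:
    """Toy stand-in for an aggregation item (NOT the real HABApp class)."""

    def __init__(self):
        self._values = deque()
        self._func = None

    def aggregation_func(self, func):
        # Current-style API: configure the function via a method call
        self._func = func
        return self

    def add(self, value):
        self._values.append(value)

    def get_value(self):
        # Without a configured function, the raw deque leaks out -
        # which is exactly the symptom seen above
        if self._func is None:
            return self._values
        return self._func(self._values)

agg = Aggregation()
agg.aggregation_func(max)
for v in (10223054.5, 10223069.0, 10223096.4):
    agg.add(v)
print(agg.get_value())  # 10223096.4
```

Writing `agg.aggregation_func = max` instead would merely shadow the method with the built-in, so the internal function is never set and get_value() keeps returning the deque.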

Ah, that's better :). OK, so the AggregationItem now works as expected with the NumberItem source. The next step is to populate it from get_persistence_data. As I understand the functionality, the AggregationItem needs the timestamp of each added value, and I suppose I should use the .set_value() call.

get_persistence_data returns a ‘dict’ with timestamp/value pairs. Can/should I assign this dict directly through ag_item.set_value(), or should I add the values individually? And in that case, how do I add the timestamp together with the value?

In my example the ‘accumulated total energy’ is stored once every minute, meaning that for 24 hours there are 1440 values stored. Is there some limitation on how many values the AggregationItem can accept (will there be a case where I need to ‘downsample’)?
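On the downsampling question: if 1440 one-minute values turn out to be too many, thinning the dict before feeding it in is straightforward. A sketch, where `downsample` and `keep_every` are hypothetical names of my own, not HABApp API:

```python
def downsample(data: dict, keep_every: int = 5) -> dict:
    """Keep every n-th entry of a timestamp -> value mapping, ordered by timestamp."""
    items = sorted(data.items())
    return dict(items[::keep_every])

# 1440 one-minute samples thinned to 288 five-minute samples
raw = {float(ts): 58000.0 + ts for ts in range(1440)}
reduced = downsample(raw, keep_every=5)
print(len(reduced))  # 288
```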

    def _init_agr_item(self, source: BaseValueItem, name: str, period: timedelta) -> 'AggregationItem':
        ag_item = AggregationItem.get_create_item(name)
        # Fetch the whole period in one query and feed the stored values into the item
        persistence_query = self.openhab.get_persistence_data(
            source, persistence="rrd4j",
            start_time=datetime.now() - period, end_time=datetime.now())
        data = persistence_query.get_data()
        if len(data) != 0:
            for value in data.values():
                ag_item.set_value(value)
        rule_log(f'Item {name} initiated with value {ag_item.get_value()}', cls=self, level=logging.DEBUG)
        return ag_item

I'm having second thoughts: I think I should stick to persistence data in my case, at least when data older than say 6-10 hours is involved. However, I should do something to narrow down the time window (start/end time) in the persistence query to limit the load.

It might help to have a parameter limiting the number of items returned from the persistence query, i.e. return the first x items newer than start_time = now() - timedelta(hours=24). On the other hand, that should also be manageable with start/end time plus some extra logic around it, so I will start with that approach.

OK, final solution implemented through a helper rule class: a ‘persistent data worker’ that memorizes the ‘oldest’ valid persistent data and narrows down the search time window (start/end time) for queries.

Not as compact as the openHAB rules implementation, but it's running smoothly and I'm happy with the result.
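The ‘persistent data worker’ itself isn't shown in the thread. As a rough stand-alone sketch of the idea (remember where the previous query ended and only fetch the gap since then), with all names being my own invention:

```python
from datetime import datetime, timedelta
from typing import Optional, Tuple

class PersistenceWindowTracker:
    """Remembers the end of the previous query so the next one only covers the gap."""

    def __init__(self, initial_lookback: timedelta):
        self._initial_lookback = initial_lookback
        self._last_end: Optional[datetime] = None

    def next_window(self, now: datetime) -> Tuple[datetime, datetime]:
        # First call: query the full lookback; afterwards only the unseen tail
        start = self._last_end if self._last_end is not None else now - self._initial_lookback
        self._last_end = now
        return start, now

t0 = datetime(2023, 1, 1, 12, 0)
tracker = PersistenceWindowTracker(timedelta(hours=24))
first = tracker.next_window(t0)                           # covers the full 24 h
second = tracker.next_window(t0 + timedelta(minutes=10))  # covers only the last 10 min
```

Each query's start/end pair can then be passed straight to get_persistence_data, keeping the queried window (and the load) small after the initial fill.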
