Could anybody point me in the right direction for loading historical/batched data into persistence? I’ve got a JSON web service, which I need to poll, that provides data for the last few hours. I want to make those historical values available in OH; I’m currently using rrd4j.
The data isn’t necessarily evenly spaced or reliable; it arrives when it arrives, in groups of various sizes. But I do get a timestamp with each value, so I should be able to target the historical slot. I just don’t know how (or whether it’s possible) to write these values into the history at those particular timestamps, and either skip or overwrite timestamps that already have a value.
I appreciate that I might need to switch to InfluxDB or something similar for this. The other alternative: how feasible is it to plug OH into a non-openHAB-managed instance of InfluxDB, i.e. one I can populate out of band but that OH still uses for historical data?
1. using (and/or showing) that data within openHAB, i.e. in rules or graphs?
2. just “to have it” and/or get some dashboard overviews (not necessarily within openHAB itself)?
ad 1
yes, you would have to save it in an additional persistence service like SQL or InfluxDB.
but: I don’t know of a way to change data in the persistence, or to write “historic” data at a later date, with openHAB tools.
Of course you could INSERT/UPDATE that data in your SQL database, which then in turn automagically can be used in openHAB. But you’ll have to write those statements yourself (i.e. you need to know the database structure for that).
ad 2
you can send the data directly to, let’s say, InfluxDB for dashboards there or in Grafana, but you’ll also have to include the timestamps in those writes.
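A minimal sketch of what such an out-of-band write could look like, assuming InfluxDB 1.x with its `/write` HTTP endpoint; the database and measurement names here are made up for illustration:

```python
# Hedged sketch: writing historical points straight into InfluxDB 1.x with
# explicit timestamps, bypassing openHAB. Database/measurement names are
# placeholders, not anything openHAB prescribes.
from datetime import datetime, timezone
from urllib import request


def to_line(measurement: str, value: float, ts: datetime) -> str:
    """Build one InfluxDB line-protocol entry with a nanosecond timestamp."""
    ns = int(ts.timestamp() * 1_000_000_000)
    return f"{measurement} value={value} {ns}"


def write_points(lines: list[str], db: str = "openhab_db",
                 host: str = "http://localhost:8086") -> None:
    # InfluxDB 1.x accepts newline-separated line protocol on POST /write
    body = "\n".join(lines).encode()
    req = request.Request(f"{host}/write?db={db}", data=body, method="POST")
    request.urlopen(req)  # raises on HTTP errors


ts = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
line = to_line("OutsideTemperature", 21.5, ts)
# write_points([line])  # uncomment when an InfluxDB instance is reachable
```

Because the timestamp travels with each point, uneven or late-arriving batches land at their original times rather than at write time.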
As for the use case: at the very least dashboarding, but I’m possibly also looking at interrogating historical values to make decisions about future changes. The update frequency of the data isn’t that great at the moment, though, so I might not be able to realise that part.
I haven’t looked into the persistence engines much and have viewed them as a “black box”. Is the idea of having openHAB talk to a ‘shared’ InfluxDB or similar a workable approach? openHAB reads the persistence data; another process of mine handles the inserts.
Sounds like an X-Y problem to me.
Since you’re unsure in the first place what that data would be good for, you’d better pin down your use case before you start looking for a way to implement it.
I believe you misunderstand how this is supposed to work. Persistence tracks openHAB item states. If configured correctly it’ll record every change the moment the item changes, so if you have properly set up an OH item to track your external data source, default persistence will allow for charting it. So put your effort into mirroring your external data into OH items.
However if you’re looking to copy and graph some external data that isn’t available as openHAB items, persistence is the wrong tool for that. It’s persistence, not a general-purpose database interface. Only OH itself should write to it, no inserts into DB from the outside allowed.
PS: you could use the new TimeSeries data type if you really need to get data in.
OH 4.1 introduced the ability for persistence to store and use future values. However, support for that in the UIs and rules has lagged behind and is not yet fully available everywhere.
As part of the support for rules I believe they are implementing a version of the persist action that takes a timestamp to insert data at a given time.
However, none of this works with rrd4j because of the way rrd4j works internally.
Therefore, right now the only way to achieve this is to insert the data using the OH REST API which has an endpoint where you can insert a value with an arbitrary timestamp.
If you format the time series correctly and name your Items in OH correctly you should be able to update the data outside of OH and OH will see it.
oh, that’s great, I didn’t know that one. You mean /persistence/items/{itemname}? Does it only INSERT, or can the wrapper also UPDATE if the timestamp is identical (MySQL uses the timestamp as the unique key)? The Swagger docs aren’t clear on that…
you could use the same tables for openHAB and InfluxDB/Grafana if you know what you’re doing, yes. But I wouldn’t recommend it. openHAB uses persistence for restoring state after restarts (at least the service defined as the “default persistence service”), and you could mess things up here. Or some major upgrade changes the way persistence is used and you have to rework your InfluxDB setup.
That’s correct.
What I did while migrating one of my openHAB instances was simply to INSERT the “old values” into MySQL, but for that you’ll have to know the structure. The “items” table lists all OH items, and each item’s ID maps to its value table, e.g. item0001 (you can configure the ID width in the JDBC persistence configuration; it defaults to 4 digits, i.e. item0001 through item9999).
So an
INSERT INTO `openHAB`.`item0001` (`time`, `value`) VALUES ('<historic timestamp>', '<value>');
will then create an entry for that timestamp. The REST API will do that for you, as Rich pointed out.
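On the earlier question of overwriting an existing timestamp: since the timestamp is the unique key, a plain INSERT fails on duplicates, while an upsert handles both cases. A runnable sketch using SQLite’s upsert syntax (MySQL would spell the same idea `INSERT ... ON DUPLICATE KEY UPDATE value = VALUES(value)`); the table layout mimics the two-column JDBC value tables described above:

```python
import sqlite3

# Demonstrates overwrite-on-duplicate-timestamp with an upsert; the table
# mirrors the time/value layout of a JDBC persistence item table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item0001 (time TEXT PRIMARY KEY, value TEXT)")


def upsert(time: str, value: str) -> None:
    conn.execute(
        "INSERT INTO item0001 (time, value) VALUES (?, ?) "
        "ON CONFLICT(time) DO UPDATE SET value = excluded.value",
        (time, value),
    )


upsert("2024-01-01 12:00:00.000", "20.0")
upsert("2024-01-01 12:00:00.000", "21.5")  # same timestamp: overwrites
row = conn.execute("SELECT value FROM item0001").fetchone()
print(row[0])  # prints 21.5
```

To skip duplicates instead of overwriting them, `ON CONFLICT(time) DO NOTHING` (MySQL: `INSERT IGNORE`) does the trick.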
But: first off, try to define your “real” use case; the rest is just spitballing here!
To analyze this further, switch on debug logging for the JDBC persistence add-on and log an issue in the add-ons repository. I can’t analyze it further myself as I don’t use JDBC, but that way someone may look at it.