Overwrite item persistence data

Dear all,
from time to time I have the issue that my item persistence data gets messed up …

For example, the measured values of my smart power meter were bad for the last few weeks because the infrared sensor was not mounted correctly.

Now it looks like this:

and now the final evaluation of my power consumption is garbage:

To clean the data, I want to overwrite the sensor values from 2024-07-31 to 2024-08-01. This will fix the monthly evaluation.

Using the API Explorer, I tried the following:

http://192.168.0.198:8080/rest/persistence
[
  {
    "id": "rrd4j",
    "label": "RRD4j",
    "type": "Queryable"
  }
]

and then I wanted to PUT the overwrite data, but there is no endtime parameter available for the PUT request.
So I tried DELETE first:

http://192.168.0.198:8080/rest/persistence/items/Stromzaehler_Stromzahler_EingangEnergie?serviceId=rrd4j&starttime=2024-07-31%27T%2723%3A00%3A00&endtime=2024-08-01%27T%2701%3A00%3A00

But I am getting an error message:

{
  "error": {
    "message": "Persistence service not modifiable: rrd4j",
    "http-code": 400
  }
}

Can someone help?

RRD is queryable, but not modifiable. Maybe rrd4j can internally handle some modifications, but the openHAB persistence service currently does not support that.

Thank you for the answer.
This is sad. So it is not possible to clean up sensor data afterwards.
Any other possibilities? :frowning:

Is it maybe possible to transfer the data, including its history, to another persistence service and edit it there?
Maybe from rrd4j to InfluxDB, and then clean it up in InfluxDB?
Not for all items, but for the ones that need to be cleaned up.

I made myself a migration tool a while ago to copy data from one item to another; it is also able to work across different persistence services. You can also script this: first fetch the data from rrd4j, then push it to InfluxDB. You can have both services running in parallel in openHAB, with one as the default and the other used as an optional data source (use serviceId, e.g. when defining chart data series).
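The scripted route can be sketched like this: openHAB's REST API returns the rrd4j history as JSON, which can be converted to InfluxDB 1.x line protocol with `jq`. The sample values, database name, and host below are placeholder assumptions; only the field names (`name`, `data`, `time`, `state`) follow the openHAB persistence REST response format:

```shell
# Sample of what GET /rest/persistence/items/<item>?serviceId=rrd4j&... returns
# (stand-in data for this sketch; normally you would fetch it with curl)
cat > sample.json <<'EOF'
{"name":"Stromzaehler_Stromzahler_EingangEnergie",
 "data":[{"time":1722466800000,"state":"1234.5"},
         {"time":1722470400000,"state":"1235.1"}]}
EOF

# Convert each datapoint to line protocol: <measurement> value=<state> <time>
jq -r '.name as $n | .data[] | "\($n) value=\(.state) \(.time)"' sample.json > lines.txt

# The time values are milliseconds since the epoch, hence precision=ms.
# Push to a local InfluxDB 1.x (adjust host and db to your setup):
# curl -XPOST 'http://localhost:8086/write?db=openhab&precision=ms' --data-binary @lines.txt
```

The actual write command stays commented out here, since it needs a running InfluxDB and your real database name.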

Thanks for your answer.
I have now managed to install InfluxDB and Grafana on my openHAB Raspberry Pi.
I also extracted my item's data in JSON format.
I can now clean the data locally.

But I don't know how to re-import the cleaned data into InfluxDB and into a new openHAB item.
Also, how do I set the persistence service to InfluxDB for only this single item?

Can someone maybe help me with this? :slight_smile:
Thank you!

Hello, I installed the InfluxDB persistence add-on in openHAB 4, created a new Number item, and linked it to the Thing channel. I did not set up any semantic class, but the item is still logged with rrd4j.
I also added the item to the InfluxDB persistence configuration.
In Grafana I can display all my openHAB items.

Can someone help me with:

  • Setting up the persistence for this single item to InfluxDB only
  • Manually adding the cleaned history data to InfluxDB (maybe via Grafana?) as JSON or CSV

Thank you

Both topics are quite easy to handle:

  • persistence for the single (or even several) item(s): just create an influxdb.persist file in the same directory where the rrd4j.persist file is saved.
  • manually adding data into InfluxDB 2.x:
    on the admin web page of your InfluxDB installation, click on “Load Data”, “Sources”, then “Upload a CSV”
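For the first point, a minimal influxdb.persist sketch, using the item name from the first post (the strategies are just an example, adjust to your needs):

```
Strategies {
    everyMinute : "0 * * * * ?"
}

Items {
    Stromzaehler_Stromzahler_EingangEnergie : strategy = everyChange, everyMinute
}
```

Only the items listed in this file are persisted by the InfluxDB service; everything else keeps following the rrd4j configuration.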

Dear all, thank you for your answers.
I didn't have the time until now, but I will try it next week.
Two short questions:

  • How do I deactivate rrd4j for single items when storing them in InfluxDB?
  • Can I still use the built-in openHAB charts (like the one shown in the second image of the first post) with items from InfluxDB?

Thank you :blush:

I installed InfluxDB and Grafana using the built-in routine in the openHABian menu.
But I am not able to open the InfluxDB web UI,
even though it is enabled in the InfluxDB config.
Using 192.168.0.xxx:8086 gives me a 404 page not found.
The Grafana UI opens fine in the browser.

Is this a known issue?

Is InfluxDB already started? Try the following command via PuTTY:

sudo systemctl status influxdb

If it is not started yet, start it with

sudo systemctl start influxdb

And to ensure it starts automatically at boot:

sudo systemctl enable influxdb

Yes, of course. openHAB works perfectly together with InfluxDB. (Although I prefer Grafana, as it is a more powerful application with many more possibilities.)

Yes, the status is active (running).
I don't know what is wrong. I also tried port :8083.

This is what's in influxdb.conf.
Sorry for the screenshot, I am on my phone.

I'm not sure which version openHABian installs. Version 1.x has no UI; you need 2.x for the UI.
I think there are some requirements for InfluxDB 2.x, a 64-bit OS for example.
With 1.x you can also manually add data, but I think it's only possible line by line and not via CSV.

Please check, which version you installed.
(I have 2.x in a docker container on my NAS)

It seems openHABian installs a 1.x InfluxDB version:

xx@xx:/etc/influxdb $ influx -version
InfluxDB shell version: 1.8.10

:frowning: so no chance to use the web UI.
Is there a way to import CSV into InfluxDB via Grafana?

There is a way to bulk-import data into an InfluxDB 1.8 database as well.
But 1.x is totally different from 2.x, both in how you administer it and in how you access it from openHAB.
(In both cases, 1.x and 2.x, you cannot import CSV via Grafana. Grafana is only the application that presents the data from the InfluxDB in graphs and so on.)

How to import into 1.x is described here:
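In short, the InfluxDB 1.x CLI has an `-import` flag that reads a text file with a DDL section (database setup) and a DML section (line protocol points). A sketch with a placeholder database name and the sample values from earlier (timestamps in milliseconds):

```shell
# Build an import file in the format 'influx -import' expects.
# Database name and measurement are placeholders for this sketch.
cat > import.txt <<'EOF'
# DDL
CREATE DATABASE openhab

# DML
# CONTEXT-DATABASE: openhab
Stromzaehler_Stromzahler_EingangEnergie value=1234.5 1722466800000
Stromzaehler_Stromzahler_EingangEnergie value=1235.1 1722470400000
EOF

# Then run the import against the local 1.x server
# (commented out here, as it needs a running InfluxDB):
# influx -import -path=import.txt -precision=ms
```

The `# DDL` and `# DML` markers are part of the import file format, not ordinary comments.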