Important
Changing data directly in InfluxDB is risky. Only proceed after you have created a backup and successfully tested that you can restore from it in case something goes wrong.
0) Setup
The following applies to InfluxDB 1.8 (instead of the default RRD4j) as the persistence service for openHAB (in my case openHAB 3.4 on Docker). In addition, I used Chronograf 1.10.0 (on Docker); InfluxDB 1.8 also runs on Docker. Everything runs on a Raspberry Pi 3+ with Raspbian OS 10 (32-bit).
1) Task
Due to a logic change in one of my sensors, my energy consumption was stored as “average kW per 24 hours” in one year and as “total kWh per day” in another year, all in the same item. Not surprisingly, the graphs looked completely nonsensical (a logical break from one year to the next), so comparing years with one another made no sense at all.
What I wanted to do: harmonize the data by converting “average kW per 24 hours” into “total kWh per day”. The calculation behind it is trivial: new value = old value * 24 (hours).
2) What doesn’t work
If you just want to know how to do it, go straight to 4). If you want to learn what doesn’t work (also worthwhile), keep reading.
Use Flux code in Chronograf to manipulate data
I read here that I was not the only person with this task at hand. However, only after hours of reading and trial and error did I learn that the solution presented under the link above only works with InfluxDB 2.x. Flux (one of InfluxDB’s two query languages) cannot write data in InfluxDB 1.8.
Trying to run the following Flux code in Chronograf (which should, in theory, do what I wanted)…
from(bucket: "openhab_db/autogen")
|> range(start: -2y, stop: now())
|> filter(fn: (r) => r._measurement == "RainPastHour")
|> map(fn: (r) => ({ r with _value: r._value * 24 }))
|> to(bucket: "openhab_db/autogen")
… just led to a nice error saying `error calling function "to": function "to" is not implemented`.
So changing the data via this approach would require InfluxDB 2.x. Unfortunately, InfluxDB 2.x requires a 64-bit OS, which, as I learned, would mean re-setting up my entire system from scratch. And although upgrading from InfluxDB 1.8 to 2.0 looks doable, it would be a bit over the top just to change a bunch of values. So, back to square one.
Use an SQL-like query in Chronograf or in the terminal
Since Flux cannot write data in InfluxDB 1.8, I tried switching to InfluxQL (the second of InfluxDB’s two query languages).
However, running the following query, both in Chronograf and in the terminal…
UPDATE RainPastHour SET RainPastHour = RainPastHour * 24 WHERE time > '2022-01-11 08:00:00' AND time < '2022-12-11 12:00:00'
… only produced the error `error parsing query: found UPDATE, expected SELECT, DELETE, SHOW, CREATE, DROP, EXPLAIN, GRANT, REVOKE, ALTER, SET, KILL at line 1, char 1`. So even if my syntax had been correct (which it probably wasn’t), this could never work: `INSERT` and `UPDATE` are simply not part of InfluxQL. Maybe this was a naive thing to try, but since I found out what did work (see below), I did not spend more time on it.
4) What does work
4.1) How to delete data
Sometimes you just want to delete data. What I found easiest, via the terminal:
- `docker exec -it influxdb bash` to log into the InfluxDB container.
- `influx` to launch the InfluxDB CLI. Afterwards you can use `show databases` to check which databases are available.
- `use openhab_db` to run all subsequent commands against your openHAB database. Afterwards you can use `show measurements` to check which measurements (tables) are available.
- To delete an entire measurement (table), use `drop measurement <item name>`. Though, to be fair, I have not tried this, i.e. I do not know how openHAB behaves if you delete an entire measurement; I only ever deleted selected values.
- To delete selected values, use `DELETE FROM <item name> WHERE time < '1980-01-01 00:00:00'`, or alternatively `DELETE FROM warmwasserzaehler_leistung WHERE time > '2021-11-01' AND time < '2021-11-20 21:45:00'` if you’re going for a specific timeframe. Keep in mind that the database time (UTC) might differ from your local time, so test first.
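One caveat on the timeframe example above: InfluxDB stores and compares `time` in UTC, so a `WHERE` clause written in local wall-clock time deletes a shifted window. A small Python sketch for pre-converting the boundary (assuming Europe/Berlin as the local timezone; adjust to yours):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# The local wall-clock time you have in mind for the DELETE window
local = datetime(2021, 11, 20, 21, 45, 0, tzinfo=ZoneInfo("Europe/Berlin"))

# Convert to UTC, which is what InfluxDB compares 'time' against
utc = local.astimezone(ZoneInfo("UTC"))

# Format the boundary for an InfluxQL WHERE clause
print(f"DELETE FROM warmwasserzaehler_leistung WHERE time < '{utc:%Y-%m-%d %H:%M:%S}'")
# → DELETE FROM warmwasserzaehler_leistung WHERE time < '2021-11-20 20:45:00'
```

In November, Berlin is on CET (UTC+1), so the local 21:45 becomes 20:45 in the query.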
What I never fully understood is why the equivalent `DELETE FROM "openhab_db"."autogen"."RainPastHour" WHERE time < '2022-12-10 00:00:00'` (which I believe should be valid InfluxQL) did not work in Chronograf; it only produced `error parsing query: retention policy not supported at line 1, char 1`. The error message suggests that `DELETE` simply does not accept a measurement name qualified with a retention policy. But since I already had a working solution (via the terminal, see above), I could not be bothered to figure out what would have worked in Chronograf (though that would have been much nicer than a terminal box).
4.2) Inserting data (after potentially manipulating data)
This one is tricky, because (as with deleting data) there are theoretically four ways to do it.
- Use influxQL (in Chronograf)
- Use Flux (in Chronograf)
- Use terminal (see above)
- Use the “write data” GUI-function (in Chronograf)
#1 (InfluxQL in Chronograf) and #2 (Flux in Chronograf) can be ruled out, since InfluxDB 1.8 does not appear to support writing data via either path. Which leaves #3 and #4.
#3 Use terminal
This one is actually pretty neat (though it has its limits, see below).
- `docker exec -it influxdb bash` to log into the InfluxDB container.
- `influx` to launch the InfluxDB CLI. Afterwards you can use `show databases` to check which databases are available.
- `use openhab_db` to run all subsequent commands against your openHAB database. Afterwards you can use `show measurements` to check which measurements (tables) are available.
- `insert <item name> value=2.0 1638301257000000000`, where `value` is the value you want to write and the 19-digit Unix timestamp is the time for which you want to write it. Use an epoch converter to translate a date/time into the 19-digit timestamp (pad with zeros to reach nanosecond precision).
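The 19-digit timestamp can also be computed without an online converter. A minimal Python sketch (interpreting the date as UTC):

```python
from datetime import datetime, timezone

def to_influx_ns(dt: datetime) -> int:
    """Convert an aware datetime to InfluxDB's 19-digit nanosecond epoch."""
    return int(dt.timestamp()) * 1_000_000_000  # seconds → nanoseconds

ts = to_influx_ns(datetime(2021, 11, 30, 19, 40, 57, tzinfo=timezone.utc))
print(f"insert <item name> value=2.0 {ts}")
# → insert <item name> value=2.0 1638301257000000000
```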
The problem with this method (as in my case): if you want to manipulate a large number of values, it gets pretty cumbersome.
So this leads to the last option:
#4 Use the “write data” function in Chronograf
This one is also pretty neat. In Chronograf, if you click on Explore and then Write Data, a window opens that expects the same command as the terminal, except that you don’t need `insert` (e.g. `<item name> value=2.0 1638301257000000000`). Plus, it allows you to enter multiple lines below each other, which is really handy if you want to write several values at once.
Press Write and you’re done.
However, if you want to write a large number of values, this also gets cumbersome, since you have to construct every string manually.
What I did instead (and what I’d also recommend if you want to change large amounts of data):
1. Extract all the data you want to manipulate into a .csv by adjusting the following Flux code to your needs, running it, and downloading the result as a .csv:
from(bucket: "openhab_db/autogen")
|> range(start: 2020-01-01T06:00:00Z, stop: 2022-12-30T09:00:00Z)
|> filter(fn: (r) => r._measurement == "warmwasserzaehler_leistung")
2. In Excel, do all your value manipulations (in my case: `old value * 24`).
3. Re-create your new “write string” via Excel formulas, so that it matches what the Write Data function expects (`<item name> value=2.0 1638301257000000000`). The useful formulas:
   - Extract Excel date and time stamps from InfluxDB’s RFC3339 timestamp via `=LEFT()` and `=RIGHT()`, then apply `=DATEVALUE()` and `=TIMEVALUE()` so that Excel recognizes the result as a date and time.
   - Convert the Excel timestamp to a Unix nanosecond timestamp via `=(B3-DATE(1970,1,1))*86400000000000`.
   - Concatenate the results (cells and strings) via `&` and `"xxx"`, so that each line reads `item_abc value=2.0 1638301257000000000`.
4. Save the result as a .csv (one string per line).
5. Upload it via Write Data and you’re done.
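If you would rather skip Excel entirely, the same steps can be scripted. A Python sketch under a few assumptions: the exported .csv has `time` (RFC3339) and `value` columns (Chronograf exports may name them differently, so check your file), and the measurement name is a placeholder:

```python
import csv
import io
from datetime import datetime

MEASUREMENT = "warmwasserzaehler_leistung"  # adjust to your item name

def rfc3339_to_ns(ts: str) -> int:
    """Convert an RFC3339 timestamp (e.g. '2021-11-30T19:40:57Z') to nanoseconds."""
    dt = datetime.fromisoformat(ts.replace("Z", "+00:00"))
    return int(dt.timestamp()) * 1_000_000_000

def make_write_lines(rows):
    """Turn (time, value) rows into Chronograf 'Write Data' strings, value * 24."""
    lines = []
    for row in rows:
        ns = rfc3339_to_ns(row["time"])
        new_value = float(row["value"]) * 24  # average kW over 24 h → kWh per day
        lines.append(f"{MEASUREMENT} value={new_value} {ns}")
    return lines

# Demo with an inline sample; in practice use open("export.csv", newline="")
sample = io.StringIO("time,value\n2021-11-30T19:40:57Z,2.0\n")
for line in make_write_lines(csv.DictReader(sample)):
    print(line)
```

Paste the printed lines into Chronograf’s Write Data window (or save them as a .csv to upload).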
Closing remarks
I am pretty sure there are neater ways to do what I wanted. However, after hours of reading, this was the only approach that got the job done for me.
If I missed something, let me know.