Migrate your existing persistence data to InfluxDB


If you start using InfluxDB to store your data (for example to get great charts with Grafana), you might want to migrate your old data from the “old” persistence service (like rrd4j) to the new database.
Otherwise your new charts would only start from the time you began using InfluxDB.

So here is a little bash script which reads the available data via the REST API and imports it into InfluxDB.

This should be compatible with openHAB v2.5.5+.
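In essence, the script fetches an item's history as JSON from the persistence REST endpoint and converts each time/state pair into InfluxDB line protocol before POSTing it to the `/write` endpoint. A rough sketch of that conversion step (the item name `Temp` and the single-field layout are illustrative assumptions, not the script's exact output):

```shell
#!/bin/bash
# Sample of the JSON the persistence REST API returns (shortened).
json='{"name":"Temp","datapoints":"2","data":[{"time":1591784640000,"state":"53.069"},{"time":1591784700000,"state":"53.5"}]}'

# Convert each {"time":...,"state":"..."} pair to InfluxDB line protocol.
# REST timestamps are in milliseconds; line protocol defaults to nanoseconds,
# so six zeros are appended.
lines=$(echo "$json" \
  | grep -o '{"time":[0-9]*,"state":"[^"]*"}' \
  | sed -E 's/\{"time":([0-9]+),"state":"([^"]+)"\}/Temp value=\2 \1000000/')

echo "$lines"
# The result could then be written with something like (hypothetical host/db):
# curl -i -X POST "http://influxserver:8086/write?db=openhab" --data-binary "$lines"
```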


@ThomDietrich, should this be linked to from the OP for InfluxDB+Grafana Tutorial? More people may want to switch if they know they now have an easy way to migrate their data out of whatever they are using now and into InfluxDB.

Good call, I would maybe have missed it.

@christoph_wempe this is a great help. I’ve linked the tool in the InfluxDB+Grafana article: InfluxDB+Grafana persistence and graphing

Fantastic. Fear of losing old data had made me stick to rrd4j.

The other thing that stops me moving is concern that InfluxDB databases may spiral to enormous size - I used to have that with db4o before I moved everything to rrd4j. Is there a neat way of purging old records from InfluxDB?

thanks

Dan

I think you could write a script and use InfluxQL’s `DELETE` statement to delete entries older than X.

But I cannot imagine database size will be a problem.
Then again, I have no experience in this topic. :slight_smile:
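For reference, such a cleanup could be a single InfluxQL statement; a sketch, where the measurement name and cutoff are placeholders:

```sql
-- delete all points of one measurement older than one year
DELETE FROM "LoungeTemperature" WHERE time < now() - 365d
```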

You can configure a retention policy that will automatically purge data as it ages.
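Creating one is a single InfluxQL statement; a sketch, assuming the database is called "openhab" (adjust the name and duration to taste):

```sql
-- keep data for 52 weeks, then let InfluxDB purge it automatically
CREATE RETENTION POLICY "one_year" ON "openhab" DURATION 52w REPLICATION 1 DEFAULT
```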

Also, InfluxDB does a MUCH better job of handling large databases. And in practice I see between 1-6 MB per Item per year, which isn’t that big of a deal size-wise.

And, unlike rrd4j, InfluxDB has a nice way to interact with and delete data as needed through its REST API.

And records can be deleted through OH’s REST API as well.

Setting up a retention policy and a discussion of database sizes are included in the tutorial link Christoph provided.

Awesome - thank you!

It looks like the script is not working with openHAB 2.1. :disappointed:

The REST interface now returns a single line of data rather than multiple lines.

I will look into it tomorrow. :sleeping:

The latest version of the script supports switch items.
The previous one did not, because ON/OFF had to be translated to 1/0.
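The translation described here could look something like this (a sketch, not the script's actual code):

```shell
#!/bin/bash
# Map openHAB switch states to numeric values InfluxDB can store.
state="ON"
case "$state" in
  ON)  value=1 ;;
  OFF) value=0 ;;
  *)   value="$state" ;;  # numeric states pass through unchanged
esac
echo "$value"
```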

I also created a branch for openhab 2.1 compatibility.

Unfortunately my RaspberryPi is really overwhelmed by the script, so testing is not as easy as it could be. :slightly_frowning_face:

I will not spend time creating a script that is compatible with both 2.0 and 2.1!
This should be easy, but I don’t have an OH 2.0 system to test with anymore.

Great CWempe! Thanks for the contribution and the update for OH 2.1. I will certainly use this, as a) I enjoy Grafana/InfluxDB very much and b) I am a data geek. But first I need to get some other stuff going in openHAB.

Thanks a lot for the script - with its help I’ve managed to migrate all my data from rrd4j to InfluxDB in OH 2.1. A couple of small changes make the script work as expected:

  1. All of the date formats need to be changed to:
    tenyearsago=`date +"%Y-%m-%dT%H:%M:%S.%3NZ" --date="10 years ago"`
    And so on for the others. Note the uppercase %3NZ instead of %:z at the end. This might not be the only way to get the formatting working, but it works for me.

  2. Also there is a typo in the curl line for the period one week through to one day ($onwdayago should be $onedayago)

The script output includes pairs of lines with a 100 Continue response followed by 204 No Content, but this doesn’t seem to be important. As far as I can tell, the 204 response is just what you get at the end of each file.
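To sanity-check the corrected format string, the following sketch prints a timestamp in that layout. One caveat (my assumption, not from the post): since `Z` denotes UTC, it is safest to add `-u` so the printed time really is UTC; `%3N` is a GNU `date` extension.

```shell
#!/bin/bash
# ISO-8601 UTC timestamp with milliseconds, e.g. 2010-05-18T14:48:06.123Z
tenyearsago=$(date -u +"%Y-%m-%dT%H:%M:%S.%3NZ" --date="10 years ago")
echo "$tenyearsago"
```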

Reviving an old thread to ask if there has been support added for 2.2 or 2.3? Or does the 2.1 version work on those versions as well? I’m on v2.3 and looking to make the jump to Influx.

Thanks,

Danny

The REST API hasn’t changed as far as I know so it should still work.

I’m using the script with openHAB 2.5.5.
I’m trying to migrate from jdbc:sqlite to InfluxDB.
This is the output of the script:

./rest2influxdb.sh LoungeSxDoorSensorTemperature

### timestamps
item: LoungeSxDoorSensorTemperature
10y:  2010-05-18T16:47:57+02:00
1y:   2019-04-20T16:47:57+02:00
1m:   2020-04-19T16:47:57+02:00
1w:   2020-05-11T16:48:57+02:00
1d:   2020-05-17T16:48:57+02:00
8h:   2020-05-18T08:48:57+02:00
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0     56      0  0:00:06  0:00:05  0:00:01    89
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0    169      0  0:00:02  0:00:01  0:00:01   169
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0   3520      0 --:--:-- --:--:-- --:--:--  3520
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  14083      0 --:--:-- --:--:-- --:--:-- 14083
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  37555      0 --:--:-- --:--:-- --:--:-- 37555
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  21125      0 --:--:-- --:--:-- --:--:-- 21125

### found values: 0
Warning: Couldn't read data from file "LoungeSxDoorSensorTemperature-*", this 
Warning: makes an empty POST.
HTTP/1.1 204 No Content
Content-Type: application/json
Request-Id: 9593c329-9916-11ea-8c88-000000000000
X-Influxdb-Build: OSS
X-Influxdb-Version: 1.6.4
X-Request-Id: 9593c329-9916-11ea-8c88-000000000000
Date: Mon, 18 May 2020 14:48:06 GMT

Sleep for 5 seconds to let InfluxDB process the data...

### delete temporary files

What am I doing wrong?

Thanks

Hi,

I’m trying to run the latest version of the script on openHABian with openHAB 2.5.5-1 (Release Build).

My config.cfg is:

#!/bin/bash
# openHAB server
openhabserver="rpi.local"
openhabport="8080"
serviceid="rrd4j"

# InfluxDB server
influxserver="rpi.local"
influxport="8086"
influxdatbase="openhab_db"
influxuser="openhab"
influxpw="somepass"
importsize=100
sleeptime=5

I get this:
$ ./rest2influxdb.sh RPi_Temp_CPU

### timestamps
item: RPi_Temp_CPU
10y:  2010-06-11T00:09:28%
1y:   2019-05-14T00:09:28%
1m:   2020-05-13T00:09:28%
1w:   2020-06-04T00:10:28%
1d:   2020-06-10T00:10:28%
8h:   2020-06-10T16:10:28%
./rest2influxdb.sh: line 60: unexpected EOF while looking for matching `''
./rest2influxdb.sh: line 83: syntax error: unexpected end of file

I tested the link manually through the REST API:
curl -X GET http://rpi.local:8080/rest/persistence/items/RPi_Temp_CPU?serviceId=rrd4j

It works fine:

{"name":"RPi_Temp_CPU","datapoints":"657","data":[{"time":1591784640000,"state":"53.069"}

{
  "name": "RPi_Temp_CPU",
  "datapoints": "657",
  "data": [
    {
      "time": 1591784640000,
      "state": "53.069"
    },

Can you spot the problem? I’ve run out of ideas!

sigmoidal

spotted this:
L:38 echo "resturl: $resturl"

There is a problem with the latest update. Check the latest pull request.

You are right, fixed.

Thanks

I’m also trying to move from rrd4j to InfluxDB.
When I try your script, it seems that my item data is empty.
Is it possible that it has something to do with the timezone? I guess I’ve got the +2 in the timeframe wrong?

I’m running openHAB 2.5.9 on Ubuntu 20

The config file:

#!/bin/bash

# openHAB server
openhabserver="192.168.0.10"
openhabport="8080"
serviceid="rrd4j"

# InfluxDB server
influxserver="192.168.0.18"
influxport="8086"
influxdatbase="openhab"
influxuser="admin"
influxpw="admin"
importsize=100
sleeptime=5

When I run it with an existing item:

./rest2influxdb.sh fluksoGasA

### timestamps
item: fluksoGasA
10y:  2010-10-11T12:48:00+02:00
1y:   2019-09-13T12:48:00+02:00
1m:   2020-09-12T12:48:00+02:00
1w:   2020-10-04T12:49:00+02:00
1d:   2020-10-10T12:49:00+02:00
8h:   2020-10-11T04:49:00+02:00
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  56333      0 --:--:-- --:--:-- --:--:-- 67600
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  67600      0 --:--:-- --:--:-- --:--:-- 67600
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  67600      0 --:--:-- --:--:-- --:--:-- 67600
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  84500      0 --:--:-- --:--:-- --:--:--  110k
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  84500      0 --:--:-- --:--:-- --:--:-- 84500
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0  84500      0 --:--:-- --:--:-- --:--:-- 84500

### found values: 0

### delete temporary files

When I open the rest api in a browser:

http://192.168.0.10:8080/rest/persistence/items/fluksoGasA

{"name":"fluksoGasA","datapoints":"349","data":[{"time":1602326880000,"state":"301.0"},{"time":1602327120000,"state":"301.0"},{"time":1602327360000,"state":"301.0"},{"time":1602327600000,"state":"301.0"},{"time":1602327840000,"state":"301.0"},{"time":1602328080000,"state":"301.0"},{"time":1602328320000,"state":"301.0"},{"time":1602328560000,"state":"301.0"},{"time":1602328800000,"state":"301.0"},{"time":1602329040000,"state":"301.0"},{"time":1602329280000,"state":"301.0"},{"time":1602329520000,"state":"301.73333333333335"},{"time":1602329760000,"state":"25907.507738095235"},{"time":1602330000000,"state":"18144.0"},{"time":1602330240000,"state":"18144.0"},{"time":1602330480000,"state":"18144.0"},{"time":1602330720000,"state":"18144.0"},{"time":1602330960000,"state":"9573.266666666666"},{"time":1602331200000,"state":"712.0"},{"time":1602331440000,"state":"712.0"},{"time":1602331680000,"state":"712.0"},{"time":1602331920000,"state":"712.0"},{"time":1602332160000,"state":"712.0"},{"time":1602332400000,"state":"712.0"},{"time":1602332640000,"state":"712.0"},{"time":1602332880000,"state":"712.0"},{"time":1602333120000,"state":"712.0"},{"time":1602333360000,"state":"35622.183333333334"},{"time":1602333600000,"state":"35769.229166666664"},{"time":1602334320000,"state":"18923.0"},{"time":1602334560000,"state":"29591.33863636364"},{"time":1602334800000,"state":"19000.0"},{"time":1602335040000,"state":"19000.0"},{"time":1602335280000,"state":"19000.0"},{"time":1602335520000,"state":"28950.55"},{"time":1602335760000,"state":"29409.0"},{"time":1602336000000,"state":"29409.0"},{"time":1602336240000,"state":"29409.0"},{"time":1602336480000,"state":"35824.183333333334"},{"time":1602336720000,"state":"47679.0"},{"time":1602336960000,"state":"47679.0"},{"time":1602337200000,"state":"47679.0"},{"time":1602337440000,"state":"29666.344444444443"},{"time":1602337680000,"state":"64492.0"},{"time":1602337920000,"state":"64492.0"},{"time":1602338160000,"state":"64492.0"},{"time":1602338400000,"state":"56264.17283950617"},{"time":1602338640000,"state":"47654.458333333336"},{"time":1602338880000,"state":"46705.0"},{"time":1602339120000,"state":"46705.0"},{"time":1602339360000,"state":"35259.75"},{"time":1602339600000,"state":"43576.17222222222"},{"time":1602339840000,"state":"37975.0"},{"time":160234

I guess it’s linked with the timezone format.

With the +2 (timezone)

http://192.168.0.10:8080/rest/persistence/items/fluksoGasA?serviceId=rrd4j&starttime=2020-10-11T05:18:53+2

{"error":{"message":"2020-10-11T05:18:53 2 is not in a valid format.","http-code":500,"exception":{"class":"java.lang.IllegalArgumentException","message":"2020-10-11T05:18:53 2 is not in a valid format.","localized-message":"2020-10-11T05:18:53 2 is not in a valid format.","cause":"java.time.format.DateTimeParseException"}}}

without the +2

http://192.168.0.10:8080/rest/persistence/items/fluksoGasA?serviceId=rrd4j&starttime=2020-10-11T05:18:53

{"name":"fluksoGasA","datapoints":"111","data":[{"time":1602386400000,"state":"17634.0"},{"time":1602386640000,"state":"17634.0"},{"time":1602386880000,"state":"22724.243537414965"},{"time":1602387120000,"state":"18306.74166666667"},{"time":1602387360000,"state":"17995.504166666666"},{"time":1602387600000,"state":"17904.44166666667"},{"time":1602387840000,"state":"17802.279166666667"},{"time":1602388080000,"state":"17732.433333333334"},{"time":1602388320000,"state":"17670.487906504066"},{"time":1602388560000,"state":"17610.80416666667"},{"time":1602388800000,"state":"17589.1"},{"time":1602389040000,"state":"17590.0"},{"time":1602389280000,"state":"17590.0"},{"time":1602389520000,"state":"17590.0"},{"time":1602389760000,"state":"2995.0"},{"time":1602390000000,"state":"910.0"},{"time":1602390240000,"state":"910.0"},{"time":1602390480000,"state":"910.0"},{"time":1602390720000,"state":"910.0"},{"time":1602390960000,"state":"21612.05576923077"},{"time":1602391200000,"state":"19635.0"},{"time":1602391440000,"state":"6663.0"},{"time":1602391680000,"state":"2339.0"},{"time":1602391920000,"state":"2339.0"},{"time":160

When I remove the %:z in all the time lines, the script seems to carry on.

e.g. from

tenyearsago=`date +"%Y-%m-%dT%H:%M:%S%:z" --date="10 years ago"`

to

tenyearsago=`date +"%Y-%m-%dT%H:%M:%S" --date="10 years ago"`

But at first sight, the data isn’t complete/correct. :blush:
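A likely cause (my reading of the error above, not confirmed by the thread): a raw `+` in a URL query string is decoded as a space, which is exactly why the server saw "2020-10-11T05:18:53 2". Dropping %:z avoids the error but shifts the start times by your UTC offset. Instead, the `+` can be percent-encoded so the timezone survives; a sketch:

```shell
#!/bin/bash
# A raw '+' in a query string is decoded as a space by the server.
# Percent-encode it as %2B so the timezone offset arrives intact.
start="2020-10-11T05:18:53+02:00"
enc=${start//+/%2B}
echo "$enc"
# Or let curl do the encoding (hypothetical host/item):
# curl -G "http://192.168.0.10:8080/rest/persistence/items/fluksoGasA" \
#      --data-urlencode "serviceId=rrd4j" --data-urlencode "starttime=$start"
```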