Migrating InfluxDB Data to a new installation

openHAB data can be easily transferred to a new openHABian installation. Running sudo openhabian-config and using the built-in backup/restore function makes the migration simple. It does not back up or restore InfluxDB, though. In this article I will try to explain how that can be achieved.

Let’s say you managed to create a new openHABian SD card with all your openHAB config data transferred. Great! Let’s start there.

Create a backup of the data. I suggest first creating a folder where the data will be stored, then running the backup command:

mkdir backup_folder
# Create a backup in the portable format
influxd backup -portable /home/openhabian/backup_folder

https://docs.influxdata.com/influxdb/v1.8/administration/backup_and_restore/

Now you should have a lot of files in there. To transfer them to your computer, zip them:

zip -r backup_folder.zip backup_folder

Now transfer the file to your computer with scp. I suggest logging out of the SSH session and copying the file from your local machine:

scp openhabian@IP:/home/openhabian/backup_folder.zip ./

Now copy this file to the new openHABian system, then SSH in and unzip it there:

scp backup_folder.zip openhabian@IP:/home/openhabian/backup_folder.zip
# Unzip it
unzip backup_folder.zip

Now let's run

sudo openhabian-config

Install and configure InfluxDB there; it seems the easiest solution. It is important to create an admin user and a user for openHAB. In my case they were “admin” and “openhab”. If it creates a new database, that is not a problem.
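
If openhabian-config does not create these users for you, they can also be created by hand from the influx shell. A sketch with example passwords (adjust to your setup):

```sql
-- the admin user must exist first once authentication is enabled
CREATE USER "admin" WITH PASSWORD 'openhabian' WITH ALL PRIVILEGES
-- the regular user openHAB connects with
CREATE USER "openhab" WITH PASSWORD 'changeme'
```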

Now let’s login to the database:

influx -username "admin" -password "openhabian"
# check if the database is there:
show databases
# if openhab is there, delete it:
drop database openhab
exit

Now we almost have it! Just import our backup:

influxd restore -portable /home/openhabian/backup_folder

Almost there! We need to grant the rights to our user:

influx -username "admin" -password "openhabian"
grant all on "openhab" to "openhab"
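
If you want to double-check, the grants can be listed from the influx shell as well (the -execute flag runs a single query and exits):

```
influx -username "admin" -password "openhabian" -execute 'show grants for "openhab"'
```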

And that’s it!

I’m interested if anybody gets this far :wink:


I tried this and got pretty far… However, after the last command I get:

Warning: It is possible this error is due to not setting a database.
Please set a database with the command "use <database>".

Any idea?

I’m re-installing OH3 on a new SD card with openHABian

Old thread, but maybe I can get help.

I have set up a completely new openHAB 4.x system running on an RPi 5 to migrate my old production openHAB 2.5 system. Model and scripts are migrated. I have installed InfluxDB v1 on both and both are working in parallel. I plan to switch production from the 2.5 to the 4.x system shortly, using a script.

My question: besides the data migration shown here, I see that the schema used by openHAB 4.x has changed slightly since openHAB 2.5. In particular, every measurement shows a new series “item” besides the actual “value”.

Sample on openHAB 4.x system:

> select * from Wind_Fledermaus_Speed limit 10
name: Wind_Fledermaus_Speed
time                item                  value
----                ----                  -----
1732035471961000000 Wind_Fledermaus_Speed 3.555051856640912
1732035473258000000 Wind_Fledermaus_Speed 2.8439979615762536
1732035475262000000 Wind_Fledermaus_Speed 2.8431836438030644
1732035477264000000 Wind_Fledermaus_Speed 2.1302612902338356
1732035479269000000 Wind_Fledermaus_Speed 2.7212130889325823
1732035481274000000 Wind_Fledermaus_Speed 4.1709313366210825
1732035483277000000 Wind_Fledermaus_Speed 4.914560083322485
1732035485286000000 Wind_Fledermaus_Speed 6.753109089061174
1732035555536000000 Wind_Fledermaus_Speed 3.548262068842349
1732035557550000000 Wind_Fledermaus_Speed 4.261174352656674

Sample on openHAB 2.5 system:

> select * from Wind_Fledermaus_Speed limit 10
name: Wind_Fledermaus_Speed
time                value
----                -----
1721031450404000000 4.824608181408684
1721031452440000000 4.649645208258924
1721031454397000000 4.144424372525099
1721031456416000000 4.293801630347217
1721031458427000000 4.421749671212869
1721031460447000000 4.787358090345532
1721031462442000000 4.641240469913669
1721031464447000000 3.902642001350101
1721031466461000000 3.1523443521243752
1721031468458000000 3.1515993310662975

Is it necessary to inject the extra series during migration, or will openHAB 4.x accept measurements without the “item” series?

Just in case someone runs into the same challenge: I managed to transfer the data successfully. The difference between openHAB 2.5 and 4.2 is that the Influx series carry an additional tag “item” with the name of the item they represent.

What I did was scan influxdb.persist for a list of items that should be migrated, export each of these measurements (JSON), add the named tag, convert to the Influx line format, and finally import into the new database. I used a Python script that accesses the databases through their REST API.
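
The tag-adding step described above can be sketched like this; the function names and escaping are illustrative and not taken from the actual script:

```python
# Sketch: convert an exported point into InfluxDB line protocol,
# adding the "item" tag that openHAB 4.x writes alongside "value".

def escape(s: str) -> str:
    """Escape commas and spaces as the line protocol requires."""
    return s.replace(",", "\\,").replace(" ", "\\ ")

def to_line(measurement: str, value: float, ts_ns: int) -> str:
    """Build one record: <measurement>,item=<name> value=<v> <ns timestamp>."""
    name = escape(measurement)
    return f"{name},item={name} value={value} {ts_ns}"

line = to_line("Wind_Fledermaus_Speed", 3.555051856640912, 1732035471961000000)
print(line)
```

The resulting lines can then be batched and written to the target database.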

In case someone’s interested, let me know.

Hi!
I’m in the same situation.

I’m moving my setup from openHAB 2.5.12-1 to 4.3.1.
So in my case this means going from InfluxDB 1.8.5 to 1.11.8.

I tried to restore a backup without considering the extra item tag, but the process doesn’t seem to end well.

Backup command:
sudo influxd backup -portable /home/openhabian/backupsDS/myHAB4/InfluxDB/influx_data_dir_$(date +\%Y\%m\%d)

Restore command:
sudo influxd restore -portable -db openhab_db -newdb testdb /InfluxDB/influx_data_dir_20250114/

Error output:

2025/01/14 18:27:49 Restoring shard 144 live from backup 20250113T233002Z.s144.tar.gz
2025/01/14 18:27:49 Restoring shard 152 live from backup 20250113T233002Z.s152.tar.gz
2025/01/14 18:27:49 Restoring shard 162 live from backup 20250113T233002Z.s162.tar.gz
/- - - -/ (a lot of processed shards)
2025/01/14 18:27:55 Restoring shard 283 live from backup 20250113T233002Z.s283.tar.gz
2025/01/14 18:27:55 Restoring shard 72 live from backup 20250113T233002Z.s72.tar.gz
2025/01/14 18:27:55 Restoring shard 218 live from backup 20250113T233002Z.s218.tar.gz
2025/01/14 18:27:55 error updating shards: write tcp 127.0.0.1:44592->127.0.0.1:8088: write: connection reset by peer
restore: write tcp 127.0.0.1:44592->127.0.0.1:8088: write: connection reset by peer

What is your approach?

/Niklas

I initially used an influx Python library. It worked, but failed with the (huge) amount of data I had to transfer; it was far too slow. So I kept Python but used the REST API of InfluxDB. Here is the script I used.
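
For illustration, here is a minimal sketch of how such a REST write call can be built with only the Python standard library; host, port, credentials and database name are examples, not taken from the script:

```python
# Sketch: POST line-protocol records to the InfluxDB 1.x /write endpoint.
import urllib.parse
import urllib.request

def build_write_request(host: str, db: str, user: str, password: str,
                        lines: list[str]) -> urllib.request.Request:
    """Build a POST to /write with line-protocol records in the body."""
    params = urllib.parse.urlencode(
        {"db": db, "u": user, "p": password, "precision": "ns"})
    url = f"http://{host}:8086/write?{params}"
    body = "\n".join(lines).encode("utf-8")
    return urllib.request.Request(url, data=body, method="POST")

req = build_write_request(
    "localhost", "openhab_db", "admin", "openhabian",
    ["Wind_Fledermaus_Speed,item=Wind_Fledermaus_Speed value=3.5 1732035471961000000"])
# urllib.request.urlopen(req) would actually send the batch; not executed here.
print(req.full_url)
```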

migrate_influx.txt (7.2 KB)

Start by renaming it from .txt to .py (the upload here only allows a small number of extensions) and adjust the configuration at the top. It is not fully generalized, so you may need to make changes elsewhere too.

The script will first try to recognize the items you persist to Influx. NB: if you use wildcards, you need to change the code and probably hard-code a list of items. I did this because I had many series in Influx I no longer used, so I used the migration for cleanup.

Afterwards, the script loads the series one by one and writes them to the new database. I had openHAB 2.5 running on an RPi 4; openHAB 4.3 was installed from scratch on a new RPi 5 (recommended, it is so much faster).

I ran the Python script using

python3 migrate_influx.py

from my development machine (a Mac). I had a copy of the influxdb.persist file available on this machine. Otherwise you can run it from your old machine.
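
The influxdb.persist scan mentioned above can be sketched as follows; this assumes a typical Items { ... } block and, as noted, does not handle wildcard entries:

```python
# Sketch: extract item names from an influxdb.persist file.
import re

def scan_persist(text: str) -> list[str]:
    """Return item names listed in the Items { ... } block."""
    m = re.search(r"Items\s*\{(.*?)\}", text, re.DOTALL)
    if not m:
        return []
    items = []
    for line in m.group(1).splitlines():
        line = line.strip()
        if not line or line.startswith("//"):
            continue  # skip blanks and comments
        name = line.split(":")[0].strip().rstrip(",")
        if name and "*" not in name:  # wildcard entries are not handled
            items.append(name)
    return items

sample = """
Strategies {
    everyChange : "everyChange"
}
Items {
    Wind_Fledermaus_Speed : strategy = everyChange
    Temperature_Living : strategy = everyChange
}
"""
print(scan_persist(sample))
```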