InfluxDB+Grafana persistence and graphing


I’m experiencing an issue with non-working wildcards, e.g. “SpeedtestResult*”, on items in influxdb.persist. Can anyone reproduce this issue?

* isn’t a wildcard. From the docs:

<groupName>* - all members of this group will be persisted, but not the group itself. … Note that * is NOT a wildcard match character in this context.
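To illustrate, in a .persist file the asterisk is a literal “members of this group” marker, not a glob. A minimal sketch, where gSpeedtest is a hypothetical Group item that the Speedtest items belong to:

```
// influxdb.persist - minimal sketch; gSpeedtest is a hypothetical Group item
Strategies {
    everyMinute : "0 * * * * ?"
    default = everyChange
}

Items {
    // persists every member of gSpeedtest, but not the group itself
    gSpeedtest* : strategy = everyChange, everyMinute
}
```

So to persist all Speedtest items, add them to one Group and reference that group with the trailing asterisk, rather than pattern-matching on item names.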


Oh, huge misinterpretation of the asterisk symbol there, thanks for clarifying! RTFM is often the way to go… I spent two hours debugging the persistence “problem” where no data was saved :laughing:

Thank you!
Worked for OH 2.4 (stable) on a Raspberry Pi 3.

Any examples of how to use Grafana annotations within openHAB?

Have a look at this:

I followed the initial tutorial and am stuck here (cannot get Grafana to load). Any ideas?

Unfortunately not.

Which tutorial exactly? Use of update-rc.d is superseded in the latest Raspbian by the systemctl commands of systemd.

Enabling auto-start for instance, looks like:

$ sudo systemctl enable grafana-server

Hi,
I am running openHAB with InfluxDB and Grafana, all installed on the SD card of my Raspberry Pi according to the how-to in this thread.
I knew that this is problematic because of the write traffic on the SD card.
Luckily I made an image of my SD card once a month, because now my SD card seems corrupted.
So I am running from my backup now, and have lost some recorded data in the InfluxDB database.
However, I now think it is time to move the database from the SD card to an external SSD.
But I can’t find a good tutorial or step-by-step instructions on how to do it.
Any ideas how to do this?


If you install openHAB using the openHABian image, the built-in config tool will install InfluxDB and Grafana for you. The config tool also has an option to move everything to an SSD.

There’s lots of info on moving InfluxDB if you Google search it. Just look further up in this very thread for one example: InfluxDB+Grafana persistence and graphing

I don’t want to do the install again, just move the database to an SSD.
So I did the following (based on several different pages found via Google):

  1. I formatted an SSD on my Windows PC (MBR, not GPT) as NTFS.
  2. I plugged the SSD into the Raspberry Pi and found the device with “dmesg” (sda: sda1). I did not format it with “sudo mkfs.ext4 /dev/sda1” because I read that this is no longer necessary.
  3. I created a directory: “sudo mkdir /media/usb-platte”
  4. Mounted the drive there: “sudo mount /dev/sda1 /media/usb-platte”
  5. Found the UUID with “sudo blkid”, for example 5C24-1453.
  6. Opened “sudo nano /etc/fstab” and entered the line “UUID=5C24-1453 /media/usb-platte ntfs defaults,auto,users,rw,nofail,umask=000,x-systemd.device-timeout=30 0 0” for automounting.
  7. Rebooted my Raspberry Pi with “sudo reboot” and checked after the reboot with “sudo lsblk -o UUID,NAME,FSTYPE,SIZE,MOUNTPOINT,LABEL,MODEL”. Mount point still there - good.
  8. “sudo apt-get install ntfs-3g” to install the NTFS driver for write access (stock drivers give only read access to NTFS drives).
  9. “sudo chmod 777 /media/usb-platte” to give all users access.
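One thing the steps above do not cover: if the openHAB folders are reachable from Windows, that access most likely goes through Samba (openHABian sets this up by default), and Samba only exposes the shares defined in its config; chmod alone does not publish a new directory over the network. A hedged sketch of what an extra share for the SSD mount could look like (the share name and comment are made up; check it against your own /etc/samba/smb.conf):

```
# Hypothetical addition to /etc/samba/smb.conf; share name "usb-platte" is an assumption
[usb-platte]
   comment = External SSD
   path = /media/usb-platte
   browseable = yes
   writeable = yes
   create mask = 0664
   directory mask = 0775
   valid users = openhabian
```

After adding the share, restart Samba with “sudo systemctl restart smbd” so it is picked up.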

Now I try to access the new directory on my Raspberry Pi’s USB SSD from my Windows PC, like I do for the openHAB folders, but this does not work. I cannot access the folders.
I think I missed a step for the user rights, but I simply don’t know how to give my “openhabian” user full access to this new directory.

What am I doing wrong?
I am a noob, so sorry for the possibly stupid questions.

Oh no - now I have broken my user rights…
I entered sudo chmod 777 /media/usb-platte.
My intention was to give all users full rights on this directory, to test the access.
Now I can still log in to my Raspberry Pi via PuTTY with my openhabian user, but the network drives in my Windows Explorer cannot be opened any more.
–> I found out that the password for my “openhabian” user was reset to the default. So OK, I can handle this, and access to the openHAB folders is available again from my Windows PC.
But still no access to my SSD.

Thanks a lot!

Thanks a lot for this very helpful documentation.

I still don’t get a connection from openhab to write into influxdb:

I can connect using
influx -port 8086 -username openhab -password mypasswd -host localhost
(the same credentials I used when I created openhab_db).
My influx.cfg contains the same credentials.

I can see the db:

    InfluxDB shell version: 1.7.10
    > SHOW DATABASES
    name: databases
    name
    ----
    openhab_db

but I still get in my openhab.log:

2020-04-01 21:19:36.076 [ERROR] [org.influxdb.impl.BatchProcessor    ] - Batch could not be sent. Data will be lost
java.lang.RuntimeException: {"error":"authorization failed"}

Any idea?

EDIT: auth-enabled = true is set in influxdb.conf
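For what it’s worth, “authorization failed” with auth-enabled = true usually means the user either doesn’t exist in InfluxDB or lacks privileges on the database. A sketch of how to check and repair that from the influx shell, assuming you also have an admin user (names here are placeholders matching the earlier commands):

```
-- in the influx shell, logged in as an admin user
SHOW USERS                               -- "openhab" should be listed
SHOW GRANTS FOR "openhab"                -- should include openhab_db
GRANT ALL ON "openhab_db" TO "openhab"   -- repair the privileges if missing
```

Being able to run SHOW DATABASES as openhab proves authentication works, but the binding also needs write privileges on openhab_db, which the GRANT above provides.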

Should be “influxdb.cfg”

This is my influxdb.cfg. I haven’t touched it or changed it since I first started using it approx. two years ago:

# # The database URL, e.g. http://127.0.0.1:8086 or https://127.0.0.1:8084 .
# Defaults to: http://127.0.0.1:8086 
url=http://10.4.28.200:8086

# The name of the database user, e.g. openhab.
# Defaults to: openhab
user=username

# The password of the database user.
password=password

# The name of the database, e.g. openhab.
# Defaults to: openhab
db=openhab_db

# The retention policy to be used, needs to be configured in InfluxDB
# Till v0.13: 'default', since v1.0: 'autogen'
retentionPolicy=autogen
url=
user=username
password=password
db=openhab_db
retentionPolicy=autogen

Sorry, of course - it was a typo.

Thank you for your help.
username and password have double entries.

I forgot to say that I started from scratch and used to have this influxdb.cfg before.
I will try your other settings.
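For comparison, a de-duplicated influxdb.cfg with each key appearing exactly once might look like this (the URL, user, password and database name are placeholders to be replaced with your own values):

```
# influxdb.cfg - one entry per key; all values are placeholders
url=http://127.0.0.1:8086
user=openhab
password=mypasswd
db=openhab_db
retentionPolicy=autogen
```

Duplicate keys are a common source of confusion here, since whichever entry is read last can silently override the credentials you thought were in effect.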