Upload sensor data to an InfluxDB2 web installation (http Binding // Exec Binding // ...)

  • Platform information:
    • Hardware: Raspberry Pi 3 Model B Plus Rev 1.3
    • OS: openHABian
    • openHAB version: openHAB 3.3.0 - Release Build
  • Issue of the topic:

I am running an InfluxDB2 instance on a web server and am trying to push sensor data from two rooms to InfluxDB.

  • Temp-Sensor-Livingroom
  • Temp-Sensor-Outdoor

Both deliver temperature and humidity values.

Now I want to set up a rule which uploads the latest measurements every hour. I tried to figure out the best approach from multiple blogs and forum posts, but I wasn’t successful.

I tried the following approaches:

  1. http-binding
    Unfortunately, authentication in the http Binding does not support tokens, which InfluxDB2 requires. Basic auth is possible according to the InfluxDB documentation, but it does not seem to work (I cannot even load data with basic auth via Postman, while token-based authentication works easily).

  2. Persistence Service
    I tried the persistence service, which seems to work, but I do not like working in config files, and the auto-generated measurement names and tags are a bit messy - I prefer to have full control over the tags. Furthermore, I would like to set up the persistence solution separately so it does not get mixed up with my sensor measurements and calculations.

  3. Exec Binding
    What seems to work, according to multiple forum entries, are curl commands with the Exec Binding. I tried to set up the configuration according to Exec - Bindings | openHAB.
    I like the concept (full control, easy to maintain), but I am struggling to bring it to life.

I have a curl command which works in the shell, and I tested it with sudo -u openhab <YOUR COMMAND> as well. These tests were successful.
Next I tried to set up my use case according to the full example (Exec - Bindings | openHAB) - but within the web UI.
The curl command would be
curl --request POST "https://MYURL.de/api/v2/write?org=MY_DB&bucket=myBUCKET&precision=s" --header "Authorization: Token xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" --header "Content-Type: text/plain; charset=utf-8" --header "Accept: application/json" --data-binary "test,room=Wohnzimmer temp=20.5,hum=45"
As I understand it, I need to split it into two parts: the static part goes into the Thing (and the whitelist), whereas the dynamic part goes into the .sendCommand part for the Item.
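As a sanity check for this split, here is a small shell sketch (not from the binding itself; the token is redacted and the values are the hard-coded test data from above) of how the final command should be assembled: the Thing’s command ends in the %2$s placeholder, which the binding replaces with whatever is sent to the Input item:

```shell
# Static part, configured in the Exec Thing (everything up to %2$s; token redacted):
static='curl --request POST "https://MYURL.de/api/v2/write?org=MY_DB&bucket=myBUCKET&precision=s" --header "Authorization: Token XXX" --header "Content-Type: text/plain; charset=utf-8" --header "Accept: application/json"'

# Dynamic part, sent to the linked Input item via sendCommand:
input='--data-binary "test,room=Wohnzimmer temp=99.5,hum=99"'

# The binding substitutes %2$s with the Input item's command,
# so the executed command is effectively:
full="$static $input"
echo "$full"
```

If that assembled line matches the command that already works in the shell (and the whitelist entry), the split itself should be correct.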


UID: exec:command:4a6d64e9dc
label: Exec - InfluxDB Data Push
thingTypeUID: exec:command
configuration:
  transform: REGEX((.*))
  interval: 0
  autorun: true
  command: 'curl --request POST
    "https://MYURL.de/api/v2/write?org=MY_DB&bucket=myBUCKET&precision=s"
    --header "Authorization: Token xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    --header "Content-Type: text/plain; charset=utf-8" --header "Accept:
    application/json" %2$s'
  timeout: 15

The string behind command is copied into the whitelist file as well (without the surrounding quotes, of course).
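For reference, on openHABian the whitelist file lives at /etc/openhab/misc/exec.whitelist (assuming the default paths), and the entry has to match the Thing’s command character-for-character, including the %2$s placeholder:

```
curl --request POST "https://MYURL.de/api/v2/write?org=MY_DB&bucket=myBUCKET&precision=s" --header "Authorization: Token xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" --header "Content-Type: text/plain; charset=utf-8" --header "Accept: application/json" %2$s
```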

There is an Item of type String with the name ExecInfluxDBDataPush_Eingabewert linked to the Thing’s “Input” channel.

I then created a dummy switch and a rule which fires once the dummy switch is pressed (similar to the example).


configuration: {}
triggers:
  - id: "1"
    configuration:
      itemName: Dummy_Sensorpush_Temp_Hum
      command: ON
    type: core.ItemCommandTrigger
conditions: []
actions:
  - inputs: {}
    id: "2"
    configuration:
      itemName: ExecInfluxDBDataPush_Eingabewert
      command: --data-binary "test,room=Wohnzimmer temp=99.5,hum=99"
    type: core.ItemCommandAction

This is a first test to see if I can send hardcoded measurement values. Afterwards I am going to set up a proper rule with the real values.

And that is where I am stuck: when I press the button, the REST call does not seem to work. I can see a log entry:

2023-01-19 16:54:25.878 [INFO ] [openhab.event.ItemCommandEvent ] - Item 'ExecInfluxDBDataPush_Eingabewert' received command --data-binary "test,room=Wohnzimmer temp=99.5,hum=99"

But InfluxDB does not receive any data.

My questions are:

  • What am I doing wrong? What can I do to debug my issue?
  • Is there another binding to easily connect to InfluxDB2?

Best regards

The easy way is to use persistence. If you don’t want to edit the .persist file every time you add new Items, you can specify a Group Item and add all Items that should be persisted to that group.
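As a sketch (the Item names and labels here are made up for illustration), the group and its members could look like this in an .items file:

```
Group Persistence_EveryHour

Number Temp_Livingroom "Livingroom temperature [%.1f °C]" (Persistence_EveryHour)
Number Temp_Outdoor    "Outdoor temperature [%.1f °C]"    (Persistence_EveryHour)
```

Every Item added to Persistence_EveryHour is then picked up by the Persistence_EveryHour* line in the .persist file below, with no further edits to that file.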

This is my influxdb.persist file, I haven’t touched it for over a year:

Strategies {
        everyHour : "0 0 * * * ?"
        default = everyChange
}

Items {
        Persistence_EveryChange*, Presence_Group : strategy = everyChange
        Persistence_EveryUpdate* : strategy = everyUpdate
        Persistence_EveryHour* : strategy = everyHour
}
As for the measurement names etc., I make that more user-friendly when presenting the data (using Grafana).

So in that case I could create two persistence files: one for system persistence and one for datalogging:

File 1: rrd4j.persist

Strategies {
  everyHour : "0 0 * * * ?"
  default = everyChange
}

Items {
    * : strategy = everyUpdate, everyChange, everyHour, restoreOnStartup
}

And File 2: influxdb.persist

Items {
    GP_Datalog_InfluxDB* : strategy = everyUpdate, everyChange, everyHour
}

Then I drop all temperature sensors into the group GP_Datalog_InfluxDB.

Would that work?
And is there a risk in setting up the system persistence the way I listed above, with * blindly fetching all Items?

I think you need to define the strategies in the influxdb.persist file as well, but apart from that it should work.

Persisting all Items like that shouldn’t pose any risk apart from wasted space, and since rrd4j is designed to keep the database size from growing excessively, that shouldn’t be a concern unless you have a huge number of Items.

A thing to consider, though, is whether you want all Items to be restored on startup, since some rules might run because of this, using outdated values if there has been a long outage. It may be better not to restore some Items and instead wait for the sensors to send up-to-date info. YMMV, of course.

I just want to give some short feedback: the proposed solution works just fine. I am now storing the values in InfluxDB2 and visualizing them via Grafana.

Thank you!
