Updated InfluxDB binding with tags

Create a new database for the new binding…
Could your Java program ask for the new tags for each series it finds in the old database? I.e. prompt for the tags to assign to each series?
It could also check whether a series has had data written into it in the last few days, and skip importing it into the new database if nothing has arrived in that period.

As for tags, I wonder if this kind of metadata is being considered for OH3, where each item/thing would have a “room” or “floor” tag field…

As a newbie who started logging openHAB data into InfluxDB only two weeks ago, I would sacrifice my existing data in a heartbeat to get tag support. Is there a .jar flying around anywhere that I could “borrow”?

2 Likes

Any updates on this?

Great work @dominikkv !

Have you thought about adding a way to specify the retention policy per item?
I think I’m not the only one with items that vary very frequently and are only kept for a few days before being downsampled.
I currently use a rule and a direct HTTP POST to InfluxDB, but, as with tags, your update would make this a lot cleaner!

I was really happy to see there’s finally a way to work with tags directly from the persistence binding. I am currently using TICKscripts to add them as the data comes into InfluxDB, but that’s quite cumbersome.

I would like to test this, but I am not experienced enough to compile the addon myself.
Is there a .jar file I could download somewhere to test?

Also: will this become part of v2.5 final?

Hey all, sorry for not responding :-/ I am now the father of a lovely boy, so there is not much time left for smart home anymore.

@christoph_wempe : I’ll improve the readme with your suggestions.
@greg: I understand your wish, but I don’t think I will implement an automated migration mechanism. It is complicated and error-prone, and in the end every user wants it done a different way. Sorry :frowning:
@StephN: You can already specify a retention policy for all items in the config. A per-item config is a bit more difficult: saving the states in another RP would be possible, but when querying the database there is no information about the item, so OpenHAB charts and query functions like <item>.deltaSince(AbstractInstant) would no longer work for measurements outside the global RP. Maybe a solution for you is to save the data in the global RP in the first place and later transfer it to another RP with a continuous query?
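Just as a rough sketch of what I mean, in plain InfluxQL (assuming InfluxDB 1.x, the default openhab_db database with its autogen RP, and a made-up one_year RP and Power item - adjust the names to your own setup; value is the field the persistence writes by default):

 -- optionally keep the raw, fast-changing values only for a few days
 ALTER RETENTION POLICY "autogen" ON "openhab_db" DURATION 7d

 -- a second, longer retention policy next to the global one
 CREATE RETENTION POLICY "one_year" ON "openhab_db" DURATION 365d REPLICATION 1

 -- continuously downsample the item from the global RP into the new one
 CREATE CONTINUOUS QUERY "cq_power_5m" ON "openhab_db"
 BEGIN
   SELECT mean("value") AS "value"
   INTO "openhab_db"."one_year"."Power_5m"
   FROM "openhab_db"."autogen"."Power"
   GROUP BY time(5m)
 END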
@bastian, @klaernie: here you go.

Dominik

1 Like

@dominikkv - thanks a lot, will try it out on the weekend. And all the best for your boy - congratulations to the three of you!

Was this code finally merged, or is a PR in progress?
I want to do some personal experimentation with InfluxDB 2.0, and when looking for the current code I only found it in the openHAB 1.x add-ons. Is this the only work towards a 2.x version, or am I missing something?

Hey Joan @lujop, I have not yet created a PR, but that’s the next step. Today I added the other example to the documentation.

You are right, the current InfluxDB binding is an OpenHAB 1.x binding (which works perfectly fine with OpenHAB 2.x). I have ported it to OpenHAB 2.x, adding other features like additional tags.

PR is coming soon, stay tuned :slight_smile:

Dominik

Is it feasible to deliver the update with OpenHAB 2.5? Having tags would add a lot of value to the InfluxDB persistence binding. I am happy to help by testing things with my dockerized OpenHAB/InfluxDB environment.

Thanks @dominikkv for your work! I finally got around to installing it just now and it works really nicely.

For anybody searching for a migration strategy for the “replace _ with .” problem: I put together a simple bash one-liner to do this:

 for olditem in $( influx -database openhab_db -execute 'show measurements' | grep -E '[_]' ); do echo "select * into \"${olditem//_/.}\" from ${olditem}"; done

I pasted the output back into influx, double-checked that the new measurements were there, and then did the same with

for olditem in $( influx -database openhab_db -execute 'show measurements' | grep -E '[_]' ); do echo "drop measurement ${olditem}"; done
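In case it helps, the double-checking itself was nothing more than plain InfluxQL along these lines (My_Item / My.Item are just placeholders for one of your measurements, and value is the field the persistence writes):

 -- the renamed measurement should exist and hold as many points as the old one
 SHOW MEASUREMENTS ON "openhab_db"
 SELECT COUNT("value") FROM "openhab_db"."autogen"."My_Item"
 SELECT COUNT("value") FROM "openhab_db"."autogen"."My.Item"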
1 Like

Okay, after running it for a few days I noticed one problem: somehow .averageSince() doesn’t work anymore.

This debug rule:

rule "debug output"
when
        Time cron "0 * * * * ?"
then
        logInfo("debug",PresenceCountdown.averageSince(now.minusMinutes(15)))
end

produces only this log output:

 2019-12-02 18:13:00.002 [DEBUG] [ersistence.influxdb.internal.InfluxDBPersistenceService] - Got a query for historic points!
 2019-12-02 18:13:00.002 [DEBUG] [ersistence.influxdb.internal.InfluxDBPersistenceService] - Query: SELECT * FROM "openhab_db"."autogen"./.*/ WHERE item = 'PresenceCountdown' AND time > 1575305880s  LIMIT 2147483647
 2019-12-02 18:13:00.028 [DEBUG] [ersistence.influxdb.internal.InfluxDBPersistenceService] - Returning query() with 985 items
 2019-12-02 18:13:00.030 [ERROR] [thome.model.rule.runtime.internal.engine.ExecuteRuleJob] - Error during the execution of rule 'debug output': An error occurred during the script execution: Could not invoke method: org.eclipse.smarthome.model.script.actions.LogAction.logInfo(java.lang.String,java.lang.String,java.lang.Object[]) on instance: null

but when I ask the persistence over HTTP I get meaningful data:

kandre@mainframe(pts/7) ~ % curl 'http://openhab.ak-online.be/rest/persistence/items/PresenceCountdown?starttime=2019-12-02T18:18:00'
{
  "name":"PresenceCountdown",
  "datapoints":"16",
  "data":[
     {"time":1575307080004,"state":"1798.0"},
     {"time":1575307080060,"state":"1798.0"},
     {"time":1575307081003,"state":"1797.0"},
     {"time":1575307082058,"state":"1796.0"},
     {"time":1575307083003,"state":"1795.0"},
     {"time":1575307084002,"state":"1794.0"},
     {"time":1575307085004,"state":"1793.0"},
     {"time":1575307086003,"state":"1792.0"},
     {"time":1575307087002,"state":"1791.0"},
     {"time":1575307088003,"state":"1790.0"},
     {"time":1575307089003,"state":"1789.0"},
     {"time":1575307090003,"state":"1788.0"},
     {"time":1575307091003,"state":"1787.0"},
     {"time":1575307092077,"state":"1786.0"},
     {"time":1575307093003,"state":"1785.0"},
     {"time":1575307094003,"state":"1784.0"}
  ]
}

Does anyone have an idea if this is caused by the InfluxDB binding or where I could find more debugging information?

Hey @klaernie,

this is an exception thrown in your rule. Your log says that the InfluxDB binding has provided 985 data points, but that your logInfo() call fails: logInfo() expects a String message, while averageSince() returns a state object. Try something like this:

logInfo("debug", "Average: {}", PresenceCountdown.averageSince(now.minusMinutes(15)));

Thanks for your rename script!

@martingruening, I don’t think it’ll make it into 2.5, as M6 has already arrived, which will be the last milestone. But you can use the jar I have provided.

@all: Good news - PR is ready: https://github.com/openhab/openhab2-addons/pull/6507

Thanks for your feedback, and sorry again that it took so long :slight_smile:

Dominik

2 Likes

@dominikkv Could you change the way Color items are written? I just tried to graph an RGB lamp’s brightness, and it took me a while to find out why Grafana didn’t want to deliver data for it :wink:

From my user perspective I would expect the binding to write three or six values into InfluxDB:

  • either the HSB values into fields (H, S, B),
  • or the RGB values into fields (R, G, B),
  • or possibly both together.

Maybe, to keep backwards compatibility, the value field should still be written the way it currently is.

Personally I’d like the HSB approach best, as openHAB uses it internally, so the conversion is lossless and it supports the use case “show me a graph of the lamp’s brightness”.
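Just to illustrate (purely hypothetical - this is not how the binding writes Color items today): with separate numeric fields, that brightness graph would be a trivial InfluxQL query, where RGBLamp and brightness are made-up names:

 -- only possible if the binding wrote a numeric brightness field
 SELECT "brightness" FROM "openhab_db"."autogen"."RGBLamp" WHERE time > now() - 24h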

Otherwise I can report that everything is still working fine, once I discovered that during the 2.5 upgrade the 1.14 bundle had been activated and was messing with my setup.

Well, it wasn’t the weekend that brought time to try out your new add-on version; that only happened just now.
Having installed it and configured my items, I can confirm that everything works as expected, based on your documentation.
I am sending items into InfluxDB, they get mapped to measurements based on the group they belong to, and tags are added based on the per-item definitions.

@dominikkv: Thank you very much for making this possible! My InfluxDB setup is becoming much clearer and more usable through this.
If you don’t hear anything more from me, that means I am a happy user who did not find any problems with the add-on.
And all the best for 2020!

@dominikkv Will your version of the Persistence service work with InfluxDB 2.0?

No, it doesn’t.
The protocol and drivers for 2.0 are different.
I started on a module with the 2.0 APIs (look at this message).
But it is very preliminary work that has been paused for the last few months.
If I get some guidance and support I will try to continue with it.

1 Like

@dominikkv: sadly your PR was closed by Kai Kreuzer. What does that actually mean? More work to be done on your side? :(

Also, it would be cool if one could enable “auto-tagging” based on the current timestamp, i.e. a tag for the current month, year and maybe week number. Otherwise it’s almost impossible to perform grouping with InfluxDB in Grafana, e.g. to graph the power consumption sum on a weekly, monthly or yearly basis.

You can group your data by time without tags.

I do it this way.
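Roughly like this, either in the Grafana query editor or as plain InfluxQL (just a sketch - openhab_db, autogen, PowerConsumption and the value field are placeholders for your own names):

 -- weekly totals over the last year, no extra tags needed
 SELECT sum("value") FROM "openhab_db"."autogen"."PowerConsumption"
 WHERE time > now() - 365d
 GROUP BY time(1w)

For month or year buckets InfluxQL only knows fixed durations like time(30d), so those groupings are approximate, but for most dashboards that is good enough.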

A new version of the add-on has been integrated into the master branch and will be available in OH 3.0.
It was originally a new add-on for InfluxDB 2.0, but in the end it was refactored into a rewrite of the previous add-on that supports both versions and incorporates @dominikkv’s work with tags.

2 Likes