InfluxDB 2 / openHAB 3.1 M5 - deltaSince (start of day) not working when database rows lack the item name (data from openHAB 2.5)

Dear Community,

at the moment I am upgrading from openHAB 2.5 / InfluxDB 1 to openHAB 3.1 / InfluxDB 2. It is working very well, but I have trouble with my rules.

To track the daily consumption, I calculate item values based on figures from the InfluxDB 2 database.

Since deltaSince(now.withStartofDay) no longer works, I tried to adjust to the new behavior. As nothing was working, I logged several values to narrow the problem down. In the end I want this line to run:

   postUpdate(Lesekopf3Channel0.deltaSince(now().with(LocalTime.MIDNIGHT)) as Number) 

These are the tests I did to understand the problem. averageSince works, but deltaSince does not.

   logInfo("Tageswerte", "Test 1 {}", start_of_day)
   logInfo("Tageswerte", "Test 2 average {}", FRITZDECT2001Energy.averageSince(start_of_day, "influxdb"))
   logInfo("Tageswerte", "Test 3 Zeit-Min {}",
   logInfo("Tageswerte", "Test 4 Zeit-Midnight {}",
   logInfo("Tageswerte", "Test 5 Zeit-Now {} wh",
   logInfo("Tageswerte", "Test 6 average {} wh", Lesekopf3Channel0.averageSince(start_of_day, "influxdb") as Number)
   logInfo("Tageswerte", "Test 7 deltaSince {} wh", Lesekopf3Channel0.deltaSince(now.withHour(0).withMinute(4).withSecond(0), "influxdb") as Number)
   logInfo("Tageswerte", "Test 8 deltaSince  {} wh", Lesekopf3Channel0.deltaSince(start_of_day, "influxdb") as Number)

As you can see in the log, everything up to Test 6 works. Test 7 is the deltaSince, and just before it I get this strange "Could not find item ‘null’ in registry". Why is the item null here when it isn't with average? I am lost.

2021-06-13 19:56:46.421 [INFO ] [openhab.core.model.script.Tageswerte] - Test 1 2021-06-13T00:00+02:00[Europe/Berlin]
2021-06-13 19:56:46.429 [INFO ] [openhab.core.model.script.Tageswerte] - Test 2 average 1196.5195
2021-06-13 19:56:46.429 [INFO ] [openhab.core.model.script.Tageswerte] - Test 3 Zeit-Min 2021-06-13T00:00+02:00[Europe/Berlin]
2021-06-13 19:56:46.429 [INFO ] [openhab.core.model.script.Tageswerte] - Test 4 Zeit-Midnight 2021-06-13T00:00+02:00[Europe/Berlin]
2021-06-13 19:56:46.429 [INFO ] [openhab.core.model.script.Tageswerte] - Test 5 Zeit-Now 2021-06-13T00:00:00.429519+02:00[Europe/Berlin] wh
2021-06-13 19:56:46.437 [INFO ] [openhab.core.model.script.Tageswerte] - Test 6 average 24179137.88801619 wh
2021-06-13 19:56:46.458 [INFO ] [b.internal.InfluxDBStateConvertUtils] - Could not find item ‘null’ in registry
2021-06-13 19:56:46.459 [INFO ] [openhab.core.model.script.Tageswerte] - Test 7 deltaSince null wh
2021-06-13 19:56:46.480 [INFO ] [b.internal.InfluxDBStateConvertUtils] - Could not find item ‘null’ in registry
2021-06-13 19:56:46.480 [INFO ] [openhab.core.model.script.Tageswerte] - Test 8 deltaSince null wh

Thanks for your support on how to use deltaSince with openHAB 3.1 and InfluxDB 2.


Noting that the average has units, this may be a case of poor handling of Quantity types by deltaSince.

Not sure I understand - the unit ("wh") is just part of the log command, i.e. plain text. This is the database:

Oh yes, I’d missed that.

(For background, units (if any) are never stored in your database, but the persistence service will try to reconstruct them if your existing Item has them.)

Ok - good to know.

I tried another way:


It works in a strange way: it takes the value from the moment when I upgraded from InfluxDB 1 / openHAB 2.5 to InfluxDB 2 / openHAB 3.1. There are new fields now that didn't exist before, so even if I say minusDays(1), it takes a value from the end of May.

It seems that it cannot read the new part of the database with the persistence command. Are there new commands for InfluxDB 2?

Woah, hang on - you've got some hybrid InfluxDB 2 database populated with data from InfluxDB 1? How did you migrate the old data?

I simply used the included upgrade function from InfluxDB inside a Docker container. Then I installed openHAB 3 in a Docker container and connected it to this upgraded InfluxDB database. It ran out of the box, except for the rules with deltaSince.

I am still searching for a solution. With a new item, deltaSince works, but not with an item from openHAB 2.5. I tried the latest snapshot because I thought that issue #10680 solved this, but that fix was already included in M5, and there has been no change to the InfluxDB persistence since then.

The fix was only for InfluxDB 1, because it applied to data stored before 3.0.0 that didn't include the item tag.
And as InfluxDB 2 support was introduced in 3.0.0, in theory it doesn't apply, because there can't be such data. Can you update your data to include the item tag where it isn't present?
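One possible way to backfill the tag, sketched in Flux and entirely untested: read the points that lack the "item" tag, add it with set(), and write them back with to(). The bucket and measurement names are taken from the queries later in this thread, and this is an assumption of how such a migration could look, not a verified procedure. Note that to() writes new, tagged copies of the points; the original untagged points remain in the bucket and would still have to be removed (e.g. with the InfluxDB delete API), so back up the bucket first.

```flux
// Sketch only (untested): backfill the "item" tag on pre-upgrade points.
// Assumes bucket "openhab_db/autogen" and measurement "Lesekopf3Channel0".
from(bucket: "openhab_db/autogen")
    |> range(start: -100y)
    |> filter(fn: (r) => r["_measurement"] == "Lesekopf3Channel0")
    |> filter(fn: (r) => not exists r.item)          // only points without the tag
    |> set(key: "item", value: "Lesekopf3Channel0")  // add the missing tag
    |> to(bucket: "openhab_db/autogen")              // write back as a tagged series
```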

That is what I have also been trying over the last few days, but I have to say I am a beginner with InfluxDB. All the documentation I found was for InfluxDB 1 and didn't work. Do you have a good link where I can continue?

What I still don't understand is why there should be a difference between V1 and V2: if you upgrade from 2.5 to 3.1, you will have missing item names for old data in both versions. Or did I do something wrong? Is there a special way to upgrade openHAB?

Additionally, here is the query from the log:

 	|> range(start:-100y, stop:2021-06-19T22:00:00.000000000Z)
	|> filter(fn: (r) => r["_measurement"] == "Lesekopf3Channel0")
	|> sort(desc:true, columns:["_time"])
	|> limit(n:1, offset:0)

The query does not use "item" in the filter, but limit delivers more than one result (one with the item tag and one without), even though there is a limit(n:1).
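If I understand the Flux table model correctly, limit() is applied per output table, and results are split into tables by their tag values, which is why the tagged and untagged points each contribute a row. An untested sketch of a query that merges the tables first, so that limit(n:1) really returns a single newest row:

```flux
from(bucket: "openhab_db/autogen")
    |> range(start: -100y, stop: 2021-06-19T22:00:00.000000000Z)
    |> filter(fn: (r) => r["_measurement"] == "Lesekopf3Channel0")
    |> group()                               // merge all tables (tagged and untagged) into one
    |> sort(desc: true, columns: ["_time"])
    |> limit(n: 1, offset: 0)
```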

Hi @Asmodeon,
Currently it's not used in the query if it's not needed, but it's always expected to be present in the results.
Stefan has made a PR to allow observations without the item tag.

When he attaches a compiled version, can you verify and confirm that it works well for both your old data and the new data?

Hello @lujop
Of course - I will test it when Stefan has it ready. Thanks for your support.

Stefan has already uploaded the updated binding to the PR.
Just remember to rename it to .jar, as he said.

I put it into the addons folder, stopped the old one and tried to start the new one, but it failed to start:

272 │ Active    │  80 │         │ openHAB Add-ons :: Bundles :: Go-eCharger Binding
273 │ Installed │  80 │         │ openHAB Add-ons :: Bundles :: Persistence Service :: InfluxDB
274 │ Resolved  │  80 │ 3.1.0.M5│ openHAB Add-ons :: Bundles :: Persistence Service :: InfluxDB
openhab> bundle:start 273
Error executing command: Error executing command on bundles:
        Error starting bundle 273: Could not resolve module: org.openhab.persistence.influxdb [273]
  Unresolved requirement: Import-Package: com.influxdb.client

I installed the latest snapshot and it is working again, for old items as well as new items.

@lujop I have to open this topic again, because something is still strange that I didn't realise on the first day. deltaSince now calculates the difference only against entries without the "item" field filled. For me, that means every day the difference to 28.05.2021, when I did the upgrade.

This is the query:

from(bucket: "openhab_db/autogen")
 	|> range(start:-100y, stop:2021-06-24T22:00:00.000000000Z)
	|> filter(fn: (r) => r["_measurement"] == "Lesekopf3Channel0")
	|> sort(desc:true, columns:["_time"])
	|> limit(n:1, offset:0)

and this is the result:

As you can see, the query gives two results, and with "descending" it takes the old one. So it permanently calculates the difference from the 28th.

Of course, with desc:false it doesn't work either, because it then takes the first entry from last year.

I have no idea how to solve this. Any ideas?

I think I have a solution. If all unneeded columns are dropped in the query, the result looks good to me:

from(bucket: "openhab_db/autogen")
 	|> range(start:-100y, stop:2021-06-24T22:00:00.000000000Z)
	|> filter(fn: (r) => r["_measurement"] == "Lesekopf3Channel0")
    |> drop(columns: ["item", "category", "label", "type"])
    |> sort(desc:true, columns:["_time"])
    |> limit(n:1, offset: 0)

This is the result:

Could you please check whether this could be a solution without causing problems for others?

I'm not able to read the results from your screenshot.
Can you send the results shown as a table in the Data Explorer of the InfluxDB UI?

Here is the table result of the first query.

I think it's the same as what you get.

Don't be surprised that there are four entries on the left: it's not only "with item" and "without item"; I also experimented with the additional fields as a test to solve the problem. That didn't work, but those entries are now in the database. In the end, anyone who changes how many fields are written to the table will have the same problem as me.

Here is the result for the second query.

There is nothing on the left, because there is only one version due to dropping all the additional columns.

Thanks @Asmodeon
I see now. The problem is that limit works individually on each output table, and by default a table is created for each series, i.e. each distinct group key, which includes the tags.

Having only the desired columns can be a solution, but rather than a drop, a keep would be better.
However, I want to check in more detail that nothing is broken for other cases.
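A sketch of how that keep() variant of the query might look (untested; the column names are taken from the drop() example above). Since keep() discards every column not listed, including any tags added in the future, the group key can no longer split the result into multiple tables:

```flux
from(bucket: "openhab_db/autogen")
    |> range(start: -100y, stop: 2021-06-24T22:00:00.000000000Z)
    |> filter(fn: (r) => r["_measurement"] == "Lesekopf3Channel0")
    |> keep(columns: ["_time", "_value"])   // keep only what is needed; also drops the group key
    |> sort(desc: true, columns: ["_time"])
    |> limit(n: 1, offset: 0)
```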