Looks weird, indeed. When did this last work? In my experience the historic functions have been broken for years when using RRD, which is why I don't use them.
It had been working for me until my switch to the 2.5 snapshot. I went directly from 2.1 to a 2.5 snapshot.
Can you capture the query that openHAB runs against MySQL? That will help understand what's happening.
How about changing yours to >=? That does not explain the negative value, though.
2019-05-26 09:28:03.534 [DEBUG] [jdbc.internal.JdbcPersistenceService] - JDBC::query: item is Powermessurement_EG_Kuelschrank
2019-05-26 09:28:03.534 [DEBUG] [persistence.jdbc.internal.JdbcMapper] - JDBC::getHistItemFilterQuery filter='true' numberDecimalcount='3' table='item0041' item='Powermessurement_EG_Kuelschrank (Type=NumberItem, State=0.0)' itemName='Powermessurement_EG_Kuelschrank'
2019-05-26 09:28:03.534 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::getHistItemFilterQueryProvider filter = org.openhab.core.persistence.FilterCriteria@196069d5, numberDecimalcount = 3, table = item0041, simpleName = Powermessurement_EG_Kuelschrank
2019-05-26 09:28:03.535 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::query queryString = SELECT time, value FROM item0041 WHERE TIME>'2019-05-26 08:28:03' ORDER BY time ASC
2019-05-26 09:28:03.535 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::doGetHistItemFilterQuery sql=SELECT time, value FROM item0041 WHERE TIME>'2019-05-26 08:28:03' ORDER BY time ASC
2019-05-26 09:28:03.536 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '0.0', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.536 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '0.0', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.536 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '14.399999618530273', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.537 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '0.0', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.593 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '139.8000030517578', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.593 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '139.8000030517578', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.593 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '139.8000030517578', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.593 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '125.80000305175781', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.594 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '125.80000305175781', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.594 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '125.80000305175781', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.595 [DEBUG] [nhab.persistence.jdbc.db.JdbcBaseDAO] - JDBC::ItemResultHandler::handleResult getState value = '0.0', getClass = 'class java.lang.Double', clazz = 'Double'
2019-05-26 09:28:03.595 [DEBUG] [jdbc.internal.JdbcPersistenceService] - JDBC::query: query for Powermessurement_EG_Kuelschrank returned 11 rows in 61 ms
2019-05-26 09:28:03.596 [INFO ] [.smarthome.model.script.Betrieb.rule] - Verbrauch Kühlschrank Durchschnitt: -73.19130978547278
mysql> select avg(value) from item0041 where time >= '2019-05-26 08:30:00' and time <= '2019-05-26 09:25:00';
+-------------------+
| avg(value)        |
+-------------------+
| 73.74545617537065 |
+-------------------+
1 row in set (0.07 sec)
+---------------------+--------------------+
| time                | value              |
+---------------------+--------------------+
| 2019-05-26 08:30:05 |                  0 |
| 2019-05-26 08:40:04 |                  0 |
| 2019-05-26 08:45:34 | 14.399999618530273 |
| 2019-05-26 08:46:03 |                  0 |
| 2019-05-26 08:49:08 |  139.8000030517578 |
| 2019-05-26 08:50:05 |  139.8000030517578 |
| 2019-05-26 09:00:05 |  139.8000030517578 |
| 2019-05-26 09:02:59 | 125.80000305175781 |
| 2019-05-26 09:10:04 | 125.80000305175781 |
| 2019-05-26 09:20:05 | 125.80000305175781 |
| 2019-05-26 09:23:29 |                  0 |
+---------------------+--------------------+
So it seems that the same values are retrieved. I'll have a look at the calculation.
I'm wondering if this is a timezone issue. The calculation itself is OK, but it depends on the timestamps, since it is in fact a time-weighted average: each stored value is weighted by the time difference to the next one. The current item state is included as the last value, with the current timestamp. If that timestamp is in a different timezone than the stored values and this is not handled properly, a negative average can result (because now - first < 0).
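To make that concrete, here is a minimal sketch of such a time-weighted average. This is not openHAB's actual code; the function, the sample values, and the one-hour offset are illustrative assumptions chosen to mimic the data above (note the last stored value is 0, as in the log):

```python
def time_weighted_average(samples, now):
    """samples: sorted list of (timestamp_seconds, value) pairs.
    Each value is weighted by the time until the next sample (the last
    one until 'now'); the total span is now minus the first timestamp."""
    weighted_sum = 0.0
    for i, (t, v) in enumerate(samples):
        nxt = samples[i + 1][0] if i + 1 < len(samples) else now
        weighted_sum += v * (nxt - t)
    return weighted_sum / (now - samples[0][0])

samples = [(0, 100.0), (600, 100.0), (1200, 0.0)]

# Clocks agree: "now" is 1800 s after the first sample.
print(time_weighted_average(samples, 1800))   # 66.666...

# Stored timestamps read 1 h ahead (e.g. CEST parsed as if it were UTC):
# the last value is 0, so the numerator is unchanged, but the
# denominator now - first turns negative and the sign flips.
shifted = [(t + 3600, v) for t, v in samples]
print(time_weighted_average(shifted, 1800))   # -66.666...
```

The magnitude stays roughly the same while the sign flips, which matches the observed -73.19 against the expected ~73.75.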
Ah, thx.
How can I check for a timezone issue?
openHAB and the MySQL server both show the same:
So Mai 26 10:28:45 CEST 2019
I have to say that openHAB and MySQL are running in Docker containers, but as I said, the containers are running in the same timezone.
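One way to check goes a bit further than comparing date output. These commands are a sketch assuming Docker and the container names openhab and mysql (adjust to your setup); the MySQL variables queried are the standard ones:

```shell
# Compare the clock and timezone as seen inside each container:
docker exec openhab date
docker exec mysql date

# What MySQL itself believes; the session time zone is what matters
# when stored DATETIME values are compared against NOW():
docker exec -it mysql mysql -u root -p \
  -e "SELECT @@global.time_zone, @@session.time_zone, NOW(), UTC_TIMESTAMP();"
```

If NOW() and the container clocks disagree by a whole number of hours, a timezone mismatch is likely.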
Why isn’t avg() used for averageSince?
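Presumably because averageSince is time-weighted: SQL's avg() gives every stored row the same weight, no matter how long each value was actually in effect, which is wrong for change-triggered persistence. A hypothetical illustration (the sample numbers are made up):

```python
# One row with 0 W, then one row with 100 W that stays valid for the
# remaining 50 of 60 minutes.
times  = [0, 600]          # seconds; "now" is at 3600 s
values = [0.0, 100.0]
now = 3600

# What SQL's avg() would return: every row counts once.
plain = sum(values) / len(values)
print(plain)               # 50.0

# Time-weighted: each value counts for as long as it was in effect.
weighted = 0.0
for i, (t, v) in enumerate(zip(times, values)):
    nxt = times[i + 1] if i + 1 < len(times) else now
    weighted += v * (nxt - t)
weighted /= now - times[0]
print(weighted)            # 83.333...
```

With irregularly spaced samples the two results differ substantially, so averageSince cannot simply delegate to avg().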
Unfortunately there is no debug logging in that part. I have prepared a version which includes it. On the Karaf console, please find the current bundle number with bundle:list -s | grep persistence. You should see something like:
165 │ Active │ 80 │ 2.5.0.201905191647 │ org.openhab.core.model.persistence
166 │ Active │ 80 │ 2.5.0.201905191708 │ org.openhab.core.model.persistence.ide
167 │ Active │ 80 │ 2.5.0.201905191712 │ org.openhab.core.model.persistence.runtime
180 │ Active │ 80 │ 2.5.0.201905191646 │ org.openhab.core.persistence
268 │ Active │ 80 │ 1.14.0.201905210308 │ org.openhab.persistence.influxdb
269 │ Active │ 80 │ 1.14.0.201905210308 │ org.openhab.persistence.jdbc
270 │ Active │ 80 │ 1.14.0.201905210308 │ org.openhab.persistence.mapdb
271 │ Active │ 80 │ 1.14.0.201905210308 │ org.openhab.persistence.rrd4j
The org.openhab.core.model.persistence bundle is the one you are looking for; in my case the bundle number is 165. Then update the bundle with bundle:update 165 https://janessa.me/esh/org.openhab.core.model.persistence-2.5.0-SNAPSHOT.jar (change the bundle number to yours).
If you run log:set TRACE org.eclipse.smarthome.model.persistence.extensions, you should see additional logging when averageSince is called. You can turn the logging off again with log:set DEFAULT org.eclipse.smarthome.model.persistence.extensions.
This is at your own risk; if unsure, make a backup first.
My output is:
165 │ Active │ 80 │ 2.5.0.201902130920 │ org.openhab.core.model.persistence
166 │ Active │ 80 │ 2.5.0.201902130928 │ org.openhab.core.model.persistence.ide
167 │ Active │ 80 │ 2.5.0.201902130929 │ org.openhab.core.model.persistence.runtime
180 │ Active │ 80 │ 2.5.0.201902130920 │ org.openhab.core.persistence
248 │ Active │ 80 │ 1.14.0.201902170308 │ org.openhab.persistence.jdbc
My snapshot is from February; will your snapshot build be compatible with mine?
Probably yes, but I can't say for sure. There have been no changes in the persistence code since then.
Ok, do you know whether openHAB must be restarted after the update?
No. If nothing else changes, you can restart the bundle (bundle:restart 165), but that should not be necessary at all.
openhab> bundle:restart 165
Error executing command: Error restarting bundles:
Unable to start bundle 165: Could not resolve module: org.openhab.core.model.persistence [165]
Unresolved requirement: Import-Package: org.eclipse.jdt.annotation; resolution:="optional"
Unresolved requirement: Import-Package: org.eclipse.xtext.xbase.lib; version="[2.17.0,3.0.0)"
Which Xtext version is installed? Check with bundle:list | grep -i xtext. Anyway, if you are on a snapshot, is there any reason not to upgrade to a more recent version? Is there a known issue with any of the bindings you use?
108 │ Active │ 80 │ 2.14.0.v20180522-1821 │ Xtext
109 │ Active │ 80 │ 2.14.0.v20180522-1833 │ Xtext Common Types
110 │ Active │ 80 │ 2.14.0.v20180522-1821 │ Xtext IDE Core
111 │ Active │ 80 │ 2.14.0.v20180522-1821 │ Xtext Utility
There is no reason not to update to #1600.
Do you know if it is possible to update to #1600 from the Karaf console?
No, that is not possible. Xtext was upgraded in March; I forgot about that.
I did not make a backup.
Do you know how I can get back to "2.5.0.201902130920 x org.openhab.core.model.persistence"?
Or where I can find the bundle JARs?
Actually: no. I'm not aware of any location where such old snapshots are stored. I would recommend upgrading.
Ok, I'll update; hopefully there are not many breaking changes affecting my configuration.