hmmm … mystic:
If there is a trigger defined on the database level, the openHAB 2 persistence service reacts “not amused” with “Error in SQL query” at the record insert statement.
Trigger works without problems. This is tested.
OpenHAB insert statement works fine, if no trigger is defined. This is tested.
What’s going on here?
The trigger action doesn’t touch the insert procedure directly. It only deletes old records from the table.
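For illustration, a sketch of what such a trigger might look like (the actual trigger definition was not posted, so the name, table, and retention interval here are assumptions). A trigger like this deletes old rows from the same table it fires on, which is exactly what MySQL later complains about:

```sql
-- Hypothetical reconstruction of the described trigger.
-- CREATE TRIGGER succeeds, but any INSERT into item0014 will then fail
-- at runtime with error 1442, because a trigger may not modify the
-- table that the triggering statement is already using.
CREATE TRIGGER prune_on_insert
BEFORE INSERT ON item0014
FOR EACH ROW
  DELETE FROM item0014 WHERE TIME < NOW() - INTERVAL 30 DAY;
```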
Based on the given information, most likely you are trying to store a text that is too long.
Try adding this line to your mysql.cfg file:
sqltype.STRING = TEXT
The string to insert is around 20 chars only.
The behavior has something to do with an active insert trigger on database side.
It would be helpful if you showed the part of the log where the error occurs.
java.sql.SQLException: Can’t update table ‘item0014’ in stored function/trigger because it is already used by statement which invoked this stored function/trigger. Query: INSERT INTO item0014 (TIME, VALUE) VALUES( NOW(3), ? ) ON DUPLICATE KEY UPDATE VALUE= ? Parameters: [9.0, 9.0]
at org.apache.commons.dbutils.AbstractQueryRunner.rethrow(AbstractQueryRunner.java:392) ~[commons-dbutils-1.6.jar:1.6]
at org.apache.commons.dbutils.QueryRunner.update(QueryRunner.java:491) ~[commons-dbutils-1.6.jar:1.6]
at org.apache.commons.dbutils.QueryRunner.update(QueryRunner.java:457) ~[commons-dbutils-1.6.jar:1.6]
at org.knowm.yank.Yank.execute(Yank.java:194) [yank-3.2.0.jar:?]
at org.knowm.yank.Yank.execute(Yank.java:177) [yank-3.2.0.jar:?]
at org.openhab.persistence.jdbc.db.JdbcBaseDAO.doStoreItemValue(JdbcBaseDAO.java:337) [bundleFile:?]
at org.openhab.persistence.jdbc.internal.JdbcMapper.storeItemValue(JdbcMapper.java:145) [bundleFile:?]
at org.openhab.persistence.jdbc.internal.JdbcPersistenceService.store(JdbcPersistenceService.java:137) [bundleFile:?]
at org.openhab.core.persistence.internal.PersistenceServiceDelegate.store(PersistenceServiceDelegate.java:59) [bundleFile:?]
at org.eclipse.smarthome.core.persistence.internal.PersistenceManagerImpl.handleStateEvent(PersistenceManagerImpl.java:137) [bundleFile:?]
at org.eclipse.smarthome.core.persistence.internal.PersistenceManagerImpl.stateChanged(PersistenceManagerImpl.java:437) [bundleFile:?]
at org.eclipse.smarthome.core.items.GenericItem$1.run(GenericItem.java:261) [bundleFile:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_65]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_65]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_65]
2020-06-20 18:38:21.230 [ERROR] [org.knowm.yank.Yank ] - Error in SQL query!!!
I don’t know much about triggers, but this looks relevant
I have seen this before.
If you have deleted an Item00 table, it will not be added again.
Maybe you should try renaming Item14 in your items file.
It doesn’t solve the problem, but it is a workaround.
Well, after I deleted the trigger it works without errors.
But that’s not a real solution.
It looks as if there is no way around going into the code of the openHAB MySQL add-on and changing the DB calls. Here the ‘ON DUPLICATE KEY UPDATE’ clause causes the problem.
I do have the knowledge to code it myself, but I wanted to avoid that; my first look at openHAB gave me the impression it could be used out of the box for my automatic garden irrigation.
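Before patching the add-on, one workaround might be worth a try (a sketch, assuming the trigger’s only job is pruning old rows, as described above; the event name and retention interval are made up): move the cleanup from a trigger to a scheduled event. Since the DELETE then runs in its own statement instead of inside the INSERT, MySQL’s restriction on modifying the triggering table no longer applies, and the add-on’s ‘ON DUPLICATE KEY UPDATE’ insert can stay as it is.

```sql
-- Make sure the event scheduler is running (requires a privileged user).
SET GLOBAL event_scheduler = ON;

-- Hypothetical replacement for the delete trigger: prune old rows
-- periodically instead of on every insert.
CREATE EVENT IF NOT EXISTS prune_item0014
ON SCHEDULE EVERY 1 HOUR
DO
  DELETE FROM item0014 WHERE TIME < NOW() - INTERVAL 30 DAY;
```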
OK … let’s see where it goes.