JDBC MariaDB problem persisting String items

I run into trouble when I try to persist a String item into a MariaDB database via the JDBC persistence service. All other items are stored perfectly; only items of type String are not working.

Item:
String           zzEskalationMessage "TestMessage [%s]"

Rule:
 zzEskalationMessage.postUpdate("Testing something")

Persistence:
Items {
    * : strategy = everyChange, everyUpdate
}

Error message in the log:

2018-02-08 12:02:33.109 [ERROR] [org.knowm.yank.Yank                 ] - Error in SQL query!!!
java.sql.SQLException: (conn=25) Table 'openhab.zzeskalationmessage_0172' doesn't exist Query: INSERT INTO zzeskalationmessage_0172 (TIME, VALUE) VALUES( NOW(3), ? ) ON DUPLICATE KEY UPDATE VALUE= ? Parameters: [Meldung der Eskalation 02, Meldung der Eskalation 02]
        at org.apache.commons.dbutils.AbstractQueryRunner.rethrow(AbstractQueryRunner.java:392) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.apache.commons.dbutils.QueryRunner.update(QueryRunner.java:491) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.apache.commons.dbutils.QueryRunner.update(QueryRunner.java:457) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.knowm.yank.Yank.execute(Yank.java:194) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.knowm.yank.Yank.execute(Yank.java:177) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.openhab.persistence.jdbc.db.JdbcBaseDAO.doStoreItemValue(JdbcBaseDAO.java:332) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.openhab.persistence.jdbc.internal.JdbcMapper.storeItemValue(JdbcMapper.java:141) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.openhab.persistence.jdbc.internal.JdbcPersistenceService.store(JdbcPersistenceService.java:133) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.openhab.core.persistence.internal.PersistenceServiceDelegate.store(PersistenceServiceDelegate.java:55) [201:org.openhab.core.compat1x:2.2.0]
        at org.eclipse.smarthome.core.persistence.internal.PersistenceManagerImpl.handleStateEvent(PersistenceManagerImpl.java:131) [114:org.eclipse.smarthome.core.persistence:0.10.0.b1]
        at org.eclipse.smarthome.core.persistence.internal.PersistenceManagerImpl.stateUpdated(PersistenceManagerImpl.java:437) [114:org.eclipse.smarthome.core.persistence:0.10.0.b1]
        at org.eclipse.smarthome.core.items.GenericItem$1.run(GenericItem.java:232) [109:org.eclipse.smarthome.core:0.10.0.b1]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:?]
        at java.lang.Thread.run(Thread.java:748) [?:?]
2018-02-08 12:02:33.109 [ERROR] [org.knowm.yank.Yank                 ] - Error in SQL query!!!
java.sql.SQLException: (conn=25) Table 'openhab.zzeskalationmessage_0172' doesn't exist Query: INSERT INTO zzeskalationmessage_0172 (TIME, VALUE) VALUES( NOW(3), ? ) ON DUPLICATE KEY UPDATE VALUE= ? Parameters: [Meldung der Eskalation 02, Meldung der Eskalation 02]
        at org.apache.commons.dbutils.AbstractQueryRunner.rethrow(AbstractQueryRunner.java:392) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.apache.commons.dbutils.QueryRunner.update(QueryRunner.java:491) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.apache.commons.dbutils.QueryRunner.update(QueryRunner.java:457) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.knowm.yank.Yank.execute(Yank.java:194) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.knowm.yank.Yank.execute(Yank.java:177) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.openhab.persistence.jdbc.db.JdbcBaseDAO.doStoreItemValue(JdbcBaseDAO.java:332) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.openhab.persistence.jdbc.internal.JdbcMapper.storeItemValue(JdbcMapper.java:141) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.openhab.persistence.jdbc.internal.JdbcPersistenceService.store(JdbcPersistenceService.java:133) [228:org.openhab.persistence.jdbc:1.11.0]
        at org.openhab.core.persistence.internal.PersistenceServiceDelegate.store(PersistenceServiceDelegate.java:55) [201:org.openhab.core.compat1x:2.2.0]
        at org.eclipse.smarthome.core.persistence.internal.PersistenceManagerImpl.handleStateEvent(PersistenceManagerImpl.java:131) [114:org.eclipse.smarthome.core.persistence:0.10.0.b1]
        at org.eclipse.smarthome.core.persistence.internal.PersistenceManagerImpl.stateChanged(PersistenceManagerImpl.java:432) [114:org.eclipse.smarthome.core.persistence:0.10.0.b1]
        at org.eclipse.smarthome.core.items.GenericItem$1.run(GenericItem.java:234) [109:org.eclipse.smarthome.core:0.10.0.b1]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:?]
        at java.lang.Thread.run(Thread.java:748) [?:?]

In the items table of the JDBC database the item exists:

[screenshot of the items table]

but no table is created for the item itself. The last one in alphabetical order is an item starting with 'w':

[screenshot of the table list]
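To check from the SQL side which item tables were actually created, a query like the following against information_schema should list them (assuming the database name openhab, which appears in the error log — adjust if yours differs):

USE information_schema;

SELECT table_name
  FROM `TABLES`
 WHERE table_schema = 'openhab'
 ORDER BY table_name;

The missing zzeskalationmessage_0172 table should be absent from this list until the fix below is applied.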

I found it myself. It looks like MariaDB cannot handle the large default column size that the JDBC persistence service uses for String items, so the table creation fails.

If I change this in jdbc.cfg:

sqltype.STRING        =   VARCHAR(4096)

it works.
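Note that changing sqltype.STRING only affects tables created from then on. For an item whose table creation already failed, you could also create the table by hand. This is a sketch based on the column names and the NOW(3) timestamp visible in the INSERT statement from the error log — the exact schema the binding would generate may differ:

CREATE TABLE zzeskalationmessage_0172 (
  TIME  TIMESTAMP(3) NOT NULL,
  VALUE VARCHAR(4096),
  PRIMARY KEY (TIME)
);

After that, the INSERT ... ON DUPLICATE KEY UPDATE from the log should succeed, since the primary key on TIME is what ON DUPLICATE KEY relies on.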

One additional tip: I changed the collation of the MariaDB database to utf8_general_mysql500_ci. Now it works …


And my tip is to execute something like the following to do that :wink:

USE information_schema;

SELECT concat("ALTER DATABASE `",table_schema,
              "` CHARACTER SET = utf8 COLLATE = utf8_general_mysql500_ci;") as _sql
  FROM `TABLES`
 WHERE table_schema like "<db_name>"
 GROUP BY table_schema;

SELECT concat("ALTER TABLE `",table_schema,"`.`",table_name,
              "` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_mysql500_ci;") as _sql
  FROM `TABLES`
 WHERE table_schema like "<db_name>"
 GROUP BY table_schema, table_name;

Replace <db_name> with the name of the database in use. The first statement generates the command to change the database collation; the second generates commands to convert all tables in that database. Note that these queries only generate the ALTER statements (in the _sql column) — you still need to execute the generated output.
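To verify the collation change afterwards, you can query information_schema again (replace <db_name> as before):

SELECT DEFAULT_CHARACTER_SET_NAME, DEFAULT_COLLATION_NAME
  FROM information_schema.SCHEMATA
 WHERE SCHEMA_NAME = "<db_name>";

Both columns should now report the utf8 character set and the utf8_general_mysql500_ci collation.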

But thank you very much, this solved my problem.