OH 3 no longer writes to "events.log"

I'm on build #2072, but I think the problem started earlier, around build #2065. At some point I noticed that the last entries in events.log were from the day before, so I deleted the events.log file; it was created again but stays empty. openhab.log is still being written. Here is its start (unfortunately there are many errors regarding InfluxDB):

2020-12-13 21:13:00.441 [INFO ] [.core.internal.i18n.I18nProviderImpl] - Time zone set to 'Europe/Vienna'.
2020-12-13 21:13:00.501 [INFO ] [.core.internal.i18n.I18nProviderImpl] - Location set to '48.17402269633533,14.833800856857199,272'.
2020-12-13 21:13:00.507 [INFO ] [.core.internal.i18n.I18nProviderImpl] - Locale set to 'de_AT'.
2020-12-13 21:13:00.513 [INFO ] [.core.internal.i18n.I18nProviderImpl] - Measurement system set to 'SI'.
2020-12-13 21:13:14.058 [INFO ] [el.core.internal.ModelRepositoryImpl] - Loading model 'influxdb.persist'
2020-12-13 21:13:15.387 [INFO ] [el.core.internal.ModelRepositoryImpl] - Loading model 'default.sitemap'
2020-12-13 21:13:21.470 [INFO ] [.core.model.lsp.internal.ModelServer] - Started Language Server Protocol (LSP) service on port 5007
2020-12-13 21:13:27.143 [INFO ] [org.openhab.ui.internal.UIService   ] - Started UI on port 8080
2020-12-13 21:13:27.993 [DEBUG] [rnal.discovery.AstroDiscoveryService] - Location has been changed from null to 48.17402269633533,14.833800856857199,272: Creating new discovery results
2020-12-13 21:13:28.025 [DEBUG] [rnal.discovery.AstroDiscoveryService] - Scheduled astro location-changed job every 60 seconds
2020-12-13 21:13:28.883 [DEBUG] [g.astro.internal.action.AstroActions] - Astro actions service instanciated
2020-12-13 21:13:29.803 [DEBUG] [o.internal.handler.AstroThingHandler] - Initializing thing astro:moon:local
2020-12-13 21:13:29.807 [DEBUG] [o.internal.handler.AstroThingHandler] - org.openhab.binding.astro.internal.config.AstroThingConfig@40c02ff3
2020-12-13 21:13:29.814 [DEBUG] [o.internal.handler.AstroThingHandler] - Restarting jobs for thing astro:moon:local
2020-12-13 21:13:29.820 [DEBUG] [o.internal.handler.AstroThingHandler] - Stopping scheduled jobs for thing astro:moon:local
2020-12-13 21:13:29.841 [DEBUG] [o.internal.handler.AstroThingHandler] - Scheduled org.openhab.core.internal.scheduler.SchedulerImpl$ScheduledCompletableFutureRecurring@419a5361[Not completed, 2 dependents] at midnight
2020-12-13 21:13:30.167 [DEBUG] [o.internal.handler.AstroThingHandler] - Publishing planet Moon for thing astro:moon:local
2020-12-13 21:13:30.208 [DEBUG] [enhab.binding.astro.internal.job.Job] - Scheduled Astro event-jobs for thing astro:moon:local
2020-12-13 21:13:30.402 [DEBUG] [o.internal.handler.AstroThingHandler] - Thing astro:moon:local initialized ONLINE
2020-12-13 21:13:30.495 [DEBUG] [g.astro.internal.action.AstroActions] - Astro actions service instanciated
2020-12-13 21:13:30.745 [DEBUG] [o.internal.handler.AstroThingHandler] - Initializing thing astro:sun:local
2020-12-13 21:13:30.748 [DEBUG] [o.internal.handler.AstroThingHandler] - org.openhab.binding.astro.internal.config.AstroThingConfig@51380e9e
2020-12-13 21:13:30.752 [DEBUG] [o.internal.handler.AstroThingHandler] - Restarting jobs for thing astro:sun:local
2020-12-13 21:13:30.764 [DEBUG] [o.internal.handler.AstroThingHandler] - Stopping scheduled jobs for thing astro:sun:local
2020-12-13 21:13:30.786 [DEBUG] [o.internal.handler.AstroThingHandler] - Scheduled org.openhab.core.internal.scheduler.SchedulerImpl$ScheduledCompletableFutureRecurring@278d14f2[Not completed, 2 dependents] at midnight
2020-12-13 21:13:30.946 [DEBUG] [o.internal.handler.AstroThingHandler] - Publishing planet Sun for thing astro:sun:local
2020-12-13 21:13:31.119 [DEBUG] [enhab.binding.astro.internal.job.Job] - Scheduled Astro event-jobs for thing astro:sun:local
2020-12-13 21:13:31.179 [INFO ] [e.automation.internal.RuleEngineImpl] - Rule engine started.
2020-12-13 21:13:31.256 [INFO ] [o.internal.handler.AstroThingHandler] - Scheduled Positional job astro:sun:local every 300 seconds
2020-12-13 21:13:31.263 [DEBUG] [o.internal.handler.AstroThingHandler] - Thing astro:sun:local initialized ONLINE
2020-12-13 21:13:31.345 [DEBUG] [o.internal.handler.AstroThingHandler] - Publishing planet Sun for thing astro:sun:local
2020-12-13 21:13:32.281 [INFO ] [g.discovery.internal.PersistentInbox] - Added new thing 'harmonyhub:device:Wohnzimmer:67608065' to inbox.
2020-12-13 21:13:34.380 [INFO ] [.transport.mqtt.MqttBrokerConnection] - Starting MQTT broker connection to '10.0.0.60' with clientid 36cd419c-9025-49f7-9199-95d14b1bb288
2020-12-13 21:13:34.454 [WARN ] [.MqttBrokerConnectionServiceInstance] - MqttBroker connection configuration faulty: host : You need to provide a hostname/IP!
2020-12-13 21:13:34.819 [INFO ] [.onkyo.internal.handler.OnkyoHandler] - Using configuration: ipAddress = 10.0.0.53, port = 60128, udn = b014a0b5-00b5-a0BB-b000-0009b014a0b5, refreshInterval = 0, volumeLimit = 100, volumeScale = 1.0
2020-12-13 21:13:36.143 [INFO ] [io.openhabcloud.internal.CloudClient] - Connected to the openHAB Cloud service (UUID = 55657c87-ad05-4927-b319-80d52640d478, base URL = http://localhost:8080)
2020-12-13 21:13:36.900 [WARN ] [mmon.WrappedScheduledExecutorService] - Scheduled runnable ended with an exception: 
java.lang.NullPointerException: null
	at org.openhab.binding.chromecast.internal.ChromecastStatusUpdater.updateMediaStatus(ChromecastStatusUpdater.java:153) ~[?:?]
	at org.openhab.binding.chromecast.internal.ChromecastCommander.handleRefresh(ChromecastCommander.java:115) ~[?:?]
	at org.openhab.binding.chromecast.internal.handler.ChromecastHandler$Coordinator.refresh(ChromecastHandler.java:265) ~[?:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) ~[?:?]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:834) [?:?]
2020-12-13 21:13:39.969 [WARN ] [core.voice.internal.VoiceManagerImpl] - Error saying 'Hey Süßer Openhab wurde neu gestartet und steht jetzt wieder voll zu deiner Verfügung': No TTS service can be found for voice googletts:deDEWavenetF
org.openhab.core.voice.TTSException: No TTS service can be found for voice googletts:deDEWavenetF
	at org.openhab.core.voice.internal.VoiceManagerImpl.say(VoiceManagerImpl.java:203) [bundleFile:?]
	at org.openhab.core.automation.module.media.internal.SayActionHandler.execute(SayActionHandler.java:58) [bundleFile:?]
	at org.openhab.core.automation.internal.RuleEngineImpl.executeActions(RuleEngineImpl.java:1177) [bundleFile:?]
	at org.openhab.core.automation.internal.RuleEngineImpl.runRule(RuleEngineImpl.java:985) [bundleFile:?]
	at org.openhab.core.automation.internal.TriggerHandlerCallbackImpl$TriggerData.run(TriggerHandlerCallbackImpl.java:89) [bundleFile:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:834) [?:?]
2020-12-13 21:13:45.462 [ERROR] [nal.common.AbstractInvocationHandler] - An error occurred while calling method 'QueryablePersistenceService.query()' on 'org.openhab.persistence.influxdb.InfluxDBPersistenceService@307f24a1': Value must be between 0 and 100
java.lang.IllegalArgumentException: Value must be between 0 and 100
	at org.openhab.core.library.types.PercentType.validateValue(PercentType.java:57) ~[bundleFile:?]
	at org.openhab.core.library.types.PercentType.<init>(PercentType.java:47) ~[bundleFile:?]
	at org.openhab.persistence.influxdb.internal.InfluxDBStateConvertUtils.objectToState(InfluxDBStateConvertUtils.java:128) ~[?:?]
	at org.openhab.persistence.influxdb.internal.InfluxDBStateConvertUtils.objectToState(InfluxDBStateConvertUtils.java:101) ~[?:?]
	at org.openhab.persistence.influxdb.InfluxDBPersistenceService.mapRow2HistoricItem(InfluxDBPersistenceService.java:237) ~[?:?]
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?]
	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655) ~[?:?]
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) ~[?:?]
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) ~[?:?]
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913) ~[?:?]
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578) ~[?:?]
	at org.openhab.persistence.influxdb.InfluxDBPersistenceService.query(InfluxDBPersistenceService.java:229) ~[?:?]
	at jdk.internal.reflect.GeneratedMethodAccessor74.invoke(Unknown Source) ~[?:?]
	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
	at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
	at org.openhab.core.internal.common.AbstractInvocationHandler.invokeDirect(AbstractInvocationHandler.java:154) [bundleFile:?]
	at org.openhab.core.internal.common.Invocation.call(Invocation.java:52) [bundleFile:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:834) [?:?]
2020-12-13 21:13:45.478 [ERROR] [ence.internal.PersistenceManagerImpl] - Exception occurred while querying persistence service 'influxdb': Value must be between 0 and 100
java.lang.IllegalArgumentException: Value must be between 0 and 100
	at org.openhab.core.library.types.PercentType.validateValue(PercentType.java:57) ~[bundleFile:?]
	at org.openhab.core.library.types.PercentType.<init>(PercentType.java:47) ~[bundleFile:?]
	at org.openhab.persistence.influxdb.internal.InfluxDBStateConvertUtils.objectToState(InfluxDBStateConvertUtils.java:128) ~[?:?]
	at org.openhab.persistence.influxdb.internal.InfluxDBStateConvertUtils.objectToState(InfluxDBStateConvertUtils.java:101) ~[?:?]
	at org.openhab.persistence.influxdb.InfluxDBPersistenceService.mapRow2HistoricItem(InfluxDBPersistenceService.java:237) ~[?:?]
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?]
	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655) ~[?:?]
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) ~[?:?]
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) ~[?:?]
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913) ~[?:?]
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578) ~[?:?]
	at org.openhab.persistence.influxdb.InfluxDBPersistenceService.query(InfluxDBPersistenceService.java:229) ~[?:?]
	at jdk.internal.reflect.GeneratedMethodAccessor74.invoke(Unknown Source) ~[?:?]
	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
	at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
	at org.openhab.core.internal.common.AbstractInvocationHandler.invokeDirect(AbstractInvocationHandler.java:154) [bundleFile:?]
	at org.openhab.core.internal.common.Invocation.call(Invocation.java:52) [bundleFile:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:834) [?:?]
[... the same two InfluxDB stack traces repeat identically several more times between 21:13:46 and 21:13:55 ...]
2020-12-13 21:13:55.649 [WARN ] [core.audio.internal.AudioManagerImpl] - Failed playing audio stream 'org.openhab.core.audio.FileAudioStream@4bb5d7d4' as no audio sink was found.
[... the same pair of InfluxDB stack traces appears once more at 21:13:57 ...]
2020-12-13 21:13:59.478 [INFO ] [internal.ModuleHandlerFactoryStarter] - WebPushNotificationModuleHandlerFactory started by ModuleHandlerFactoryStarter
2020-12-13 21:14:00.590 [INFO ] [hab.ui.habot.tile.internal.HABotTile] - Started HABot at /habot
2020-12-13 21:14:01.270 [WARN ] [.googletts.internal.GoogleTTSService] - Audio format OGG_OPUS is not yet supported.

The event monitor in the Developer Sidebar works, but events.log viewed with LogExpert would be more helpful for troubleshooting rules.

I rarely look at events.log, but I can confirm that sometime after December 10th logging to events.log stopped.

Looking at log4j2.xml, it looks like this is deliberate. The logging level for the event bus events has been turned down to ERROR. That makes some sense: with the new Developer Sidebar in MainUI for monitoring Item states and updates, only a minority of users still need events.log, and by default it costs a lot of writes and disk space for the majority who never look at it.

To get them back, edit log4j2.xml and change the levels back to INFO (or, even better, someone could figure out how to import a separate config at the end of log4j2.xml, so we can keep our logging customizations separate from the main config).
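
On the "separate config" idea: log4j2's XML format supports XInclude, so in principle the main file could just pull in user-maintained fragments. This is only a sketch of the upstream log4j2 mechanism, not something I have tested under openHAB's pax-logging setup, and the fragment file names are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch only: split the config with XInclude so personal overrides live in
     their own files. Whether pax-logging inside openHAB resolves XInclude has
     not been verified here; the hrefs are placeholders. -->
<Configuration xmlns:xi="http://www.w3.org/2001/XInclude">
	<xi:include href="log4j2-appenders.xml"/>
	<xi:include href="log4j2-loggers.xml"/>
</Configuration>

Each included file would carry a single root element (<Appenders> or <Loggers>) copied from the stock config, plus whatever overrides you want to keep across upgrades.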

Am I misunderstanding something?
Shouldn't it be this section, which is already set to INFO here:

  <Logger additivity="false" level="INFO" name="openhab.event">
    <AppenderRef ref="EVENT"/>
    <AppenderRef ref="OSGI"/>
  </Logger>

I, for one, look at it all the time. How do you know it's a minority of users? Have you taken a poll?

Right above that:

                <Logger level="ERROR" name="smarthome.event.ItemStateEvent"/>
                <Logger level="ERROR" name="smarthome.event.ItemAddedEvent"/>
                <Logger level="ERROR" name="smarthome.event.ItemRemovedEvent"/>
                <Logger level="ERROR" name="smarthome.event.ThingStatusInfoEvent"/>
                <Logger level="ERROR" name="smarthome.event.ThingAddedEvent"/>
                <Logger level="ERROR" name="smarthome.event.ThingRemovedEvent"/>
                <Logger level="ERROR" name="smarthome.event.InboxUpdatedEvent"/>
                <Logger level="ERROR" name="smarthome.event.RuleStatusInfoEvent"/>
                <Logger level="ERROR" name="smarthome.event.RuleAddedEvent"/>
                <Logger level="ERROR" name="smarthome.event.RuleRemovedEvent"/>
                <Logger level="ERROR" name="smarthome.event.StartlevelEvent"/>

I've been actively helping users on this forum since before we had our own forum. But you don't have to believe me, and if you don't like the change, you can file an issue to have the developers change it back.

I'm pretty sure this isn't intentional; see https://github.com/openhab/openhab-distro/commit/76b3a29951c02871d66c16b5244af94acbc4dec2. The fix is to change all occurrences of smarthome.event to openhab.event in userdata/etc/log4j2.xml.
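
For reference, the rename only touches the logger name attributes; the levels and appender references stay as they are. Before and after, using ItemStateEvent and the catch-all event logger as examples:

<!-- before (old smarthome.* logger names) -->
<Logger level="ERROR" name="smarthome.event.ItemStateEvent"/>
<Logger additivity="false" level="INFO" name="smarthome.event">
	<AppenderRef ref="EVENT"/>
	<AppenderRef ref="OSGI"/>
</Logger>

<!-- after (renamed to match the openhab.event.* names the core now uses) -->
<Logger level="ERROR" name="openhab.event.ItemStateEvent"/>
<Logger additivity="false" level="INFO" name="openhab.event">
	<AppenderRef ref="EVENT"/>
	<AppenderRef ref="OSGI"/>
</Logger>

The catch-all openhab.event logger is the one that actually feeds the EVENT appender, i.e. events.log.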

Be aware that we applied a renaming from smarthome to openhab. This affects the events too (see e.g. https://github.com/openhab/openhab-distro/pull/1206). Users may have to change it manually.

The level for ItemStateEvent was already ERROR before that change. Wouldn't that suppress the ItemChanged and related events, since those are all logged at the INFO level?

Yes, this does work; see my post above yours.

That only suppresses some of the messages, not the ones that should get logged to the events.log file, for example openhab.event.ItemStateChangedEvent and others.
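
To spell out the hierarchy: log4j2 resolves the most specific logger name first, so a per-event-type logger set to ERROR only silences that one event type, while event types without their own logger entry (like ItemStateChangedEvent) fall back to the openhab.event parent and get written to the EVENT appender at INFO. A minimal sketch of that pair:

<!-- silences only the raw openhab.event.ItemStateEvent messages -->
<Logger level="ERROR" name="openhab.event.ItemStateEvent"/>

<!-- everything else under openhab.event.*, e.g. ItemStateChangedEvent,
     inherits INFO from this parent and ends up in events.log -->
<Logger additivity="false" level="INFO" name="openhab.event">
	<AppenderRef ref="EVENT"/>
	<AppenderRef ref="OSGI"/>
</Logger>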

OK, an issue definitely needs to be filed for this then.

I don't get it; my log4j2.xml now looks like this:

<!-- openHAB specific logger configuration -->

		<Logger level="INFO" name="org.openhab"/>

		<Logger level="INFO" name="openhab.event.ItemStateEvent"/>
		<Logger level="INFO" name="openhab.event.ItemAddedEvent"/>
		<Logger level="INFO" name="openhab.event.ItemRemovedEvent"/>
		<Logger level="INFO" name="openhab.event.ItemChannelLinkAddedEvent"/>
		<Logger level="INFO" name="openhab.event.ItemChannelLinkRemovedEvent"/>
		<Logger level="INFO" name="openhab.event.ThingStatusInfoEvent"/>
		<Logger level="INFO" name="openhab.event.ThingAddedEvent"/>
		<Logger level="INFO" name="openhab.event.ThingUpdatedEvent"/>
		<Logger level="INFO" name="openhab.event.ThingRemovedEvent"/>
		<Logger level="INFO" name="openhab.event.InboxUpdatedEvent"/>
		<Logger level="INFO" name="openhab.event.RuleStatusInfoEvent"/>
		<Logger level="INFO" name="openhab.event.RuleAddedEvent"/>
		<Logger level="INFO" name="openhab.event.RuleRemovedEvent"/>
		<Logger level="INFO" name="openhab.event.StartlevelEvent"/>
		<Logger level="INFO" name="openhab.event.AddonEvent"/>

		<Logger additivity="false" level="INFO" name="openhab.event">
			<AppenderRef ref="EVENT"/>
			<AppenderRef ref="OSGI"/>
		</Logger>

But events.log is still empty, even after an OH restart.

I doubt this would be a problem with a new install; it's just an artifact of upgrading.

It worked until a few days ago, though; I've been using OH 3 for a few weeks.

I just made the changes to mine and they are working.

                <!-- openHAB specific logger configuration -->

                <Logger level="INFO" name="org.openhab"/>

                <Logger level="ERROR" name="openhab.event.ItemStateEvent"/>
                <Logger level="ERROR" name="openhab.event.ItemAddedEvent"/>
                <Logger level="ERROR" name="openhab.event.ItemRemovedEvent"/>
                <Logger level="ERROR" name="openhab.event.ThingStatusInfoEvent"/>
                <Logger level="ERROR" name="openhab.event.ThingAddedEvent"/>
                <Logger level="ERROR" name="openhab.event.ThingRemovedEvent"/>
                <Logger level="ERROR" name="openhab.event.InboxUpdatedEvent"/>
                <Logger level="ERROR" name="openhab.event.RuleStatusInfoEvent"/>
                <Logger level="ERROR" name="openhab.event.RuleAddedEvent"/>
                <Logger level="ERROR" name="openhab.event.RuleRemovedEvent"/>
                <Logger level="ERROR" name="openhab.event.StartlevelEvent"/>

                <Logger additivity="false" level="INFO" name="openhab.event">
                        <AppenderRef ref="EVENT"/>
                        <AppenderRef ref="OSGI"/>
                </Logger>

I was wrong about needing to change the logger level. It’s the logger name that needed to be changed.

The change was merged 6 days ago.

Sorry, this is a bit embarrassing, but I've been looking at your post for several minutes and can't figure out what you changed.

All that I changed in the original file was "smarthome" to "openhab" on the lines I posted above.

Interesting, my file already has "openhab" everywhere:


	<Appenders>
		<!-- Console appender not used by default (see Root logger AppenderRefs) -->
		<Console name="STDOUT">
			<PatternLayout pattern="%d{HH:mm:ss.SSS} [%-5.5p] [%-36.36c] - %m%n"/>
		</Console>

		<!-- Rolling file appender -->
		<RollingFile fileName="${sys:openhab.logdir}/openhab.log" filePattern="${sys:openhab.logdir}/openhab.log.%i" name="LOGFILE">
			<PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%-5.5p] [%-36.36c] - %m%n"/>
			<Policies>
				<SizeBasedTriggeringPolicy size="16 MB"/>
			</Policies>
		</RollingFile>

		<!-- Event log appender -->
		<RollingRandomAccessFile fileName="${sys:openhab.logdir}/events.log" filePattern="${sys:openhab.logdir}/events.log.%i" name="EVENT">
			<PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%-5.5p] [%-36.36c] - %m%n"/>
			<Policies>
				<SizeBasedTriggeringPolicy size="16 MB"/>
			</Policies>
		</RollingRandomAccessFile>

		<!-- Audit file appender -->
		<RollingRandomAccessFile fileName="${sys:openhab.logdir}/audit.log" filePattern="${sys:openhab.logdir}/audit.log.%i" name="AUDIT">
			<PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%-5.5p] [%-36.36c] - %m%n"/>
			<Policies>
				<SizeBasedTriggeringPolicy size="8 MB"/>
			</Policies>
		</RollingRandomAccessFile>

		<!-- OSGi appender -->
		<PaxOsgi filter="*" name="OSGI"/>
	</Appenders>

	<Loggers>
		<!-- Root logger configuration -->
		<Root level="WARN">
			<AppenderRef ref="LOGFILE"/>
			<AppenderRef ref="OSGI"/>
		</Root>

		<!-- Karaf Shell logger -->
		<Logger level="OFF" name="org.apache.karaf.shell.support">
			<AppenderRef ref="STDOUT"/>
		</Logger>

		<!-- Security audit logger -->
		<Logger additivity="false" level="INFO" name="org.apache.karaf.jaas.modules.audit">
			<AppenderRef ref="AUDIT"/>
		</Logger>

		<!-- openHAB specific logger configuration -->

		<Logger level="INFO" name="org.openhab"/>
       
		<Logger level="ERROR" name="openhab.event.ItemStateEvent"/>
        <Logger level="ERROR" name="openhab.event.ItemAddedEvent"/>
        <Logger level="ERROR" name="openhab.event.ItemRemovedEvent"/>
        <Logger level="ERROR" name="openhab.event.ThingStatusInfoEvent"/>
        <Logger level="ERROR" name="openhab.event.ThingAddedEvent"/>
        <Logger level="ERROR" name="openhab.event.ThingRemovedEvent"/>
        <Logger level="ERROR" name="openhab.event.InboxUpdatedEvent"/>
        <Logger level="ERROR" name="openhab.event.RuleStatusInfoEvent"/>
        <Logger level="ERROR" name="openhab.event.RuleAddedEvent"/>
        <Logger level="ERROR" name="openhab.event.RuleRemovedEvent"/>
        <Logger level="ERROR" name="openhab.event.StartlevelEvent"/>

        <Logger additivity="false" level="INFO" name="openhab.event">
                <AppenderRef ref="EVENT"/>
                <AppenderRef ref="OSGI"/>
        </Logger>

		<Logger level="ERROR" name="javax.jmdns"/>
		<Logger level="ERROR" name="org.jupnp"/>

		<!-- This suppresses all Maven download issues from the log when doing feature installations -->
		<!-- as we are logging errors ourselves in a nicer way anyhow. -->
		<Logger level="ERROR" name="org.ops4j.pax.url.mvn.internal.AetherBasedResolver"/>

		<!-- Filters known issues of pax-web (issue link to be added here). -->
		<!-- Can be removed once the issues are resolved in an upcoming version. -->
		<Logger level="OFF" name="org.ops4j.pax.web.pax-web-runtime"/>

		<!-- Filters known issues of lsp4j, see -->
		<!-- https://github.com/eclipse/smarthome/issues/4639 -->
		<!-- https://github.com/eclipse/smarthome/issues/4629 -->
		<!-- https://github.com/eclipse/smarthome/issues/4643 -->
		<!-- Can be removed once the issues are resolved in an upcoming version. -->
		<Logger level="OFF" name="org.eclipse.lsp4j"/>

		<!-- Filters warnings for events that could not be delivered to a disconnected client. -->
		<Logger level="ERROR" name="org.apache.cxf.jaxrs.sse.SseEventSinkImpl"/>

		<!-- Filters known issues of KarServiceImpl, see -->
		<!-- https://github.com/openhab/openhab-distro/issues/519#issuecomment-351944506 -->
		<!-- Can be removed once the issues are resolved in an upcoming version. -->
		<Logger level="ERROR" name="org.apache.karaf.kar.internal.KarServiceImpl"/>

		<!-- Filters warnings about unavailable ciphers when JCE is not installed, see -->
		<!-- https://github.com/openhab/openhab-distro/issues/999 -->
		<Logger level="ERROR" name="org.apache.karaf.shell.ssh.SshUtils"/>

		<!-- Filters known issues of javax.mail, see -->
		<!-- https://github.com/openhab/openhab-addons/issues/5530 -->
		<Logger level="ERROR" name="javax.mail"/>

		<!-- Added by Karaf to prevent debug logging loops, see -->
		<!-- https://issues.apache.org/jira/browse/KARAF-5559 -->
		<Logger level="WARN" name="org.apache.sshd"/>
	</Loggers>

</Configuration>

Maybe it's time to redo my installation. As always, many thanks for your patience.

I am confused now. I upgraded from M5 to RC1 this morning, and in the process my log4j2.xml was replaced (I upgrade via apt), so there is no smarthome.event left in it; from what I can see it looks exactly like you say it should. But I get nothing in events.log, and also no changes when doing "log:tail" in Karaf. I really use this all the time, and I don't think I'm alone. I'm also not really clear on where I should look in the Developer Sidebar to see these changes. Turning on the event monitor doesn't really work for this; it spews out way too much Z-Wave stuff to spot anything relevant…

Did you restart after making the changes to the log4j2.xml file? They don't get loaded automatically.
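
As an aside: upstream log4j2 can re-read its XML file on its own if the Configuration element sets monitorInterval, but I haven't checked whether the pax-logging wiring in openHAB honors that, so a restart is the safe option. Sketch only:

<!-- Sketch: with monitorInterval set, plain log4j2 re-scans this file roughly
     every 60 seconds. Not verified against openHAB's pax-logging setup. -->
<Configuration monitorInterval="60">
	<!-- Appenders and Loggers as in the full file quoted below -->
</Configuration>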

All I can really offer is this is what my file looks like and it’s working for me.

<?xml version="1.0" encoding="UTF-8" standalone="no"?><Configuration>

	<Appenders>
		<!-- Console appender not used by default (see Root logger AppenderRefs) -->
		<Console name="STDOUT">
			<PatternLayout pattern="%d{HH:mm:ss.SSS} [%-5.5p] [%-36.36c] - %m%n"/>
		</Console>

		<!-- Rolling file appender -->
		<RollingFile fileName="${sys:openhab.logdir}/openhab.log" filePattern="${sys:openhab.logdir}/openhab.log.%i" name="LOGFILE">
			<PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%-5.5p] [%-36.36c] - %m%n"/>
			<Policies>
				<SizeBasedTriggeringPolicy size="16 MB"/>
			</Policies>
		</RollingFile>

		<!-- Event log appender -->
		<RollingRandomAccessFile fileName="${sys:openhab.logdir}/events.log" filePattern="${sys:openhab.logdir}/events.log.%i" name="EVENT">
			<PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%-5.5p] [%-36.36c] - %m%n"/>
			<Policies>
				<SizeBasedTriggeringPolicy size="16 MB"/>
			</Policies>
		</RollingRandomAccessFile>

		<!-- Audit file appender -->
		<RollingRandomAccessFile fileName="${sys:openhab.logdir}/audit.log" filePattern="${sys:openhab.logdir}/audit.log.%i" name="AUDIT">
			<PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%-5.5p] [%-36.36c] - %m%n"/>
			<Policies>
				<SizeBasedTriggeringPolicy size="8 MB"/>
			</Policies>
		</RollingRandomAccessFile>

		<!-- OSGi appender -->
		<PaxOsgi filter="*" name="OSGI"/>
	</Appenders>

	<Loggers>
		<!-- Root logger configuration -->
		<Root level="WARN">
			<AppenderRef ref="LOGFILE"/>
			<AppenderRef ref="OSGI"/>
		</Root>

		<!-- Karaf Shell logger -->
		<Logger level="OFF" name="org.apache.karaf.shell.support">
			<AppenderRef ref="STDOUT"/>
		</Logger>

		<!-- Security audit logger -->
		<Logger additivity="false" level="INFO" name="org.apache.karaf.jaas.modules.audit">
			<AppenderRef ref="AUDIT"/>
		</Logger>

		<!-- openHAB specific logger configuration -->

		<Logger level="INFO" name="org.openhab"/>

		<Logger level="ERROR" name="openhab.event.ItemStateEvent"/>
		<Logger level="ERROR" name="openhab.event.ItemAddedEvent"/>
		<Logger level="ERROR" name="openhab.event.ItemRemovedEvent"/>
		<Logger level="ERROR" name="openhab.event.ThingStatusInfoEvent"/>
		<Logger level="ERROR" name="openhab.event.ThingAddedEvent"/>
		<Logger level="ERROR" name="openhab.event.ThingRemovedEvent"/>
		<Logger level="ERROR" name="openhab.event.InboxUpdatedEvent"/>
		<Logger level="ERROR" name="openhab.event.RuleStatusInfoEvent"/>
		<Logger level="ERROR" name="openhab.event.RuleAddedEvent"/>
		<Logger level="ERROR" name="openhab.event.RuleRemovedEvent"/>
		<Logger level="ERROR" name="openhab.event.StartlevelEvent"/>

		<Logger additivity="false" level="INFO" name="openhab.event">
			<AppenderRef ref="EVENT"/>
			<AppenderRef ref="OSGI"/>
		</Logger>

		<Logger level="ERROR" name="javax.jmdns"/>
		<Logger level="ERROR" name="org.jupnp"/>

		<!-- This suppresses all Maven download issues from the log when doing feature installations -->
		<!-- as we are logging errors ourselves in a nicer way anyhow. -->
		<Logger level="ERROR" name="org.ops4j.pax.url.mvn.internal.AetherBasedResolver"/>

		<!-- Filters known issues of pax-web (issue link to be added here). -->
		<!-- Can be removed once the issues are resolved in an upcoming version. -->
		<Logger level="OFF" name="org.ops4j.pax.web.pax-web-runtime"/>

		<!-- Filters known issues of lsp4j, see -->
		<!-- https://github.com/eclipse/smarthome/issues/4639 -->
		<!-- https://github.com/eclipse/smarthome/issues/4629 -->
		<!-- https://github.com/eclipse/smarthome/issues/4643 -->
		<!-- Can be removed once the issues are resolved in an upcoming version. -->
		<Logger level="OFF" name="org.eclipse.lsp4j"/>

		<!-- Filters warnings for events that could not be delivered to a disconnected client. -->
		<Logger level="ERROR" name="org.apache.cxf.jaxrs.sse.SseEventSinkImpl"/>

		<!-- Filters known issues of KarServiceImpl, see -->
		<!-- https://github.com/openhab/openhab-distro/issues/519#issuecomment-351944506 -->
		<!-- Can be removed once the issues are resolved in an upcoming version. -->
		<Logger level="ERROR" name="org.apache.karaf.kar.internal.KarServiceImpl"/>

		<!-- Filters warnings about unavailable ciphers when JCE is not installed, see -->
		<!-- https://github.com/openhab/openhab-distro/issues/999 -->
		<Logger level="ERROR" name="org.apache.karaf.shell.ssh.SshUtils"/>

		<!-- Filters known issues of javax.mail, see -->
		<!-- https://github.com/openhab/openhab-addons/issues/5530 -->
		<Logger level="ERROR" name="javax.mail"/>

		<!-- Added by Karaf to prevent debug logging loops, see -->
		<!-- https://issues.apache.org/jira/browse/KARAF-5559 -->
		<Logger level="WARN" name="org.apache.sshd"/>
	</Loggers>

</Configuration>

All I changed was replacing smarthome with openhab in those event-bus-related lines. I run in Docker, so this file doesn't get replaced on upgrade; it's the file as it existed when I first installed OH 3 a few months ago.

Bring up the Developer Sidebar.
Click on “Stream Events”

You will start to see all the events, the exact same events that get printed to events.log, scrolling by.

In fact, you see more because it also shows Item state updates, and it tells you the type of the update.

If you click on the little filter icon (top right-hand corner), you can filter the events down to only those for the Items or event types you want to see. For example, if you put in */statechanged it will only show you the Items that changed state.

You can narrow it down to just those Items and events you care about at the time.