I still get the error and don't know how to pin down its cause.
I have read that it can be caused by too many writes to the database when Items change very often (e.g. persisting on everyChange), but which Items could those be?
2020-04-15 09:05:20.912 [ERROR] [org.influxdb.impl.BatchProcessor ] - Batch could not be sent. Data will be lost
java.lang.RuntimeException: {"error":"timeout"}
at org.influxdb.impl.InfluxDBErrorHandler.handleError(InfluxDBErrorHandler.java:19) ~[influxdb-java-2.2.jar:?]
at retrofit.RestAdapter$RestHandler.invoke(RestAdapter.java:242) ~[retrofit-1.9.0.jar:?]
at org.influxdb.impl.$Proxy207.writePoints(Unknown Source) ~[?:?]
at org.influxdb.impl.InfluxDBImpl.write(InfluxDBImpl.java:151) ~[influxdb-java-2.2.jar:?]
at org.influxdb.impl.BatchProcessor.write(BatchProcessor.java:171) [influxdb-java-2.2.jar:?]
at org.influxdb.impl.BatchProcessor$1.run(BatchProcessor.java:144) [influxdb-java-2.2.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_222]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_222]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_222]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_222]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_222]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_222]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_222]
If you think that might be the cause, watch events.log. If you have an Item changing very rapidly, or several persisted Items changing very close together, you should be able to see that in events.log. Note, however, that this won't help if you are persisting on everyUpdate, since updates do not get logged to events.log.
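If events.log does reveal a fast-changing Item, one common mitigation is to persist that Item on a timed strategy instead of everyChange. A hypothetical sketch of an influxdb.persist file (the Item names CpuLoad and PowerMeter are placeholders for whatever your noisy Items turn out to be):

```
// influxdb.persist — hypothetical example
Strategies {
    everyMinute : "0 * * * * ?"
    default = everyChange
}
Items {
    // write fast-changing Items at most once per minute
    CpuLoad, PowerMeter : strategy = everyMinute
    // everything else still persists on every change
    * : strategy = everyChange
}
```

This keeps the batch sizes sent to InfluxDB small enough that the writes should no longer time out, at the cost of coarser history for the throttled Items.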
I remember that months ago I cleaned up the event log to suppress some annoying messages (CPU load and the like, every second, as far as I recall) via org.ops4j.pax.logging.cfg.
I will revert that change and check whether those Items are persisted and might be the reason…
Thanks for pointing me in the right direction.
Since the update to 7.0.0 I have been facing the following issue:
[21:15:18] XX@XX:~$ sudo systemctl status grafana-server.service
● grafana-server.service - Grafana instance
Loaded: loaded (/usr/lib/systemd/system/grafana-server.service; enabled; vendor preset: enabled)
Active: failed (Result: exit-code) since Sat 2020-05-23 21:13:22 CEST; 1min 59s ago
Docs: http://docs.grafana.org
Process: 782 ExecStart=/usr/sbin/grafana-server --config=${CONF_FILE} --pidfile=${PID_FILE_DIR}/grafana-server.pid --pac
Main PID: 782 (code=exited, status=1/FAILURE)
CPU: 0
May 23 21:13:22 openHABianPi systemd[1]: grafana-server.service: Unit entered failed state.
May 23 21:13:22 openHABianPi systemd[1]: grafana-server.service: Failed with result 'exit-code'.
May 23 21:13:22 openHABianPi systemd[1]: grafana-server.service: Service hold-off time over, scheduling restart.
May 23 21:13:22 openHABianPi systemd[1]: Stopped Grafana instance.
May 23 21:13:22 openHABianPi systemd[1]: grafana-server.service: Start request repeated too quickly.
May 23 21:13:22 openHABianPi systemd[1]: Failed to start Grafana instance.
May 23 21:13:22 openHABianPi systemd[1]: grafana-server.service: Unit entered failed state.
May 23 21:13:22 openHABianPi systemd[1]: grafana-server.service: Failed with result 'exit-code'.
I am running openHABian and did an update yesterday, since which I have been seeing this. I restarted, but still get the same error messages. As mentioned, the problem started with the version upgrade. Is anybody else facing this problem?
I am new to Grafana and need a little help.
I can insert values into InfluxDB and draw graphs in real time, but I don't know how to implement the following requirement.
Here is an example (not a real-life case):
I have a temperature sensor that I can read at any time, inserting its value into InfluxDB as I read it.
But I have a second temperature reading that arrives together with a date, and I want to insert it with that date so the graph is accurate. If I insert the second sensor's data into InfluxDB at the moment it arrives, my graph is wrong, because the data I receive is actually from a little bit in the past.
This is a bit special and openHAB does not support it. It's more of an SQL job, I think, so maybe another forum is a better place (i.e. there are more people there who can help with this problem).
I'm using a simple bash script to import daily measurements from my photovoltaic system, but I store them in MariaDB and don't use these measurements in openHAB or Grafana…
I don’t think anyone has posted an example of this because it’s usually handled outside of openHAB. As Udo indicates, you’d write a script that inserts the data into InfluxDB instead of relying on openHAB to do it.
To clarify a bit, the openHAB persistence services are not intended for general purpose database access, and do not allow arbitrary timestamping.
For openHAB, persistence is about recording (and later reading) Item states only, a snapshot of the state “now”, if you will.
You can send data directly to InfluxDB using sendHttpPostRequest and bypass persistence entirely. Then you can send anything you want, including the timestamp. There is an example here that uses a different retention policy.
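To make the timestamped-write idea concrete, here is a minimal sketch in Python that builds an InfluxDB 1.x line-protocol record with an explicit (backdated) timestamp and POSTs it to the /write endpoint. The host, database name, and measurement name are assumptions for illustration; adjust them to your setup.

```python
import time
import urllib.parse
import urllib.request


def to_line_protocol(measurement: str, value: float, ts: float) -> str:
    """Build one InfluxDB 1.x line-protocol record.

    The trailing integer is the explicit timestamp; with precision=s
    in the write URL it is interpreted as Unix seconds.
    """
    return f"{measurement} value={value} {int(ts)}"


def write_point(host: str, db: str, measurement: str,
                value: float, ts: float) -> None:
    """POST a single point to InfluxDB, stamped with ts instead of 'now'."""
    url = f"http://{host}:8086/write?" + urllib.parse.urlencode(
        {"db": db, "precision": "s"})
    body = to_line_protocol(measurement, value, ts).encode()
    req = urllib.request.Request(url, data=body, method="POST")
    with urllib.request.urlopen(req) as resp:
        # InfluxDB answers 204 No Content on a successful write
        assert resp.status == 204


# Hypothetical usage: a reading taken one hour ago
# write_point("localhost", "openhab_db", "OutdoorTemperature",
#             12.5, time.time() - 3600)
```

Because the timestamp travels with the point, the graph in Grafana will place the value at the time the measurement was actually taken, not at the time it was received.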
Grafana deprecated that plugin quite some time ago. Now there is a separate service. See Grafana Image Renderer for a way to install and use the new separate service using Docker.
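As a rough sketch of what running the renderer as a separate Docker service can look like, a minimal docker-compose file might be structured as follows. The service names, ports, and environment variable values here are assumptions based on the Grafana Image Renderer documentation; check the linked guide for the authoritative setup.

```
# docker-compose.yml — hypothetical sketch, not an official config
version: "3"
services:
  grafana:
    image: grafana/grafana
    ports:
      - "3000:3000"
    environment:
      # where Grafana finds the renderer, and how the renderer calls back
      GF_RENDERING_SERVER_URL: http://renderer:8081/render
      GF_RENDERING_CALLBACK_URL: http://grafana:3000/
  renderer:
    image: grafana/grafana-image-renderer
    ports:
      - "8081:8081"
```

The key point is that rendering no longer happens inside Grafana itself; Grafana just needs to know the URL of the renderer service.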