I have an idea to develop a tool that reads the event log and writes the data to a new database - e.g. a separate InfluxDB instance - so the data can be analysed later with Grafana.
Often when analysing problems I need to know the state of many other items. This is troublesome when the time range is large, since getting the states at the interesting point in time is difficult.
Also, the items required for the analysis are often not all present in my InfluxDB setup - and I don't want to add every item to my persistence setup just because I might need it for analysis later.
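A rough sketch of what I have in mind, assuming the events.log format of a recent openHAB version (the exact line layout depends on your log4j2 pattern, so the regex would need adjusting). It only parses ItemStateChangedEvent lines; pushing the result into InfluxDB would be a separate step:

```python
import re
from datetime import datetime

# Assumed events.log line format (openHAB 3.x style), e.g.:
# 2024-05-01 12:34:56.789 [INFO ] [openhab.event.ItemStateChangedEvent] - Item 'X' changed from A to B
# Adjust the pattern to whatever your own log lines look like.
EVENT_RE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}) "
    r"\[(?P<level>\w+)\s*\] \[(?P<logger>[^\]]+)\] - "
    r"Item '(?P<item>[^']+)' changed from (?P<old>.+) to (?P<new>.+)$"
)

def parse_event_line(line):
    """Parse one ItemStateChangedEvent line; return None for other lines."""
    m = EVENT_RE.match(line.strip())
    if not m:
        return None
    return {
        "time": datetime.strptime(m.group("ts"), "%Y-%m-%d %H:%M:%S.%f"),
        "item": m.group("item"),
        "old": m.group("old"),
        "new": m.group("new"),
    }

# Each parsed dict could then be turned into a point and written to
# InfluxDB, e.g. with the influxdb-client library's write API.
```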
Before I start this I would like to hear your opinion on this - or if something like this already exists.
Thanks Michael
Analysing events? This sounds like a job for the Elastic Stack.
Here is a tutorial: Logging to Filebeat + Elasticsearch + Kibana
But I never used this myself.
And I think this approach still does store all data in a database.
So it might not solve your problem.
Hello @christoph_wempe,
thanks for this good hint. I had a look at it yesterday, but it seems that currently OH is not able to log as JSON. I found a couple of threads reporting this problem.
I’ll have a shot at it anyway.
Have a look at Loki. This is a log aggregation tool from the Grafana team and thus works very well with Grafana. I also find it much easier to set up than the above-mentioned Elastic Stack. But this might just be my personal taste…
Basically your workflow can look like this:
Events.log > promtail (part of Loki to parse the log file) > Loki (storage of data) > grafana (visualize data, I like the explore feature)
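To give an idea of the promtail side, here is a minimal scrape config sketch. The paths, labels, and the Loki URL are assumptions for a Docker setup and need to be adapted:

```yaml
# promtail-config.yml - hedged sketch, adjust paths/URL to your setup
server:
  http_listen_port: 9080

positions:
  filename: /tmp/positions.yaml   # where promtail remembers its read offset

clients:
  - url: http://loki:3100/loki/api/v1/push   # assumed Loki container address

scrape_configs:
  - job_name: openhab-events
    static_configs:
      - targets: [localhost]
        labels:
          job: openhab_events
          __path__: /openhab/userdata/logs/events.log   # assumed log path
```

With that in place you can query the stream in Grafana's Explore view, e.g. `{job="openhab_events"}`.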
Works perfectly for me as Docker containers.