This tutorial walks you through setting up OpenHab, Filebeat and Elasticsearch to enable viewing OpenHab logs in Kibana. The purpose is purely viewing the application logs rather than analysing the event logs.
There are already a couple of great guides on how to set this up using a combination of Logstash and a Log4j2 socket appender (here and here); however, I decided to use Filebeat instead, for two reasons:
- Logstash is quite heavy on resource usage (being a JRuby application running on the JVM, it is hungry for both CPU and memory)
- In my work life I have found socket appenders to be quite unreliable - every now and then they just stop working… Then you need to grep and tail your logs as if you never had centralised logging in the first place!
The following steps have been tested on OpenHab 2.4 and version 7.3 of the Elastic stack (according to the documentation this should work all the way back to 5.0, though).
The guide assumes you have already installed Elasticsearch (+ Kibana) as well as Filebeat.
We’re going to configure OH to emit a JSON log file which will then be picked up by Filebeat and sent off directly to Elasticsearch.
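For reference, once everything is wired up, each line in that log file will be a self-contained JSON document emitted by Log4j2's JSONLayout, roughly like the event below. The values are made up and the exact set of fields can vary slightly between Log4j2 versions:

```
{"timeMillis":1565000000000,"thread":"safeCall-queue","level":"WARN","loggerName":"org.example.SomeBinding","message":"Something worth logging","endOfBatch":false,"threadId":42,"threadPriority":5}
```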
- In order to enable JSON logging in OH, edit the `etc/org.ops4j.pax.logging.cfg` file (usually in `/var/lib/openhab2`) and amend the Root Logger section near the top to add the new appender ref:

```
# Root logger
log4j2.rootLogger.level = WARN
log4j2.rootLogger.appenderRefs = out, osgi, json
log4j2.rootLogger.appenderRef.out.ref = LOGFILE
log4j2.rootLogger.appenderRef.osgi.ref = OSGI
log4j2.rootLogger.appenderRef.json.ref = JSON
```
Scroll further down to where all the existing appenders are defined and add your new JSON appender:
```
# JSON log appender
log4j2.appender.json.type = RollingRandomAccessFile
log4j2.appender.json.name = JSON
log4j2.appender.json.fileName = ${openhab.logdir}/openhab.log.json
log4j2.appender.json.filePattern = ${openhab.logdir}/openhab.log.json.%i
log4j2.appender.json.immediateFlush = true
log4j2.appender.json.append = true
log4j2.appender.json.layout.type = JSONLayout
log4j2.appender.json.layout.compact = true
log4j2.appender.json.layout.eventEol = true
log4j2.appender.json.policies.type = Policies
log4j2.appender.json.policies.size.type = SizeBasedTriggeringPolicy
log4j2.appender.json.policies.size.size = 16MB
```
The `compact` and `eventEol` settings are necessary for Filebeat to be able to parse the log file as JSON.
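Before moving on, it is worth checking that events are actually being written as one JSON document per line. The path below is an assumption: use whatever `${openhab.logdir}` resolves to on your install (commonly `/var/log/openhab2`), and `jq` is optional but handy:

```
# Grab the most recent event and pretty-print it to confirm it is valid JSON
# (log path assumed -- adjust to your ${openhab.logdir})
$ tail -n 1 /var/log/openhab2/openhab.log.json | jq .
```

The logging configuration is normally picked up without a restart; if the file never appears, restarting OH is the first thing to try.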
- The Log4j2 JSONLayout adds a timestamp field named `timeMillis`, which is UNIX time in milliseconds since the epoch. This needs a little bit of processing to make it Elasticsearch / Kibana friendly, which can be done within Elasticsearch itself using an ingest pipeline with a date processor. To set it up you can use `curl`:

```
$ curl -XPUT localhost:9200/_ingest/pipeline/timeMillis -H "Content-type: application/json" -d'
{
  "description": "Parse log4j2 timeMillis into date field",
  "processors": [
    {
      "date": {
        "field": "timeMillis",
        "target_field": "@timestamp",
        "formats": [ "UNIX_MS" ]
      }
    }
  ]
}'
```
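If you want to confirm the pipeline behaves as expected before pointing Filebeat at it, Elasticsearch's simulate API lets you run a test document through it (the `timeMillis` value below is just an example):

```
$ curl -XPOST localhost:9200/_ingest/pipeline/timeMillis/_simulate -H "Content-type: application/json" -d'
{
  "docs": [
    { "_source": { "timeMillis": 1565000000000, "message": "test event" } }
  ]
}'
```

The response should contain an `@timestamp` field derived from `timeMillis`.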
- Configure Filebeat to pick up the new log file, parse it as JSON and instruct ES to use the pipeline you've just defined:
```
filebeat.inputs:
  - type: log
    paths:
      - /path/to/your/openhab/logs/openhab.log.json
    fields:
      app: "openhab"
    fields_under_root: true
    json.add_error_key: true
    json.keys_under_root: true

output.elasticsearch:
  hosts: ["your-elasticsearch-host"]
  pipeline: "timeMillis"
```
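With the config in place and Filebeat restarted, a couple of quick checks can save some head-scratching. The `filebeat test` subcommands validate the configuration and the connection to Elasticsearch; the `filebeat-*` index name below is an assumption based on Filebeat's default index naming:

```
# Validate the Filebeat configuration and the Elasticsearch output connection
$ filebeat test config
$ filebeat test output

# Confirm openHAB events are actually arriving (default filebeat-* indices assumed)
$ curl 'localhost:9200/filebeat-*/_search?q=app:openhab&size=1&pretty'
```

Once documents are coming through, create an index pattern for `filebeat-*` in Kibana (with `@timestamp` as the time field) and the OpenHab logs should show up in Discover.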