Logging to Filebeat + Elasticsearch + Kibana

This tutorial walks you through setting up OpenHab, Filebeat and Elasticsearch so you can view OpenHab logs in Kibana. The purpose is purely viewing application logs rather than analyzing the event logs.

There are already a couple of great guides on how to set this up using a combination of Logstash and a Log4j2 socket appender (here and here); however, I decided to use Filebeat instead for two reasons:

  • Logstash is quite heavy on resource usage (being a JRuby application on the JVM, it is hungry for both CPU and memory)
  • In my work life I've found socket appenders to be quite unreliable - every now and then they just stop working… Then you need to grep and tail your logs as if you'd never had centralised logging in the first place!

The following steps have been tested on OpenHab 2.4 and version 7.3 of the Elastic stack (according to the documentation this should work all the way back to 5.0, though).

The guide assumes you have already installed Elasticsearch (+ Kibana) as well as Filebeat.
We’re going to configure OH to emit a JSON log file which will then be picked up by Filebeat and sent off directly to Elasticsearch.

  1. In order to enable JSON logging in OH, edit the etc/org.ops4j.pax.logging.cfg file (usually in /var/lib/openhab2) and amend the Root Logger section near the top to add the new appender ref:

    # Root logger
    log4j2.rootLogger.level = WARN
    log4j2.rootLogger.appenderRefs = out, osgi, json
    log4j2.rootLogger.appenderRef.out.ref = LOGFILE
    log4j2.rootLogger.appenderRef.osgi.ref = OSGI
    log4j2.rootLogger.appenderRef.json.ref = JSON
    

    Scroll further down to where all the existing appenders are defined and add your new JSON appender:

    # JSON log appender
    log4j2.appender.json.type = RollingRandomAccessFile
    log4j2.appender.json.name = JSON
    log4j2.appender.json.fileName = ${openhab.logdir}/openhab.log.json
    log4j2.appender.json.filePattern = ${openhab.logdir}/openhab.log.json.%i
    log4j2.appender.json.immediateFlush = true
    log4j2.appender.json.append = true
    log4j2.appender.json.layout.type = JSONLayout
    log4j2.appender.json.layout.compact = true
    log4j2.appender.json.layout.eventEol = true
    log4j2.appender.json.policies.type = Policies
    log4j2.appender.json.policies.size.type = SizeBasedTriggeringPolicy
    log4j2.appender.json.policies.size.size = 16MB
    

    The compact and eventEol settings are necessary for Filebeat to be able to parse the log file as JSON.
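To illustrate why those two settings matter, here is a sketch with a hypothetical log line (field values are made up for illustration): with compact and eventEol enabled, every event is a single JSON object terminated by a newline, so a line-oriented consumer like Filebeat can decode each line independently.

```python
import json

# Hypothetical line as the JSONLayout appender might emit it (fields abbreviated):
# one compact JSON object per line, thanks to compact=true and eventEol=true.
line = '{"timeMillis":1565000000000,"thread":"main","level":"INFO","loggerName":"org.openhab.core","message":"Item changed"}'

# Each line parses on its own, with no multi-line pretty-printed JSON to reassemble.
event = json.loads(line)
print(event["level"])       # INFO
print(event["timeMillis"])  # 1565000000000
```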

  2. The Log4j2 JSONLayout adds a timestamp field named timeMillis, which contains milliseconds since the UNIX epoch. This needs a little bit of processing to make it Elasticsearch / Kibana friendly, which can be achieved within Elasticsearch by using an ingest pipeline with a date processor. To set it up you can use curl:

    $ curl -XPUT localhost:9200/_ingest/pipeline/timeMillis -H "Content-type: application/json" -d'{
      "description": "Parse log4j2 timeMillis into date field",
      "processors": [
        {
          "date": {
            "field": "timeMillis",
            "target_field": "@timestamp",
            "formats": [
              "UNIX_MS"
            ]
          }
        }
      ]
    }'
    
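The UNIX_MS format tells the date processor to treat the field value as epoch milliseconds. A rough sketch of the equivalent conversion in Python (the timeMillis value is just an illustration):

```python
from datetime import datetime, timezone

# timeMillis is milliseconds since the UNIX epoch; the date processor
# parses it (format UNIX_MS) and writes an ISO-8601 @timestamp.
time_millis = 1565000000000  # hypothetical example value
timestamp = datetime.fromtimestamp(time_millis / 1000, tz=timezone.utc)
print(timestamp.isoformat())  # 2019-08-05T10:13:20+00:00
```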
  3. Configure Filebeat to pick up the new log file, parse it as JSON and instruct ES to use the processor you’ve just defined:

    filebeat.inputs:
    - type: log
      paths:
        - /path/to/your/openhab/logs/openhab.log.json
      fields:
        app: "openhab"
      fields_under_root: true
      json.add_error_key: true
      json.keys_under_root: true
    output.elasticsearch:
      hosts: ["your-elasticsearch-host"]
      pipeline: "timeMillis"
    

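As a sketch of what those Filebeat options do to each event (a hypothetical emulation, not Filebeat's actual code): json.keys_under_root lifts the decoded JSON fields to the top level of the document, and fields_under_root does the same for the custom app field, so the document sent to Elasticsearch looks roughly like this:

```python
import json

# One hypothetical line from openhab.log.json.
line = '{"timeMillis":1565000000000,"level":"WARN","loggerName":"org.openhab.core","message":"Something happened"}'

# Rough emulation of the Filebeat settings above:
doc = json.loads(line)  # json.keys_under_root: true -> JSON fields at top level
doc["app"] = "openhab"  # fields + fields_under_root: true -> custom field at top level

print(json.dumps(doc, indent=2))
```

Elasticsearch then runs this document through the timeMillis pipeline before indexing it.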
:tada:


Hi there,
thanks for that! I've been trying such an approach for days - but instead of installing Filebeat on OpenHab I tried it with Logstash, with no success… This is much more straightforward!

It took me an hour of fiddling with everything until I found the line between filebeat.inputs: and - type: log which is quite important:

" enabled: true" (which is default to false)

I recommend adding it to the tutorial for the next noob :smiley:

I want to separate the logs in Kibana somehow (dashboard?) to have a nice log-per-room view. I would start in Kibana - or can you save me some hours and tell me that this split has to be done somewhere else? :slight_smile:

Hi @chopped_pork I tried to add the JSON appender, but I got an error:

org.osgi.service.log.LogService, org.knopflerfish.service.log.LogService, org.ops4j.pax.logging.PaxLoggingService, org.osgi.service.cm.ManagedService, id=33, bundle=7/mvn:org.ops4j.pax.logging/pax-logging-log4j2/1.10.1]: Unexpected problem updating configuration org.ops4j.pax.logging

When I set JSONLayout, do I need any extra configuration to write logs in JSON format?

Hello,
Is this still working in OH 2.5.2?
I've been unsuccessful configuring JSONLayout, as the Jackson library is not loaded at boot time.

I’ve recently set up the ELK stack, mainly to learn more about it and to centralise the logs from the various VMs I have running. I was happy to find this post, but it seems that either I’ve missed something or there has been a change to OH that has stopped this working.

I’m currently running OH 2.5.3, has anyone managed to get this working?