Starting centralised logfile analysis with the ELK stack (Elasticsearch, Logstash, Kibana)

I’m starting to look into centralised logfile analysis with the ELK stack (Elasticsearch, Logstash, Kibana).
There’s a Tutorial post here:

But I think logging has changed since OH2. Is anyone willing to share how it can be done with OH4 and ELK? Thanks!

The only thing that has changed is that it uses XML for the config instead of the .properties file, but the parameters are all the same.

You can probably figure it out just by looking at your existing log4j2.xml file. But if not, see Log4j – Configuring Log4j 2.

Note, other options for centralized logging with openHAB:

There are of course others.

The ELK stack is pretty heavyweight, and Elasticsearch just by itself is going to consume a lot of resources. I don’t know about these other approaches.


OK, I just found a Docker Compose file for the whole ELK stack in one go. The others also seem nice; what I need is just a dashboard showing ERRORs and perhaps WARNings, and based on that perhaps an email if errors keep showing up. Let’s see. Thanks for pointing those out.

I revisited this again and did the following (on OH 4.2.1 running on openHABian / Pi4):

  1. install “LOG4J2 Extra” from the marketplace
  2. set up ELK in Docker
  3. configure the JSON input for Logstash
  4. edit log4j2.xml to send the openhab.log to Logstash and from there on to Elasticsearch/Kibana

ad 1)
a simple install from the marketplace UI.

ad 2)
using the following docker compose:

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.15.0
    container_name: elasticsearch
    environment:
      - xpack.security.enabled=false
      - discovery.type=single-node
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:8.15.0
    container_name: kibana
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch

  logstash:
    image: docker.elastic.co/logstash/logstash:8.15.0
    container_name: logstash
    volumes:
      - /YOUR-PATH-TO/ELK/logstash/config:/usr/share/logstash/config
    ports:
      - "5000:5000"
    command: logstash -f /usr/share/logstash/config/logstash.conf
    links:
      - elasticsearch
    depends_on:
      - elasticsearch

Caveat: this configures ELK without security. If you’re not alone on your local network, please change accordingly, for example:
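A minimal sketch of what that change could look like, using the security-related environment variables the official Elastic images support (the password here is a placeholder; with security enabled you also have to set the kibana_system password, e.g. with elasticsearch-reset-password inside the container, and give the Logstash output credentials/TLS to match):

```yaml
services:
  elasticsearch:
    environment:
      - xpack.security.enabled=true
      - ELASTIC_PASSWORD=change-me          # placeholder: pick your own

  kibana:
    environment:
      # Kibana must not use the elastic superuser; use the built-in kibana_system user
      - ELASTICSEARCH_USERNAME=kibana_system
      - ELASTICSEARCH_PASSWORD=change-me    # placeholder: set via elasticsearch-reset-password
```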

ad 3)
my logstash.conf

input {
  tcp {
    port => 5000
    codec => json
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}

Change this if the hostname “elasticsearch” doesn’t resolve in your Docker or physical setup.
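To smoke-test the pipeline before touching openHAB, you can push a hand-crafted event straight into the Logstash tcp input. A sketch (the host/port are from the compose file above, and the field names only approximate what Log4j2’s JSONLayout emits; inspect a real event in Kibana to confirm):

```python
import json
import socket

# Assumed endpoint from the docker compose above; adjust to your setup.
LOGSTASH_HOST = "192.168.78.20"
LOGSTASH_PORT = 5000

# A minimal event roughly in the shape JSONLayout produces (illustrative fields).
event = {
    "timeMillis": 1700000000000,
    "level": "ERROR",
    "loggerName": "org.openhab.test",
    "message": "hello from the smoke test",
}
# eventEol="true" in the appender means one JSON object per line, so mimic that.
payload = json.dumps(event) + "\n"

try:
    with socket.create_connection((LOGSTASH_HOST, LOGSTASH_PORT), timeout=2) as s:
        s.sendall(payload.encode("utf-8"))
    print("sent - the event should show up in Kibana shortly")
except OSError as e:
    print(f"could not reach logstash: {e}")
```

If the event appears in Kibana’s Discover view, the Logstash→Elasticsearch leg works and any remaining problem is on the openHAB side.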

ad 4)
in my environment it’s /var/lib/openhab/etc/log4j2.xml to edit:

  1. <Socket…> is added as the last entry in <Appenders…>
  2. the reference to it is added in <Root level="WARN" …>
<?xml version="1.0" encoding="UTF-8" standalone="no"?><Configuration monitorInterval="10">
	<Appenders>
...
		<!-- logstash appender -->
		<Socket name="JSON" protocol="tcp" host="192.168.78.20" port="5000">
			<JSONLayout compact="true" complete="false" eventEol="true" objectMessageAsJsonObject="true" />
		</Socket>
...
		<!-- Root logger configuration -->
		<Root level="WARN">
			<AppenderRef ref="LOGFILE"/>
			<AppenderRef ref="OSGI"/>
			<AppenderRef ref="JSON"/> <!-- added this -->
		</Root>
...

You could also send events.log or other logs; I don’t need the events visualized in Kibana, so I only send the “real” logs! :wink:
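If you did want the events as well, a sketch of the idea (assuming the stock openHAB log4j2.xml, where the openhab.event logger is non-additive and writes to the EVENT appender; your file may differ):

```xml
<!-- event log configuration: add the socket appender here too -->
<Logger name="openhab.event" level="INFO" additivity="false">
	<AppenderRef ref="EVENT"/>
	<AppenderRef ref="JSON"/> <!-- reuse the logstash socket appender -->
</Logger>
```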

That’s it. openHAB now sends the openhab.log entries to Logstash as well, which then populates Elasticsearch with them.

Now I have to find out how to set up alerts or decent enough monitoring in Kibana, say for ERRORs or certain WARNs.
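As a starting point, the underlying query is simple. A sketch of an Elasticsearch Query DSL body that a Kibana alert rule or a plain curl check could use (this assumes the default logstash-* index names and that the JSONLayout “level” field was dynamically mapped with a .keyword sub-field; verify the actual field names in your index first):

```json
{
  "query": {
    "bool": {
      "must": [
        { "term": { "level.keyword": "ERROR" } },
        { "range": { "@timestamp": { "gte": "now-15m" } } }
      ]
    }
  }
}
```

Run against the _search endpoint (e.g. `curl -H 'Content-Type: application/json' http://localhost:9200/logstash-*/_search -d @query.json`), a non-zero hit count in the last 15 minutes would be the trigger condition for an email.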
