Sending sensor data to Elasticsearch to display in graphs using Kibana

I was looking for a way to take temperature data received by an RFXcom and send it to Elasticsearch, specifically to a free account on logz.io. Just to be awkward, everything had to run on a Windows box. The only other requirement was that it all had to come back without any intervention after a reboot - there’s Windows Update, after all. Looking around, I found I could count the software that could do this ‘out of the box’ on the fingers of zero hands, but I did get it working with openHAB, so I thought I’d post.

First of all, I should say that this is not an example of how things should be done; it’s more a case of cowboying things together to get something that works. With that said, here we go!

Setting up openHAB to work with the RFXcom and run as a Windows service was fairly straightforward; I just followed the docs and used Paper UI to configure it. I couldn’t find an Elasticsearch persistence module, and there are posts about sending logs to Elasticsearch, but not about persisting data, so it was time to get inventive :slight_smile:

For logz.io, Filebeat seems to be the way to go, so I just needed to get data into a log file in a form that Filebeat could parse. I couldn’t get logging.persist or exec.persist to work, although running on Windows might not have been helping there. Filebeat likes logs in JSON format and there are posts about how to do that, so I ‘adapted’ the advice there. I added the following to C:\openhab2\userdata\etc\org.ops4j.pax.logging.cfg:

# RFXCom Logger
log4j2.logger.RFXCom.name = org.openhab.binding.rfxcom
log4j2.logger.RFXCom.level = DEBUG
log4j2.logger.RFXCom.additivity = false
log4j2.logger.RFXCom.appenderRefs = RFXCom
log4j2.logger.RFXCom.appenderRef.RFXCom.ref = RFXCOM

# RFXCom log appender
log4j2.appender.RFXCom.type = RollingRandomAccessFile
log4j2.appender.RFXCom.name = RFXCOM
log4j2.appender.RFXCom.fileName = ${openhab.logdir}/rfxcom.log.json
log4j2.appender.RFXCom.filePattern = ${openhab.logdir}/rfxcom.log.json.%i
log4j2.appender.RFXCom.immediateFlush = true
log4j2.appender.RFXCom.append = true
log4j2.appender.RFXCom.layout.type = JSONLayout
log4j2.appender.RFXCom.layout.compact = true
log4j2.appender.RFXCom.layout.eventEol = true
log4j2.appender.RFXCom.policies.type = Policies
log4j2.appender.RFXCom.policies.size.type = SizeBasedTriggeringPolicy
log4j2.appender.RFXCom.policies.size.size = 16MB

This led to a java.lang.NoClassDefFoundError: com/fasterxml/jackson/databind/ser/FilterProvider error, so I started C:\openhab2\runtime\bin\client.bat and entered the following (gratefully taken from https://github.com/openhab/openhab-distro/issues/1083):

bundle:install -l 5 mvn:com.fasterxml.jackson.core/jackson-core
bundle:install -l 5 mvn:com.fasterxml.jackson.core/jackson-annotations
bundle:install -l 5 mvn:com.fasterxml.jackson.core/jackson-databind

I tried various modifications to C:\openhab2\userdata\etc\startup.properties to load the bundles, but without success. Eventually (this is naughty, I know) I found the relevant files that the bundle:install commands had downloaded:

C:\openhab2\runtime\system\com\fasterxml\jackson\core\jackson-annotations\2.9.10\jackson-annotations-2.9.10.jar
C:\openhab2\runtime\system\com\fasterxml\jackson\core\jackson-core\2.9.10\jackson-core-2.9.10.jar
C:\openhab2\runtime\system\com\fasterxml\jackson\core\jackson-databind\2.9.10\jackson-databind-2.9.10.jar

and copied them to C:\openhab2\runtime\lib\boot. I restarted the service and JSON logging burst into life.

With the configuration I’d added to org.ops4j.pax.logging.cfg above (so logging at DEBUG level) the RFXcom binding logs lines like this to C:\openhab2\userdata\logs\rfxcom.log.json:

{"thread":"Thread-64","level":"DEBUG","loggerName":"org.openhab.binding.rfxcom.internal.handler.RFXComBridgeHandler","message":"Message received: Raw data = 0A52099D070100EF340189, Packet type = TEMPERATURE_HUMIDITY, Seq number = 157, Sub type = TH9, Device Id = 1793, Temperature = 23.900000000000002, Humidity = 52, Humidity status = COMFORT, Signal level = 8, Battery level = 9","endOfBatch":false,"loggerFqcn":"org.ops4j.pax.logging.slf4j.Slf4jLogger","instant":{"epochSecond":1597499020,"nanoOfSecond":540000000},"threadId":186,"threadPriority":5}

I then set up Filebeat as per the logz.io instructions. My config file C:\ProgramData\Elastic\Beats\filebeat\filebeat.yml looks like this:

############################# Filebeat #####################################

filebeat.inputs:

- type: log
  paths:
    - C:\openhab2\userdata\logs\rfxcom.log.json
  fields:
    logzio_codec: json
    token: <removed, use your own :)>
    type: rfxcom json
  fields_under_root: true
  encoding: utf-8
  ignore_older: 3h

#For version 6.x and lower
#filebeat.registry_file: 'C:\ProgramData\Filebeat\registry'

#For version 7 and higher
filebeat.registry.path: 'C:\ProgramData\Elastic\Beats\filebeat'

#The following processors are to ensure compatibility with version 7
processors:
- rename:
    fields:
     - from: "agent"
       to: "beat_agent"
    ignore_missing: true
- rename:
    fields:
     - from: "log.file.path"
       to: "source"
    ignore_missing: true
# I added this 'dissect' section, the rest is generated by the logz.io wizard
- dissect:
    tokenizer: "%{}Message received: Raw data = %{rfxcom.raw_data}, Packet type = TEMPERATURE_HUMIDITY, Seq number = %{rfxcom.sequence_number}, Sub type = TH9, Device Id = %{rfxcom.device_id}, Temperature = %{rfxcom.temperature}, Humidity = %{rfxcom.humidity}, Humidity status = %{rfxcom.humidity_status}, Signal level = %{rfxcom.signal_level}, Battery level = %{rfxcom.battery_level}\"%{}"
    field: "message"
    target_prefix: ""

############################# Output ##########################################
output:
  logstash:
    hosts: ["listener.logz.io:5015"]  
    ssl:
      certificate_authorities: ['C:\ProgramData\Elastic\Beats\filebeat\TrustExternalCARoot_and_USERTrustRSAAAACA.crt']

Yes, I can hear the groans :slight_smile: but the ‘dissect’ section converts the log line into individual fields, and we can change the types of those in Kibana so that we can plot them. It’s cheap and cheerful, and I’m lucky that the RFXcom binding logs just the thing I need, but it works! I’m also not doing anything special with timestamps, just using the time the message is seen.
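If you want to sanity-check what the tokenizer pulls out of a log line without running Filebeat, here’s a rough stand-in in plain Node.js. It uses a regular expression rather than dissect’s tokenizer (so it’s an illustration of the same extraction, not how Filebeat actually works), and extractRfxcomFields is my own hypothetical helper name:

```javascript
// Rough stand-in for the dissect processor above: pull the same rfxcom.*
// fields out of a raw JSON log line with a regular expression.
// NOT Filebeat's actual dissect implementation - just a way to test the pattern.
function extractRfxcomFields(rawLine) {
  const re = /Message received: Raw data = ([^,]+), Packet type = TEMPERATURE_HUMIDITY, Seq number = ([^,]+), Sub type = TH9, Device Id = ([^,]+), Temperature = ([^,]+), Humidity = ([^,]+), Humidity status = ([^,]+), Signal level = ([^,]+), Battery level = ([^"]+)/;
  const m = rawLine.match(re);
  if (!m) return null; // line doesn't match, like dissect failing to tokenize
  return {
    "rfxcom.raw_data": m[1],
    "rfxcom.sequence_number": m[2],
    "rfxcom.device_id": m[3],
    "rfxcom.temperature": m[4],
    "rfxcom.humidity": m[5],
    "rfxcom.humidity_status": m[6],
    "rfxcom.signal_level": m[7],
    "rfxcom.battery_level": m[8]
  };
}
```

Running the sample log line from above through it gives rfxcom.device_id = "1793", rfxcom.temperature = "23.900000000000002" and so on. Note that everything is still a string at this point; the conversion to numeric types happens later, in Kibana.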

If it works, then in Kibana Discover messages will look like this:


(more images to come)

So, success! It’s a bit clunky and not exactly robust, but I can now get alerts from the Cloud when my freezer gets too hot. I’m still surprised that you can put one of those cheap 433.92MHz temperature sensors in a freezer and it just works, but it seems it does!


I set my types like this (this is all in Kibana):

I used a Static Lookup to map RFXcom Device IDs to names (again in Kibana):

…and then you can create visualisations, dashboards, alerts and all of the other Kibana/Grafana stuff.

The finishing touch was to get Filebeat to add a synthetic rfxcom.device_name field, mapping from the device ID. The snippet below goes after the - dissect: section in filebeat.yml:

- script:
    lang: javascript
    id: rfxcom_device_name
    source: >
      function process(event) {
        switch (event.Get("rfxcom.device_id")) {
          case "1793":
            event.Put("rfxcom.device_name", "Top floor");
            break;
          case "7427":
            event.Put("rfxcom.device_name", "Ground floor");
            break;
          case "21253":
            event.Put("rfxcom.device_name", "Freezer");
            break;
          case "21254":
            event.Put("rfxcom.device_name", "Fridge");
            break;
          case "31500":
            event.Put("rfxcom.device_name", "Shed");
            break;
          case "43010":
            event.Put("rfxcom.device_name", "Outside");
            break;
        }
      }

That makes it easier to get the labels right, especially in Grafana, and the Static Lookup configuration is no longer necessary. It’s nice to keep this mapping in one place anyway since these particular sensors tend to change their device IDs when I change the batteries :open_mouth:
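If you want to check the mapping logic without pushing real events through Filebeat, a minimal mock of the script-processor event object is enough; here the Get/Put methods backed by a plain map are my own stand-in for testing, not the real Filebeat API:

```javascript
// Minimal stand-in for Filebeat's script-processor event object,
// just enough to exercise the device-name mapping.
function makeEvent(fields) {
  return {
    fields: fields,
    Get: function (key) { return this.fields[key]; },
    Put: function (key, value) { this.fields[key] = value; }
  };
}

// Same switch-based mapping as the filebeat.yml script processor.
// (Filebeat requires the entry point to be called process; shadowing
// Node's global of the same name is harmless in this sketch.)
function process(event) {
  switch (event.Get("rfxcom.device_id")) {
    case "1793":  event.Put("rfxcom.device_name", "Top floor");    break;
    case "7427":  event.Put("rfxcom.device_name", "Ground floor"); break;
    case "21253": event.Put("rfxcom.device_name", "Freezer");      break;
    case "21254": event.Put("rfxcom.device_name", "Fridge");       break;
    case "31500": event.Put("rfxcom.device_name", "Shed");         break;
    case "43010": event.Put("rfxcom.device_name", "Outside");      break;
  }
}
```

Feed it an event with rfxcom.device_id set to "21253" and rfxcom.device_name comes back as "Freezer"; unknown IDs simply get no name, which is also what you’d see in Kibana.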

There was one more change needed to tap into the Metrics side of logz.io, where you get much longer retention times. It means changing the output of Filebeat so that it looks like Metricbeat, and using the token logz.io gives you for metrics with Metricbeat, not the one for logs with Filebeat. So filebeat.yml now looks like this:

############################# Filebeat #####################################

filebeat.inputs:
- type: log
  paths:
    - C:\openhab2\userdata\logs\rfxcom.log.json
  fields:
    logzio_codec: json
    # Token below must be the one logz.io tell you to use with Metricbeat in the Metrics section
    # of the 'Send your data' example configurations, not the general one used for logs
    token: <removed>
    type: beats
  fields_under_root: true
  encoding: utf-8
  ignore_older: 3h

filebeat.registry.path: 'C:\ProgramData\Elastic\Beats\filebeat'

processors:
- dissect:
    tokenizer: "%{}Message received: Raw data = %{}, Packet type = TEMPERATURE_HUMIDITY, Seq number = %{}, Sub type = TH9, Device Id = %{rfxcom.device_id}, Temperature = %{rfxcom.temperature}, Humidity = %{rfxcom.humidity}, Humidity status = %{}, Signal level = %{rfxcom.signal_level}, Battery level = %{rfxcom.battery_level}\"%{}"
    field: "message"
    ignore_failure: false # Set true if you're sending additional things that won't match this tokenizer
    target_prefix: ""
- script:
    lang: javascript
    id: rfxcom_to_metrics
    source: >
      function process(event) {
        // "system" and "core" below need to be names that logz.io will recognise as metrics,
        // because they're something metricbeat would typically send
        event.Put("event.module", "system");
        event.Put("metricset.name", "core");

        try {
          var deviceIdToName = {
            "1793": "TopFloor",
            "7427": "GroundFloor",
            "21253": "Freezer",
            "21254": "Fridge",
            "31500": "Shed",
            "43010": "External"
          };
          var deviceId = event.Get("rfxcom.device_id");
          if (deviceId in deviceIdToName) {
            var deviceName = deviceIdToName[deviceId];
            var batteryLevel = parseFloat(event.Get("rfxcom.battery_level"));
            var humidity = parseFloat(event.Get("rfxcom.humidity"));
            var signalLevel = parseFloat(event.Get("rfxcom.signal_level"));
            var temperature = parseFloat(event.Get("rfxcom.temperature"));

            // Prefix 'system.core' should match the names above, and values need to be numeric, not just numbers as strings
            event.Put("system.core." + deviceName + ".battery_level", batteryLevel);
            event.Put("system.core." + deviceName + ".device_id", parseInt(deviceId));
            event.Put("system.core." + deviceName + ".humidity.pct", humidity);
            event.Put("system.core." + deviceName + ".signal_level", signalLevel);
            event.Put("system.core." + deviceName + ".temperature.C", temperature);
          } else {
            event.Cancel();
          }
        } catch (e) {
          // The logz.io metrics receiver drops a lot of fields, but does keep the tags, so use those to record the error
          event.Tag("script_exception");
          event.Tag(e);
          event.Put("system.core.ScriptError", 1);
        }
      }
- drop_fields:
    # Drop fields that the receiver will drop anyway
    fields: ["logzio_codec", "input.type", "log.offset", "agent.hostname", "agent.type", "agent.id", "ecs.version", "@metadata.type", "@metadata.beat", "log.file.path", "agent.ephemeral_id", "agent.name", "agent.version", "message", "rfxcom.device_id", "rfxcom.battery_level", "rfxcom.humidity", "rfxcom.signal_level", "rfxcom.temperature"]
    ignore_missing: true

############################# Output ##########################################
output:
  logstash:
    hosts: ["listener.logz.io:5015"]  
    ssl:
      certificate_authorities: ['C:\ProgramData\Elastic\Beats\filebeat\TrustExternalCARoot_and_USERTrustRSAAAACA.crt']
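As with the earlier script, the metrics transformation can be exercised offline with a mock event. Here the Get/Put/Cancel methods backed by a plain map are my own assumption about the interface the script uses, not the real Filebeat API, and the script itself is condensed to one metric field to keep the sketch short:

```javascript
// Mock of the script-processor event, extended with Cancel() so we can
// check that events from unknown device IDs get dropped.
function makeEvent(fields) {
  return {
    fields: fields,
    cancelled: false,
    Get: function (key) { return this.fields[key]; },
    Put: function (key, value) { this.fields[key] = value; },
    Cancel: function () { this.cancelled = true; }
  };
}

// Condensed version of the metrics script: map the device ID to a name
// and emit a numeric system.core.* field (parseFloat matters - the
// receiver wants numbers, not numbers as strings).
var deviceIdToName = { "21253": "Freezer" /* ... other IDs as above ... */ };

function process(event) {
  var deviceId = event.Get("rfxcom.device_id");
  if (deviceId in deviceIdToName) {
    var deviceName = deviceIdToName[deviceId];
    event.Put("system.core." + deviceName + ".temperature.C",
              parseFloat(event.Get("rfxcom.temperature")));
  } else {
    event.Cancel(); // nothing useful to send
  }
}
```

An event with device ID "21253" and temperature "-18.5" ends up with a numeric system.core.Freezer.temperature.C of -18.5, while anything with an unrecognised ID is cancelled before it reaches the output.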
