JSON transformation of 85x diagnostics on an MQTT topic

So I have an MQTT device whose state topic publishes 85 diagnostic key/value pairs wrapped in a single JSON object.

In PaperUI, please don’t tell me I have to create 85 Things, or 85 channels on an MQTT Thing, to link to Items?

I can no longer go back to files: I started from a file-based config, but I’ve been working in the UI for several days now, so the files are obsolete apart from a few Things that are still defined in them.

Thanks in advance.

What do you want to do with these key/value pairs? The answer depends on that.

85 Channels

But no matter how you look at it, you won’t need 85 Things. Worst case, you might need/want 85 Channels on the one Thing (assuming you care about all 85 values). Then you would link each of those Channels to Items. You don’t have to use PaperUI to click/create each of the Channels. You can use the REST API to create the first one, query for the Thing’s JSON, copy/paste/edit that JSON, and update the Thing with the result. You can see a walkthrough for how to do that at Migrating from text files to Paper UI for Things using REST API. The only difference is that you will use the “PUT /things/{thingUID}/config” endpoint to update the Thing instead of creating a new Thing.
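The copy/paste/edit step can even be scripted. Here is a minimal sketch (the channel structure, UIDs, and topic below are hypothetical — the real structure comes from whatever “GET /rest/things/{thingUID}” returns for your broker and binding version): take the one channel you created in the UI as a template, then generate the remaining entries programmatically before PUTting the Thing back.

```python
import json

# Hypothetical template, modeled on one channel created in PaperUI.
# Copy the real structure from "GET /rest/things/{thingUID}" instead.
base_channel = {
    "uid": "mqtt:topic:broker:diag:key1",
    "id": "key1",
    "channelTypeUID": "mqtt:string",
    "label": "key1",
    "configuration": {
        "stateTopic": "device/diagnostics",
        "transformationPattern": "JSONPATH:$.key1"
    }
}

def make_channel(key):
    """Clone the template channel for another key in the JSON payload."""
    ch = json.loads(json.dumps(base_channel))  # cheap deep copy
    ch["uid"] = "mqtt:topic:broker:diag:{}".format(key)
    ch["id"] = key
    ch["label"] = key
    ch["configuration"]["transformationPattern"] = "JSONPATH:$.{}".format(key)
    return ch

# Assuming the keys are key1..key85; substitute your real key names.
channels = [make_channel("key{}".format(i)) for i in range(1, 86)]
print(len(channels))  # 85 channel definitions ready to paste into the Thing JSON
```

You would then paste the generated list into the Thing’s `channels` array and PUT the result back via the REST API.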

Trigger Channel, Rules DSL Rule

Another approach is to set up the subscription as a Trigger Channel on the MQTT Broker Thing. Then create a Rule that triggers when that topic receives a message. Inside the Rule, parse out the values from the JSON (you can use the JSONPATH Transform from inside Rules: transform("JSONPATH", "$.key", jsonstring)) and postUpdate or sendCommand to your Items from the Rule. You can see an example of something similar at MQTT 2.5 Event Bus where an MQTT trigger channel drives a Rule that synchronizes Items between two instances of OH. In your case you are only subscribing to one topic so you wouldn’t supply the “Separator character” in the Trigger Channel config because all you care about is the message, not what topic it came from. And of course, instead of splitting and parsing each side of the event you would just need to transform the event String using JSONPATH.

rule "Subscribe diagnostics"
when
    Channel 'mqtt:broker:broker:diagnostics' triggered
then
    Item1.postUpdate(transform("JSONPATH", "$.key1", receivedEvent))
    Item2.postUpdate(transform("JSONPATH", "$.key2", receivedEvent))
end
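If you want to sanity-check your JSONPATH expressions before wiring up the Rule, note that for a flat payload a `$.key` path is equivalent to a plain dictionary lookup, which you can test anywhere Python runs (the payload below is made up for illustration):

```python
import json

# Made-up sample of the device's diagnostic payload
payload = '{"key1": "42", "key2": "OK"}'

parsed = json.loads(payload)
# transform("JSONPATH", "$.key1", receivedEvent) in the rule above
# extracts the same value as:
print(parsed["key1"])  # 42
```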

Trigger Channel Scripted Automation

If using Scripted Automation and a clever naming scheme you can do this with even less work. Python has a JSON library that can parse JSON and give you a dict. Then if you use a clever naming scheme for your Items you can construct the Item name from the key in the JSON and just loop through the dict and postUpdate to all 85 items in a few lines of code.

from core.rules import rule
from core.triggers import when
import json

@rule("Subscribe diagnostics")
@when("Channel 'mqtt:broker:broker:diagnostics' triggered")
def diag(event):
    parsed = json.loads(event.event)
    for key in parsed:
        events.postUpdate("Diag_{}".format(key), str(parsed[key]))

I’m certain the same can be done in JavaScript. To install and use Python for Rules see:

The above assumes you name your items “Diag_key” where “key” is the key as it exists in the JSON.
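To make the naming scheme concrete, here is how a (made-up) payload maps keys to Item names — in the Jython rule, each `print` below would instead be `events.postUpdate(item_name, str(value))`:

```python
import json

# Made-up diagnostics payload; the real one has 85 keys
payload = '{"rssi": "-67", "uptime": "1234", "voltage": "3.3"}'

for key, value in json.loads(payload).items():
    item_name = "Diag_{}".format(key)  # e.g. Diag_rssi, Diag_uptime, ...
    print(item_name, value)
```

So an Item named `Diag_rssi` receives the value of the `rssi` key, and so on for all 85 keys, with no per-key code.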

Trigger Channel, UI created Rule

If you only want to use the UI, even for Rules, I think something like this is possible. Assuming you have followed the instructions at the link above to install Python and the NGRE:

  1. Create a new Rule in PaperUI
  2. For the “When…” choose “a trigger channel fires” and choose your MQTT Broker Trigger Channel
  3. For the “Then…” choose “execute a given script”. Select “Python 2.7” as the Script type and enter the following into the script section:
    import json
    parsed = json.loads(event.event)
    for key in parsed:
        events.postUpdate("Diag_{}".format(key), str(parsed[key]))

NOTE: All code above has been just typed into the forum from my phone. They likely contain typos.

Hey Rich,
that’s perfectly detailed.
I will follow that through — it will take time, but I just wanted to reply now to acknowledge your kind, and very thorough, advice.

I’ll come back with thoughts in due course…