The documentation on this really isn’t clear, but what I’m looking to do is log all the state changes and information coming out of OpenHAB to a remote MQTT broker (Mosquitto or HiveMQ). I want it to be a secondary connection (a copy, or listener), not local persistence, which should remain separate (MongoDB or another local NoSQL database).
The remote cloud broker has other clients subscribing to it that record the messages to a centralised database and react to them in different ways, unrelated to the specific OpenHAB device.
Let’s say I have 3 network devices (Things) I want to ping every 60 seconds. I want the result of each ping to be sent as a JSON message to a remote Mosquitto instance on a VPS (I’m aware I’ll potentially need a transformation).
I have 3 Switch items set up cycling every 60s:
- Network Device 1 (Thing) -> Online (Channel) -> (link) -> Device1_Online_Status (Item)
- Network Device 2 (Thing) -> Online (Channel) -> (link) -> Device2_Online_Status (Item)
- Network Device 3 (Thing) -> Online (Channel) -> (link) -> Device3_Online_Status (Item)
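For reference, here’s the same setup in file form as I understand it (a sketch only; the hostnames and Thing IDs are placeholders I made up, using the Network binding’s `pingdevice` Thing type):

```
// network.things — Network binding ping Things, refreshed every 60 s
Thing network:pingdevice:device1 [ hostname="192.168.1.101", refreshInterval=60000 ]
Thing network:pingdevice:device2 [ hostname="192.168.1.102", refreshInterval=60000 ]
Thing network:pingdevice:device3 [ hostname="192.168.1.103", refreshInterval=60000 ]

// network.items — Switch items linked to each Thing's "online" channel
Switch Device1_Online_Status { channel="network:pingdevice:device1:online" }
Switch Device2_Online_Status { channel="network:pingdevice:device2:online" }
Switch Device3_Online_Status { channel="network:pingdevice:device3:online" }
```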
What I want is for OpenHABian to keep its own persistence information, but for the remote MQTT broker to receive 3 messages on 3 different topics (acting as both stateTopics and commandTopics), published to it by OpenHAB. Note that the topic names need to be dynamic if possible:
- room-name/network/device1/status --> state or command (R/W) as JSON payload
- room-name/network/device2/status --> state or command (R/W) as JSON payload
- room-name/network/device3/status --> state or command (R/W) as JSON payload
I also want ALL stateTopics and commandTopics sent to room-name/all.
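To illustrate the dynamic naming, this is roughly the kind of rule I imagine (a sketch only, assuming a Group item `gNetworkDevices` containing the three Switch items and a broker Thing with UID `mqtt:broker:remotebroker` — both names are my placeholders; `getActions`/`publishMQTT` come from the MQTT 2.x binding):

```
rule "Forward network status to remote broker"
when
    Member of gNetworkDevices changed
then
    val actions = getActions("mqtt", "mqtt:broker:remotebroker")
    // derive "device1" from an item named Device1_Online_Status
    val device = triggeringItem.name.split("_").get(0).toLowerCase
    val payload = '{"device":"' + device + '","status":"' + newState.toString + '"}'
    actions.publishMQTT("room-name/network/" + device + "/status", payload)
    actions.publishMQTT("room-name/all", payload)
end
```

But I’d much rather this happened through the binding itself than through a rule per device group, if that’s possible.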
I can’t seem to find a guide anywhere for this that doesn’t end in a maze of local/remote MQTT configuration. The goal is to create a stream of OpenHAB events that can be monitored remotely, while leaving the local instance free to be configured however it likes.
Is there a way to do this in PaperUI? All the examples with MQTT seem to involve config files.