I’m very new to openHAB and trying to develop a custom processing-pipeline service.
The pipeline will be a pluggable system which should be configurable by the user.
Examples of plug-ins for this pipeline would be a filter or a data-type transformer.
The purpose of this pipeline service is to route all events from openHAB through it,
do some filtering, transformation and processing,
and at the end of the pipeline push some events/commands back onto the openHAB event bus.
Additionally, I need read and write access to the persistence layer in each plug-in.
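To make the idea concrete, here is a minimal plain-Java sketch of what I mean by a pluggable pipeline (all names here are my own, not openHAB API): each stage can either drop an event (filter) or rewrite it (transformer), and whatever survives all stages would be pushed back to the event bus.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.function.Function;

// Sketch only: events are plain strings here; in openHAB they would be
// real event/state objects.
public class Pipeline {
    // A stage takes an event and returns either a (possibly changed)
    // event, or Optional.empty() to drop it from the pipeline.
    interface Stage extends Function<String, Optional<String>> {}

    private final List<Stage> stages = new ArrayList<>();

    Pipeline add(Stage s) { stages.add(s); return this; }

    // Run one event through all stages; empty means it was filtered out.
    Optional<String> process(String event) {
        Optional<String> current = Optional.of(event);
        for (Stage s : stages) {
            if (!current.isPresent()) break;   // already dropped
            current = s.apply(current.get());
        }
        return current;
    }

    public static void main(String[] args) {
        Pipeline p = new Pipeline()
            .add(e -> e.contains("Door") ? Optional.of(e) : Optional.empty()) // filter
            .add(e -> Optional.of(e.toUpperCase()));                          // transformer
        System.out.println(p.process("FrontDoor OPEN").orElse("dropped"));
        System.out.println(p.process("Kitchen_Temp 21").orElse("dropped"));
    }
}
```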
An example use case: filter all events concerning doors, do some processing defined by the user,
and push an action/event back to openHAB which switches the lights in the relevant rooms on or off.
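The door example above could look something like this as a single stage (the item-naming convention and all names are my own assumptions, purely for illustration):

```java
import java.util.Optional;

// Hypothetical sketch of the door use case: a door event is filtered by
// item name and mapped to a command for the light in the same room.
public class DoorToLight {
    static Optional<String> lightCommand(String itemName, String state) {
        if (!itemName.endsWith("_Door")) {
            return Optional.empty();            // filter: ignore non-door events
        }
        String light = itemName.replace("_Door", "_Light");
        String cmd = "OPEN".equals(state) ? "ON" : "OFF";
        return Optional.of(light + "=" + cmd);  // would be posted back to the event bus
    }

    public static void main(String[] args) {
        System.out.println(lightCommand("Hall_Door", "OPEN"));   // Optional[Hall_Light=ON]
        System.out.println(lightCommand("Hall_Temp", "21.5"));   // Optional.empty
    }
}
```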
I have looked into the sources of the MySQL persistence bundle and created a service similar to it.
I configured it like a persistence bundle which pushes all events to my pipeline service,
and all events are indeed routed into the pipeline.
So far this is a good starting point for accessing all events,
but persistence and the pipeline won’t work at the same time.
I admit this is the wrong way to do it, but I do not know how to do it the right way,
and that is where I need your help.
How can I create a service which registers itself, or can be configured, to receive all events
without preventing the persistence bundle from doing its work?
Which are the relevant classes i need to extend?
How can I read from and write to the persistence service/database in each of the pipeline’s plug-ins?
Thanks in advance!