I bought one of these guys: Rain Gauge Model RG-11. I configured it to simulate a tipping bucket and feed the signal to my KNX binary inputs.
I would like to implement the following functionalities with it:
1. Count the tipping bucket impulses and sum them to get the total tipping bucket count.
2. Scale the tipping bucket count by a factor to calculate the total amount of rain fallen since the counter was last started/reset. In my case the factor is 0.001 mm of rain per tick.
3. Calculate the amount of rain fallen for at least the last hour, the last day and the last month as derived values.
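Functionality 2 is just a multiplication. A minimal Python sketch (the function name is my own, and 0.001 mm/tick is the factor from my configuration; use whatever resolution you set up):

```python
def ticks_to_mm(ticks: int, mm_per_tick: float = 0.001) -> float:
    """Scale the raw tipping bucket tick count to millimetres of rain.

    mm_per_tick = 0.001 matches my RG-11 setup; adjust it if your
    gauge is configured with a different resolution.
    """
    return ticks * mm_per_tick
```

For example, a counter value of 12345 corresponds to roughly 12.345 mm of rain.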
Functionality 1 I have already solved by feeding the tipping bucket pulses to my KNX binary sensor configured as a counter, so summing the pulses since the last device start/reset is already available. I went this way because the pulses come as fast as a 50 ms pulse width, and I thought that might be too fast to feed them directly to a contact switch in OpenHAB.
Functionalities 2 and 3 would be easy if the counter were never reset: you would only have to take the minimum and maximum value of a given time interval and calculate the delta.
But here lies the rub: if the counter gets reset at least once in the given interval, the simple min/max calculation does not give the correct result. You have to detect the resets and sum up the deltas of the individual sub-intervals.
To give a better understanding I made the following example. For simplicity I only took the counter part, and I want the total count at 01:00:00 for the last hour. Using the minimum/maximum difference, I get 12 - 0 = 12 as the total count.
But taking into account that the counter was reset at approximately 00:34, I get two intervals, 0 to 10 and 0 to 12. This leads to a total counter value of 10 + 12 = 22 in the last hour, and that is the correct value.
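The reset-aware summation can be sketched like this in Python. The helper name and the sample timestamps are my own invention; I assume the counter restarts at 0 on reset, and that a sample is taken shortly after each reset (any ticks between the last pre-reset sample and the reset itself are lost):

```python
def rain_ticks(samples):
    """Sum the tick deltas over (timestamp, counter) samples, handling resets.

    A drop in the counter value is interpreted as a reset to 0, so the
    current value itself is the delta accumulated since that reset.
    """
    total = 0
    prev = None
    for _, value in samples:
        if prev is not None:
            total += value - prev if value >= prev else value
        prev = value
    return total

# The example above: counter reaches 10 by ~00:33, is reset around 00:34,
# then reaches 12 by 01:00.
samples = [
    ("00:00", 0), ("00:15", 4), ("00:33", 10),   # before the reset
    ("00:35", 0), ("00:45", 5), ("01:00", 12),   # after the reset
]

naive = max(v for _, v in samples) - min(v for _, v in samples)  # 12 (wrong)
correct = rain_ticks(samples)                                    # 22 (right)
```

The same function covers the hour, day and month windows of functionality 3; only the slice of samples you feed it changes.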