What is the math precision of Blockly rules?

Hi

I have a water meter that counts pulses, at about 50 pulses per litre of flow. The meter is read by the HTTP binding every 10 minutes to give a current value, which very soon runs into the hundreds of millions. I want to measure the amount of water I use each day, so I store a starting value at midnight and then subtract it from the current value.
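The calculation itself is simple. Sketched in plain JavaScript with illustrative numbers (my real rule is built in Blockly), it amounts to:

```javascript
// Daily water usage from a raw pulse counter (illustrative values)
const PULSES_PER_LITRE = 50;

const startPulses = 312456789;   // counter reading stored at midnight
const currentPulses = 312506789; // latest reading from the HTTP binding

// Usage so far today, in litres
const litresToday = (currentPulses - startPulses) / PULSES_PER_LITRE;
console.log(litresToday); // 1000
```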

I am having problems making this work reliably. Could it be that Blockly does not have the maths precision to perform the subtraction and division (to get the result in litres) on very large numbers?

PJG

Can you show us why you say this?

I think you could give us a bit more help here - no one can see what values you are operating with or what operations you are performing.

Most of the numbers behind the scenes of openHAB will be held or used as a Java BigDecimal type at some point.
(BigDecimal can be quite precise for numbers in the ‘middling’ range, but as you get to very large or very small numbers it can become approximate.) EDIT - I’ll take that back - BigDecimal is arbitrary precision and can handle very large integers.
But within openHAB - data messages, bindings, updates here and there, calculations, your rules - I would not expect BigDecimal to be used consistently throughout.

If your “storing” of numbers involves some persistence/database, that likely has its own limitations.

Only if it’s Rules DSL or passed to an Item or the like. If it’s a local variable in the rule, it’ll be whatever the max value for that language happens to be.

Blockly “compiles” into JavaScript. The max number value for JavaScript is 1.7976931348623157e+308. I seriously doubt you are dealing with 300+ digit numbers, so this is almost certainly not your issue unless you are measuring the distance between stars in meters or something like that.
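For scale, here is a quick sketch of the relevant limits in plain JavaScript (assuming the generated rule runs on ordinary IEEE 754 double-precision numbers, which is what JavaScript uses):

```javascript
// JavaScript numbers are IEEE 754 doubles
console.log(Number.MAX_VALUE);        // 1.7976931348623157e+308 (largest finite value)
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 (~9e15, exact-integer limit)

// A counter in the hundreds of millions is far below 2^53,
// so subtraction and division stay exact:
console.log(987654321 - 123456789);        // 864197532, no rounding
console.log((987654321 - 123456789) / 50); // 17283950.64
```

Integer arithmetic only starts to lose exactness above Number.MAX_SAFE_INTEGER, around 9 × 10^15, which is millions of times larger than a counter in the hundreds of millions.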

Thank you all for the replies. Knowing that the math would not be a problem, I looked again at the rules and found a calibration constant which I had changed but must have forgotten to save. All working well now.

PJG