Inaccurate Item Values That Are Longer Than They Should Be

I am finding it hard to think of the technical term for this, so I will have to describe what is happening.

Essentially, displaying numbers as strings without any formatting does not show the expected value - e.g. 19.8 becomes 19.80000000000000071054273576010018587112426757… and 19.7 becomes 19.6999999999999992894572642398998141288757324218

I first noticed this while using SetPoint to create a heating control - where the value is increased or decreased in 0.1 degree increments. It is also happening with values received from sensors.

Formatting the value as a number displays the desired result; however, the value must still be stored incorrectly, as it affects any processing, such as calculating averages.
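
For reference, the same behaviour can be reproduced with a few lines of plain Java (an illustrative sketch, not the actual openHAB code path; openHAB runs on the JVM, so the arithmetic is identical, and the class name is made up):

    import java.math.BigDecimal;

    public class FloatDemo {
        public static void main(String[] args) {
            double value = 19.8;
            // Formatting to one decimal place shows the value you expect...
            System.out.println(String.format("%.1f", value));  // 19.8
            // ...but the exact value held by the double is slightly off:
            System.out.println(new BigDecimal(value));          // 19.80000000000000071054...
            System.out.println(new BigDecimal(19.7));           // 19.69999999999999928945...
        }
    }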

It is reminiscent of random bit-flips caused by cosmic radiation - hence the need for ECC memory.

I am running OH2 on a Raspberry Pi 2B.

There is a dirty little secret about computers most people do not realize. Computers are really, really bad at math, particularly math with floating point numbers. The root cause is that most decimal fractions (0.1, 19.7, 19.8, …) have no exact binary representation, so the computer stores the nearest value it can represent. If you have 19.8 + 0.1 the result will be really close to 19.9, but not exactly 19.9.

This is normal. This is how ALL computers work (at least all computers you are likely to encounter). This is something you usually do not have to worry about, but it occasionally does cause problems. For example, if you do something like:

if(19.8 + 0.1 == 19.9)

That if statement will NEVER evaluate to true. It can’t because the computer is simply incapable of doing floating point math that accurately.
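
You can check this yourself with a minimal plain-Java sketch (the class name is just for the example; the same arithmetic applies in a rule):

    public class CompareDemo {
        public static void main(String[] args) {
            // The sum lands on the nearest representable double, which is not the
            // same double that the literal 19.9 maps to.
            System.out.println(19.8 + 0.1);          // 19.900000000000002
            System.out.println(19.8 + 0.1 == 19.9);  // false
        }
    }

Both numbers are within a few parts in 10^15 of 19.9, but they are two different doubles, so == reports them as unequal.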

There are ways to deal with this problem, depending on what you need to do with the numbers, including the following (a short sketch of each appears after the list):

  • rounding to the nearest decimal place
  • multiplying by 10^n (where n is the number of decimal places), doing the math using long values, then dividing by 10^n and rounding (because the division will not be dead-on accurate either)
  • avoiding == and using > < comparisons where possible
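
Here is a rough sketch of those three approaches in plain Java (illustrative only - the helper names and the 0.0001 tolerance are made up for the example; the same arithmetic works in an openHAB rule):

    public class Workarounds {
        // 1. Round to the nearest decimal place (here: one decimal).
        static double roundTo1(double x) {
            return Math.round(x * 10.0) / 10.0;
        }

        // 2. Do the math in scaled long values (tenths of a degree), then convert back.
        static double addTenths(double x, long tenths) {
            long scaled = Math.round(x * 10.0) + tenths;  // integer math in tenths is exact
            return scaled / 10.0;
        }

        // 3. Avoid ==; compare with a tolerance (or use > / < where possible).
        static boolean nearlyEqual(double a, double b) {
            return Math.abs(a - b) < 0.0001;
        }

        public static void main(String[] args) {
            System.out.println(roundTo1(19.8 + 0.1));          // 19.9
            System.out.println(addTenths(19.8, 1));            // 19.9
            System.out.println(nearlyEqual(19.8 + 0.1, 19.9)); // true
        }
    }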

It is indeed impacting any calculations, but realize that the numbers are really, really close to accurate. So as long as you don’t need precision beyond the ten-thousandths place, it doesn’t really matter.
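
To put a number on "really, really close": the entire error in the 19.8 + 0.1 example is on the order of 10^-15, far below anything a temperature sensor can resolve. A quick check (plain Java again, class name just for the example):

    public class ErrorSize {
        public static void main(String[] args) {
            // The gap between the computed sum and the literal 19.9 is tiny.
            System.out.println(Math.abs((19.8 + 0.1) - 19.9));  // prints a value around 3.55e-15
        }
    }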

So, to provide a tl;dr, there is nothing wrong. This is just how computers work.