This is probably going to be some confusion about default units.
Persistence stores numbers without units at all. When data is retrieved later, assumptions have to be made about the proper unit.
So far as I know, there’s no conversion on storing; it just strips the unit and stores the number, e.g. an Item state of “4.5 mm” is stored as “4.5”.
The first thing, perhaps, is to look at the Item in the API Explorer and see what its real state is, never mind what the UI displays (because that can be transformed).
Next, use the API Explorer to view the actual persisted values and verify what is stored.
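For example, with a hypothetical Item `MyDistance` and the rrd4j service (substitute your own Item name and serviceId), these two REST calls show the live state and the raw stored datapoints:

```
GET /rest/items/MyDistance
GET /rest/persistence/items/MyDistance?serviceId=rrd4j
```

The first returns the Item’s current state string; the second returns JSON with a `data` array of time/state pairs. If those stored state values are bare numbers, the unit was stripped at storage time, as described above.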
On retrieval, persistence must try to reconstruct the appropriate unit. There are a number of places it can look for a unit.
Each individual Item may have a default unit assigned in its ‘pattern’ (state description) metadata. I think this takes priority, if it is present.
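In an .items file that can look like this (Item name hypothetical; the unit embedded in the pattern, “mm” here, is the candidate default):

```
// The "[%.1f mm]" part of the label is the state description pattern, unit included.
// The same pattern can also be set as stateDescription metadata in the UI.
Number:Length MyDistance "Distance [%.1f mm]"
```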
Each type of Item may have a system default unit. Not all types do, but I believe Number:Length does; it’s probably “m” for non-USA folks.
It could look to see if the current Item state has any unit, and use that, but I don’t think it does.
The upshot is that the unit assumed on retrieval might not be the same as the unit the value had when it was stored.
You might run a test in a rule to see what unit e.g. historicState() returns, to find out what persistence is thinking here.
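A minimal sketch in Rules DSL (the Item name, trigger, and lookback period are assumptions; historicState() can also take a serviceId as a second argument if you have more than one persistence service):

```
rule "Check persisted unit"
when
    Item TestSwitch received command
then
    // What the Item itself reports right now
    logInfo("unitTest", "Current state: " + MyDistance.state.toString)
    // What persistence hands back for one hour ago - does it carry a unit?
    val hist = MyDistance.historicState(now.minusHours(1))
    if (hist !== null) {
        logInfo("unitTest", "Historic state: " + hist.state.toString)
    } else {
        logInfo("unitTest", "No historic state found")
    }
end
```

If the logged historic state comes back as a bare number, or with a different unit than the Item’s current state, that’s your mismatch.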
Meantime, charting is doing its own thing. Any of these factors might come into play, affecting both the charted data and the labels on the axis, and they might not match. But find out exactly what persistence is doing first.