I'm testing a blueprint to control my thermostat, and I'm finding that the calculations in the blueprint sometimes display strange results. What I'm finding is that the calculation is inserting additional decimals.
This is my calculation:
heat_temp_1: "{{ (target_temp_1 - heat_tolerance) - actual_temp_1 }}"
And here are the values I'm passing into the calculation for a particular run:
heat_tolerance: 1,
target_temp_1: 22.5,
actual_temp_1: 21.1,
However, this is the result:
heat_temp_1: 0.3999999999999986,
I expect the result to be 0.4, not 0.3999999999999986.
The weird thing is that it always seems to be off by 0.0000000000000014 or 0.0000000000000007, either positive or negative.
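I can reproduce the same numbers in plain Python (which is what Jinja templates evaluate on), so it looks like ordinary binary floating-point representation rather than anything in the blueprint itself. 21.1 has no exact binary representation, so the subtraction lands a hair below 0.4:

```python
# Reproducing the discrepancy with the exact values from the run above.
heat_tolerance = 1
target_temp_1 = 22.5
actual_temp_1 = 21.1

result = (target_temp_1 - heat_tolerance) - actual_temp_1
print(result)  # 0.3999999999999986, matching the blueprint output
```

The size of the error depends on which values are involved, which would explain why the cooling calculation is off by a different tiny amount.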
A similar issue is happening in a second calculation for cooling, but that one seems off by 0.0000000000000001 instead.
This is causing me issues when the target and actual temps differ by exactly the heat tolerance value, as it executes the wrong command. It's also just weird. I'm not sure where the extra digits are coming from, since the actual temp reading, which is the only value coming from a sensor, seems to be reported to one decimal place only.
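One way to sidestep the wrong-command case (a sketch, not anything blueprint-specific): round the result to the sensor's resolution before comparing, which Jinja supports directly with its built-in `round` filter, e.g. `{{ ((target_temp_1 - heat_tolerance) - actual_temp_1) | round(1) }}`. In plain Python terms:

```python
# Rounding to one decimal place (the sensor's reported resolution)
# before comparing removes the floating-point artifact.
target_temp_1 = 22.5
heat_tolerance = 1
actual_temp_1 = 21.1

heat_temp_1 = round((target_temp_1 - heat_tolerance) - actual_temp_1, 1)
print(heat_temp_1)  # 0.4
```

With the rounded value, the comparison against the tolerance behaves the same on every run instead of flipping depending on which side of 0.4 the raw float lands.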
Here is some log output that shows the problem:
It's not a major issue, since most of the time it just accelerates the action that was going to happen on the next run, or delays it to the next run, but something feels off.