I have a statistics sensor that computes the average of another sensor's value over the last two hours, and I use that average to drive an alert. What I'm seeing is an occasional random spike in the average, almost as if a very high value had been averaged in.
This is strange for two reasons:

- I can't see any high values in the history view.
- The value I'm averaging is the set temperature of my thermostat, which is never set to anything outside the 70s.
For those who are curious about why I might do such a strange thing: my furnace intake has a habit of freezing in the extremely cold temperatures we get once or twice a year. A large differential between the thermostat's set value and the actual temperature indicates a potential problem. Averaging the set temperature prevents the alert from being triggered by someone simply adjusting the thermostat, because the average takes two hours to fully reflect the new setpoint. That gives the furnace enough time to keep up.
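For reference, the setup described above can be sketched roughly like this. This is only an illustrative fragment, and the entity IDs (`sensor.thermostat_setpoint`, `sensor.indoor_temperature`) are placeholders for whatever my actual thermostat exposes:

```yaml
sensor:
  # Two-hour rolling average of the thermostat's set temperature
  - platform: statistics
    name: "Setpoint 2h Average"
    entity_id: sensor.thermostat_setpoint
    state_characteristic: mean
    max_age:
      hours: 2

template:
  - binary_sensor:
      # Fires when the actual temperature lags far behind the averaged setpoint,
      # which would suggest the furnace can't keep up (e.g. frozen intake)
      - name: "Furnace Struggling"
        state: >
          {{ (states('sensor.setpoint_2h_average') | float(0))
             - (states('sensor.indoor_temperature') | float(0)) > 5 }}
```

The 5-degree threshold is just an example; the point is that the comparison uses the smoothed setpoint rather than the raw one.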