An image says more than 1000 words.
And before you ask, the precision of that sensor is set to 2 decimals.
There is an issue on GitHub.
(Issue opened 07:31 AM, 09 Dec 2023 UTC)
### Checklist
- [X] I have updated to the latest available Home Assistant version.
- [X] I have cleared the cache of my browser.
- [X] I have tried a different browser to see if it is related to my browser.
### Describe the issue you are experiencing
There is a sensor with some history (`hours_to_show: 1`):
![image](https://github.com/home-assistant/frontend/assets/71872483/76555044-cfac-4f3f-b170-9811da6f3754)
At ~09:15 I added `state_class: measurement` to this sensor.
Then I restarted HA; it started at 09:22.
It is now ~10:21 (less than 1 hour later).
Here is a statistics graph for this sensor with `5minute` period:
![image](https://github.com/home-assistant/frontend/assets/71872483/7c5ddad3-bbb6-4fe7-a0e1-c08bf793a824)
Here is with `hour` period:
![image](https://github.com/home-assistant/frontend/assets/71872483/470c20a5-f914-46d2-a37d-451feaf5014b)
The upper & lower bounds look very weird.
I guessed it happened because less than 1 hour had passed since HA started,
but even at ~10:30 the card looks the same.
Note: no accuracy settings were defined for this sensor:
![image](https://github.com/home-assistant/frontend/assets/71872483/a76300ea-22d2-4830-adcc-599e2477acf9)
### Describe the behavior you expected
Accuracy should not be so weird.
### Steps to reproduce the issue
as above
### What version of Home Assistant Core has the issue?
2023.12.0
### What was the last working version of Home Assistant Core?
_No response_
### In which browser are you experiencing the issue with?
Chrome 119.0.6045.200
### Which operating system are you using to run this browser?
Win10x64
### State of relevant entities
_No response_
### Problem-relevant frontend configuration
_No response_
### Javascript errors shown in your browser console/inspector
_No response_
### Additional information
_No response_
I suggest you post screenshots in English next time.
Thanks, although I'm not sure it's the same issue. I have had this sensor for a long time already, and it's not a statistics sensor, but a normal one from an ESPHome device. It just randomly shows a weird rounding error from time to time.
If by "statistics sensor" you mean a sensor created by the Statistics integration, this issue is not about it.
The issue is about entities with LTS (long-term statistics), which I guess could be your case (hard to say, since you did not post screenshots in English).
finity, December 19, 2024, 3:41pm:
I know there's a technical term for the phenomenon, but I can't remember it off the top of my head. If you plug this into your preferred search engine, it will explain what is happening:
"numerical rounding error python"
In short, it's not HA. It's Python.
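For example, a minimal Python demonstration of that kind of rounding error (the numbers are arbitrary, not values from the sensor in this thread):

```python
# Most decimal fractions have no exact binary representation,
# so ordinary float arithmetic picks up tiny errors.
print(0.1 + 0.2)          # 0.30000000000000004, not 0.3
print(0.1 + 0.2 == 0.3)   # False

# The same effect shows up when aggregating readings that each
# look "clean" on their own (arbitrary example values):
readings = [0.1] * 10
print(sum(readings))      # 0.9999999999999999, not 1.0
```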
No idea about Python; I have worked with C since '96 (plus true "rocket science"). So the whole issue with rounding seems rather strange to me.
petro (Petro), December 19, 2024, 4:03pm:
It's every language. All of them do this.
Machine epsilon or machine precision is an upper bound on the relative approximation error due to rounding in floating point number systems. This value characterizes computer arithmetic in the field of numerical analysis, and by extension in the subject of computational science. The quantity is also called macheps and it has the symbol Greek epsilon ε.
There are two prevailing definitions, denoted here as rounding machine ...
It's usually hidden from users (i.e., the UI should handle it properly).
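As a rough Python sketch of what machine epsilon means in practice (plain IEEE 754 doubles, nothing Home Assistant specific):

```python
import math
import sys

# Machine epsilon for IEEE 754 double precision: the gap between 1.0
# and the next representable float, an upper bound on the relative
# rounding error of a single operation.
eps = sys.float_info.epsilon
print(eps)                    # 2.220446049250313e-16

# Anything much smaller than eps simply disappears next to 1.0:
print(1.0 + eps / 4 == 1.0)   # True
print(1.0 + eps == 1.0)       # False

# So float comparisons need a tolerance rather than ==:
print(math.isclose(0.1 + 0.2, 0.3))   # True
```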
That is what I am talking about. Float epsilon can be handled.
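A minimal sketch of what "handled" could look like on the display side, assuming the UI simply formats values to the sensor's declared precision (the helper below is hypothetical, for illustration only, not the actual frontend code):

```python
def format_state(value: float, precision: int = 2) -> str:
    """Render a float for display, hiding binary representation noise.

    Hypothetical helper for illustration; the real frontend logic differs.
    """
    return f"{value:.{precision}f}"

raw = 0.1 + 0.2
print(raw)                  # 0.30000000000000004 -- the raw float
print(format_state(raw))    # 0.30 -- what a UI would show at 2 decimals
```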
finity, December 20, 2024, 5:12am:
Machine Epsilon… that's what I couldn't remember.