I stumbled upon an interesting effect regarding utility meters. I use a (calculated) sensor which is constantly increasing.
I then created two utility meters from it: one with the parameter “net_consumption: true” and one without the “net_consumption” parameter. Both use the same sensor.
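For reference, this is roughly what my setup looks like (the sensor and meter names here are placeholders):

```yaml
utility_meter:
  # Plain meter: assumes the source only ever increases
  energy_plain:
    source: sensor.my_calculated_energy
    cycle: daily
  # Net meter: decreases of the source are counted as well
  energy_net:
    source: sensor.my_calculated_energy
    cycle: daily
    net_consumption: true
```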
If I understand the purpose of utility meters correctly, a utility meter sums the differences between consecutive readings of its source sensor. So, as long as that sensor is constantly increasing, it should make no difference whether I set the “net_consumption” parameter or not. But it does!
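To illustrate my expectation with made-up numbers: if the sensor goes 10.0 → 10.2 → 10.5 kWh, both meters should add 0.2 + 0.3 = 0.5 kWh.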
After a while, the utility meter with “net_consumption: true” shows a higher value than the one without it.
Am I misunderstanding how the utility meter works?
I have done some further research and discovered that I made a mistake. My calculated sensor gets its value from an energy sensor, and energy sensors often do not increase strictly monotonically; their values can also decrease in very small steps. Analyzing my sensor showed exactly this behavior: it was not constantly increasing after all.
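To illustrate with made-up numbers: if the source reads 10.000 kWh, briefly dips to 9.998 kWh and then continues to 10.002 kWh, a meter with “net_consumption: true” sums every delta (−0.002 + 0.004 = +0.002 kWh), while, as far as I understand, a plain utility meter assumes a monotonically increasing source and treats a decrease as the source meter having been reset. Over many such small dips, the two meters therefore drift apart.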
I stumbled across the same issue and came to the same conclusion. I have some template sensors set up to calculate the real consumption of my home and heat pump (the calculation is required to add solar self-consumption to the actual meter reading). However, there are instances where the calculated sensor briefly decreases because the underlying meters have different refresh cycles.
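For illustration, a simplified sketch of such a calculated sensor (the entity names are made up):

```yaml
template:
  - sensor:
      - name: "Home real consumption"
        unit_of_measurement: "kWh"
        device_class: energy
        state_class: total
        # Sum of grid import and solar self-consumption. The two source
        # sensors refresh at different moments, so the sum can briefly dip.
        state: >
          {{ states('sensor.grid_import_energy') | float(0)
             + states('sensor.solar_self_consumption_energy') | float(0) }}
```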
I will now try enabling “net_consumption: true” on all affected HA utility meters to see if this helps.
If you use a trigger-based template sensor and only update it on a change of one of its source entities, your calculated sensor might become more consistent. Trigger off the sensor that updates the least often.
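Something like this, reusing the made-up entity names from the sketch above. The template now only recalculates when the slower sensor publishes a new value, so both readings belong to the same refresh cycle:

```yaml
template:
  - trigger:
      - platform: state
        entity_id: sensor.grid_import_energy  # the sensor that updates least often
    sensor:
      - name: "Home real consumption"
        unit_of_measurement: "kWh"
        device_class: energy
        state_class: total
        state: >
          {{ states('sensor.grid_import_energy') | float(0)
             + states('sensor.solar_self_consumption_energy') | float(0) }}
```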