I’ve spent many hours trying to figure out how to get proper water measurements using the examples others have posted, including the one you linked to. I’m still not certain how it all works, but here is the code for my sensor, which outputs one pulse for every 0.0748 gallons (13.36898395721925 pulses per gallon). It has been running for a few weeks now and is very accurate.
The Energy Dashboard uses sensor.water_total as its source; that is the sensor with the total_increasing state class.
The sensor.water_rate entity is the current rate of water flow, but if you watch it while running a faucet the value fluctuates wildly, so I’m not sure it provides much value when used directly.
To get the hourly, daily, monthly, etc. usage, I created a single Riemann sum entity in sensors.yaml which uses sensor.water_rate as its source. The Riemann sum seems to smooth out the fluctuating values from sensor.water_rate:
```yaml
- platform: integration
  name: Water RSI
  source: sensor.water_rate
```
Once the sensor.water_rsi entity was created, I used it as the source for each of the hourly, daily, weekly, etc. sensors. I didn’t use the YAML method for creating these sensors as the Brinkman article shows; instead I used Helpers (Settings → Devices & Services → Helpers → Create Helper → Utility Meter), creating one meter per reset cycle, each with the same sensor.water_rsi as its source.
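For reference, the YAML equivalent of those helpers would look something like the sketch below (I created mine through the UI, so this is untested here; the meter names are placeholders):

```yaml
# Each utility meter tracks sensor.water_rsi with a different reset cycle.
utility_meter:
  water_hourly:
    source: sensor.water_rsi
    cycle: hourly
  water_daily:
    source: sensor.water_rsi
    cycle: daily
  water_weekly:
    source: sensor.water_rsi
    cycle: weekly
  water_monthly:
    source: sensor.water_rsi
    cycle: monthly
```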
And here is the ESPHome configuration for the two sensors (the GPIO pin is a placeholder; use whichever pin your meter’s pulse output is wired to):

```yaml
sensor:
  - platform: pulse_counter
    pin: GPIO12  # placeholder pin
    name: "Water Rate"
    unit_of_measurement: "gal/hr"
    filters:
      - debounce: 1.0s
      # pulse_counter reports pulses per minute; convert to gallons per hour
      - lambda: return (x / 13.36898395721925) * 60;
    total:
      name: "Water Total"
      unit_of_measurement: "gal"
      device_class: water
      state_class: total_increasing
      filters:
        # total pulses -> gallons
        - multiply: 0.0748
```
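To double-check the arithmetic behind those two constants and the lambda, here is a quick sanity check (the function name is just for illustration):

```python
# Sanity-check the pulse math used in the config above.
PULSES_PER_GALLON = 13.36898395721925
GALLONS_PER_PULSE = 0.0748

# The two constants are reciprocals of each other.
assert abs(1 / GALLONS_PER_PULSE - PULSES_PER_GALLON) < 1e-9

def water_rate_gph(pulses_per_minute: float) -> float:
    # pulse_counter reports pulses per minute: dividing by pulses-per-gallon
    # gives gallons per minute, and multiplying by 60 gives gallons per hour.
    return (pulses_per_minute / PULSES_PER_GALLON) * 60

# One pulse per minute is 0.0748 gal/min, i.e. about 4.488 gal/hr.
print(round(water_rate_gph(1.0), 3))
```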