Use case
I’m using a utility meter with my water sensor. The sensor reports values at irregular intervals (anywhere between 90 seconds and 20 minutes). It reports more often when water is flowing; if the last reading happened a while ago, it goes into a “lazy mode”.
I have two separate utility meters defined: one gathering usage over a 3-minute cycle and a second over a 15-minute cycle. Since the sensor reports more often than every 20 minutes most of the time, the 15-minute utility meter provides quite accurate values (deltas). However, when the sensor reports less often than every 3 minutes, the 3-minute utility meter gathers inflated readings:
Previous water sensor report time : 10:01:00 [value 10]
Last water sensor report time : 10:09:00 [value 25]
The delta between these two readings will be 15. However, the real water usage would be about 5 per 3-minute period. If we could obtain values every 3 minutes, it would be a sequence of deltas [5, 5, 5], and the last 3-minute utility meter cycle would report 5 instead of 15.
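For illustration, here is the same situation in plain numbers (a sketch, not utility_meter code; it assumes constant flow between the two readings):

```python
delta = 25 - 10        # 15 units between 10:01:00 and 10:09:00

# Current behaviour: the 3-minute utility meter books the whole delta into
# the single cycle containing the 10:09:00 reading; skipped cycles report 0.
print(delta)           # 15 in one 3-minute cycle

# Spread over the real 8-minute interval, the same water corresponds to:
print(delta / 8 * 3)   # 5.625 per 3-minute cycle (roughly the 5 above)
```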
Idea
The idea is to add an attribute to the utility meter that adds such a capability: normalizing the delta when the utility meter cycle is shorter than the sensor’s reporting interval. Of course there is a tradeoff: the value has to be interpolated, and probably the only rational solution is linear interpolation.
If the attribute is set, the utility meter would check whether the related sensor’s last_changed time is older than one cycle. If so, it performs the computation cycle_period / (related_sensor.last_changed - utility_meter.last_updated) * delta_value. Otherwise it provides the raw value without altering it.
In general, we calculate what fraction of the time between the last reading and the current reading our cycle covers.
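A minimal Python sketch of that check, assuming constant flow between readings; the name normalize_delta and its parameters are illustrative, not the actual utility_meter implementation:

```python
from datetime import datetime, timedelta

def normalize_delta(
    delta: float,
    sensor_last_changed: datetime,
    meter_last_updated: datetime,
    cycle_period: timedelta,
) -> float:
    """Scale a sensor delta down to the share covered by one meter cycle."""
    reading_interval = sensor_last_changed - meter_last_updated
    if reading_interval <= cycle_period:
        # Sensor reported within the current cycle: keep the raw delta.
        return delta
    # Otherwise linearly interpolate: the cycle covers only a fraction
    # of the time between the two readings.
    fraction = cycle_period.total_seconds() / reading_interval.total_seconds()
    return delta * fraction
```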
Previous water sensor report time : 10:01:00 [value 10]
Last water sensor report time : 10:09:00 [value 25]
Cycle : 3 min = 180s
10:09:00 - 10:01:00 = 8 min (480s)
percent = 180/480 = 37.5%
value interpolation = (25 - 10) × 37.5% = 5.625
reading without interpolation = 15
The real-world value would be 5.
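Checking the arithmetic with a couple of lines of Python (the date is arbitrary; this mirrors what the normalize_delta sketch above would return):

```python
from datetime import datetime, timedelta

previous_report = datetime(2024, 1, 1, 10, 1, 0)   # value 10
last_report = datetime(2024, 1, 1, 10, 9, 0)       # value 25
cycle = timedelta(minutes=3)

fraction = cycle / (last_report - previous_report)  # 180 s / 480 s = 0.375
print((25 - 10) * fraction)                         # 5.625, vs 15 raw
```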
Visualization
Sensor readings and utility meter deltas combined.