I have a sump pump, and I’m monitoring the level of the water in the pit.
The sump pump triggers when the level hits approximately 200 mm, and then runs until the level is approximately 100 mm. Due to sensor noise, the reported water level sometimes drops by a millimeter or two between readings.
I would like to monitor the rate of water level increase.
The derivative integration would be perfect… except for the resets every time the pump runs.
Also, the sensor calibration may drift over time, so just creating an offset sensor and treating it like a rain gauge seems fraught.
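To make it concrete, here's roughly the filtering I'm picturing, as a standalone Python sketch rather than anything tied to a specific integration (the class name and the 20 mm reset threshold are just placeholders I picked):

```python
# A minimal sketch of the logic I have in mind -- plain Python, not tied to any
# particular platform. The class name and thresholds are placeholders.
from dataclasses import dataclass

RESET_DROP_MM = 20.0   # a drop this big between readings means the pump ran


@dataclass
class FillRateTracker:
    """Estimate the fill rate (mm/min), ignoring pump-out cycles and noise dips."""
    last_level: float | None = None
    last_time: float | None = None

    def update(self, level_mm: float, time_s: float) -> float | None:
        """Feed one reading; return the rate for this interval, or None if unusable."""
        if self.last_level is None:
            self.last_level, self.last_time = level_mm, time_s
            return None

        delta = level_mm - self.last_level
        dt_min = (time_s - self.last_time) / 60.0
        self.last_level, self.last_time = level_mm, time_s

        if dt_min <= 0:
            return None                 # duplicate or out-of-order timestamp
        if delta <= -RESET_DROP_MM:
            return None                 # pump cycle: skip this interval entirely
        if delta < 0:
            delta = 0.0                 # millimetre-scale dip: treat as sensor noise
        return delta / dt_min


# Readings one minute apart; the 152 -> 101 drop is a pump cycle and is skipped.
tracker = FillRateTracker()
for t, level in [(0, 150.0), (60, 153.0), (120, 152.0), (180, 101.0), (240, 103.0)]:
    print(tracker.update(level, t))     # None, 3.0, 0.0, None, 2.0
```

The intent is that pump-out intervals get skipped outright and small noise dips count as zero, so only genuine inflow contributes to the rate, and nothing depends on the sensor's absolute calibration.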
What is the best way to set this up?