I found another strange thing last night: out of curiosity, I went ahead and triggered a manual calculation around 9:16PM for all the zones. If I am not mistaken, this prunes all previous weather data, which is expected.
At that point the bucket size went “up” to -9.79 (i.e. into negative territory), which would indicate that, according to the system, the zone needed 9.79mm of watering. All fine up to this point; this matches my expectations.
Then, from 9:27PM until 9:46PM, we had a storm, and precipitation reached 9.3mm. Mind you, this was after I triggered the manual calculation for the zone, so I expected the bucket to “fill” significantly. Yet the automatic calculation, which runs at 11:55PM every night, only decreased the watering time by roughly a minute.
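To make the numbers concrete, here is a minimal sketch of the bucket arithmetic as I understand it (my mental model, not the integration’s actual code). The 17mm/h throughput comes from the nozzle spec below; the ~9mm ET value is purely an assumption I picked, because it is the only way I can reproduce the ~1 minute change I observed:

```python
# Bucket-style daily update, all values in mm.
# A negative bucket means a watering deficit.
THROUGHPUT_MM_PER_H = 17.0  # nozzle spec, see below

def daily_update(bucket: float, rain_mm: float, et_mm: float) -> float:
    # Rain fills the bucket, evapotranspiration drains it.
    return bucket + rain_mm - et_mm

def runtime_minutes(bucket: float) -> float:
    # Convert the deficit back into sprinkler runtime.
    deficit = max(0.0, -bucket)
    return deficit / THROUGHPUT_MM_PER_H * 60

bucket = -9.79                      # right after the manual calculation
print(runtime_minutes(bucket))      # ≈ 34.6 min
bucket = daily_update(bucket, rain_mm=9.3, et_mm=9.0)  # et_mm is a guess
print(runtime_minutes(bucket))      # ≈ 33.5 min, i.e. only ~1 min less
```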
Given the parameters of the zone, I just don’t get how the setup arrives at this conclusion. I understand that, due to the ET calculation, the bucket does not deduct the measured precipitation one-to-one, but I would still expect a nearly 10mm rainfall to have a significant impact on the bucket size, especially when it falls at night, when ET is minimal, and so close to the daily calculation/reset. But let’s say the ET calculation does not take the time factor into account (i.e. when the rainfall occurred) and assumes it fell during the hottest period of the day, or perhaps it simply calculates for the whole day without considering specific points in time at all. For reference, I would have to run my zone for about 40 minutes to get the equivalent water output of a 10mm rainfall.
According to the manufacturer, the nozzles are calibrated to put out 17mm/h when arranged in a square pattern with head-to-head coverage, and I configured my zones accordingly.
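As a quick sanity check on that figure, here is the standard depth-rate arithmetic. The per-head flow and spacing below are hypothetical values I picked to land near 17mm/h, not my actual hardware numbers:

```python
flow_l_per_min = 4.5   # per-head flow (hypothetical)
spacing_m = 4.0        # head-to-head square spacing (hypothetical)

# 1 L/m² equals 1 mm of water depth, so rate (mm/h) = flow (L/h) / area (m²).
rate_mm_per_h = flow_l_per_min * 60 / spacing_m ** 2
print(rate_mm_per_h)            # ≈ 16.9, close to the 17mm/h spec

# Minutes of runtime needed to put down 10mm at that rate:
print(10 / rate_mm_per_h * 60)  # ≈ 35.6 min, in the ballpark of my ~40 min
```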
I could make a completely unrealistic argument that nearly all of the 9.3mm rainfall was factored into the ET calculation, and that the integration therefore considered the majority of it to have evaporated. That would mean the integration “thinks” nearly 10mm of water evaporates on a hot day with my setup, and that this amount is treated as a loss, since green vegetation cannot utilize that water.
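For what it’s worth, a daily ET in that range would be aggressive but not absurd for peak summer. Here is a rough estimate using the Hargreaves equation, a simplified reference-ET formula; the temperatures and radiation value below are hypothetical stand-ins for my conditions, and I have no idea whether the integration uses this method:

```python
import math

def hargreaves_et0(t_mean: float, t_max: float, t_min: float, ra: float) -> float:
    # Hargreaves reference evapotranspiration in mm/day.
    # ra is extraterrestrial radiation expressed as mm/day of evaporation.
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Hypothetical values for a hot mid-summer day (upper-mid 30s °C):
print(hargreaves_et0(t_mean=29.5, t_max=37.0, t_min=22.0, ra=17.0))
# ≈ 7.2 mm/day, so high single digits are plausible in this heat,
# though a full 9-10 mm/day would be on the high end.
```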
So, running with the above logic and taking that standpoint as the baseline, I don’t understand why my daily watering time only increases by ca. 20 minutes, which is roughly equivalent to putting out 5-6mm of water. Weather conditions have been basically the same for the past few weeks: we had 0mm of rainfall until last night’s storm, with temperatures in the upper-mid 30s.
Do you see my dilemma? Either ~9mm of water is considered a “system loss” because it completely evaporates per the ET calculation, in which case the daily watering time should significantly exceed the equivalent of that amount (whereas the calculation only accounts for about half of it on a daily basis; check the history graph of the watering time), or ~9mm is in fact considered valid “water output” of which most does not evaporate, in which case I don’t get why last night’s 9.3mm of rain had essentially no impact on the calculation.
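Putting rough numbers on both branches of the dilemma (same 17mm/h rate as above; the 9.0mm “rain kept” figure in the second branch is an assumption):

```python
RATE_MM_PER_H = 17.0

def minutes_for(mm: float) -> float:
    return mm / RATE_MM_PER_H * 60

# Branch 1: the ~9.3mm of rain fully evaporates, i.e. daily ET ≈ 9.3mm.
# Replacing that daily loss alone would need:
print(minutes_for(9.3))  # ≈ 32.8 min/day, yet I only see ~20 min/day added.

# Branch 2: most of the rain reaches the bucket, say 9.0 of the 9.3mm.
# The next run should then shorten by about:
print(minutes_for(9.0))  # ≈ 31.8 min, yet it shortened by only ~1 min.
```

Neither branch matches what I observed, which is exactly my confusion.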
I apologize for raising so many questions; my intention is not to be difficult, I just want to understand clearly how the integration works so I can set my expectations and decide where and how I can rely on it.