Updating statistics from an integration and calculating max, min, mean values

Hello

Background:
I maintain a couple of integrations and am updating one of them to use long-term statistics on a sensor instead of keeping the data as attributes. The data was kept in attributes because the data fetched from the API endpoint is always at least 24 hours old, so a sensor state representing the "current" value makes no sense: the timestamp would always be wrong.

So I have this code snippet:

            # Relevant imports (not shown in this excerpt), roughly:
            #   from homeassistant.components.recorder import DOMAIN as RECORDER_DOMAIN
            #   from homeassistant.components.recorder.models import StatisticData, StatisticMetaData
            #   from homeassistant.components.recorder.statistics import async_import_statistics
            #   from homeassistant.util import dt as dt_util

            # _sum, _max, _min, _mean are all initialised with the values of the last imported data point.

            # Array of statistics points
            statistics = []

            # Populate statistics array
            last_value = data[meter_type][0]["Value"]
            for val in data[meter_type]:
                # Parse the start time and localise it to the Home Assistant time zone.
                from_time = dt_util.parse_datetime(val["DateFrom"]).replace(
                    tzinfo=dt_util.get_time_zone(self.hass.config.time_zone)
                )
                _sum += val["Value"]
                # min/max are taken over the current and previous value only,
                # and _mean is a running average of the values seen so far.
                _max = max(last_value, val["Value"])
                _min = min(last_value, val["Value"])
                _mean = (_mean + val["Value"]) / 2
                last_value = val["Value"]

                statistics.append(
                    StatisticData(
                        start=from_time,
                        state=val["Value"],
                        sum=_sum,
                        min=_min,
                        max=_max,
                        mean=_mean)
                )

            metadata = StatisticMetaData(
                has_mean=True,
                has_sum=True,
                name=None,
                source=RECORDER_DOMAIN,
                statistic_id="sensor.novafos_water_statistics",
                unit_of_measurement=unit,
            )

            async_import_statistics(self.hass, metadata, statistics)
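
Side note: the _mean update above is an exponentially weighted average rather than the true arithmetic mean of the imported points. A count-based running mean would look roughly like this (minimal sketch; the names and sample values are just for illustration, not from the integration):

    # Count-based running aggregates (illustrative only, not part of the integration).
    points = [4.2, 3.1, 5.0, 2.8]  # made-up hourly readings

    _sum = 0.0
    _min = float("inf")
    _max = float("-inf")
    _mean = 0.0
    _count = 0

    for value in points:
        _sum += value
        _min = min(_min, value)
        _max = max(_max, value)
        _count += 1
        # Incremental arithmetic mean over everything seen so far
        _mean += (value - _mean) / _count

    print(_sum, _min, _max, _mean)  # sum≈15.1, min=2.8, max=5.0, mean≈3.775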

What I see is that the calculation of _sum works as intended: a statistics graph using “change” sums up nicely over the specified interval whether I graph over hours (raw data), days, weeks or months.
Turning on min, max and mean also looks right in the hourly view, but when aggregating over days, weeks or months these values are not recalculated over the underlying points, say the 24 hourly data points behind each bar of a day-by-day graph.
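
In other words, for the daily view I would expect each day bucket to be recomputed from the 24 hourly rows underneath it, something like this (purely illustrative Python, not Home Assistant's actual aggregation code; the row structure is made up):

    # What I expect a daily bucket to be derived from (illustrative only).
    hourly_rows = [
        {"min": 1.0, "max": 2.0, "mean": 1.5},
        {"min": 0.5, "max": 3.0, "mean": 1.8},
        # ... 22 more hourly rows for that day
    ]

    day_min = min(row["min"] for row in hourly_rows)
    day_max = max(row["max"] for row in hourly_rows)
    day_mean = sum(row["mean"] for row in hourly_rows) / len(hourly_rows)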

This is the hourly view which looks right:
[screenshot: hourly statistics view]

Going to the daily view, the min, max and mean are still the levels from the hourly data:
[screenshot: daily statistics view]
The tall bar is the change in this time interval, but min, max and mean are still the ones from the hourly view.

Contrast this with the temperature sensor on my Hue motion sensor, which logs data every 5 minutes:
[screenshot: Hue temperature sensor, 5-minute data]

And when going to a higher aggregation level, it aggregates rather nicely:
[screenshot: Hue temperature sensor, aggregated view]

So I have a pretty good idea this is related to the aggregation happening on the “state” of a sensor rather than on “change”. Am I attempting something completely impossible, or am I just hitting a fringe case where metering data that is not current with real time does not match up with how statistics are aggregated?