I am calculating the difference between two electricity power meters, so I created a helper value that holds this difference.
The two values change about every second.
p1: the total power of the house
p2: the power of the car charger
pH(elper): the difference p1-p2; when this is negative I know the solar panels are helping to charge the car.

A problem I have is that sometimes this helper gives incorrect values because the two values don't change at the same moment.

I would call the value incorrect because the calculated difference is far off from the actual real-world difference.

Example below; @timeN is some moment where either p1 or p2 changes and pH is updated, and time.n+1 is after time.n.

@time1: p1=2kW, p2=4kW and pH is calculated to -2kW.
@time2: p2 drops to 0kW, but the value of p1 hasn't changed yet, so p1=2kW, p2=0kW and pH is calculated to +2kW.
=> this +2kW is a fake value, as in reality the difference between the two powers never got to +2kW; it actually should be -2kW.
@time3: p1=-2kW, p2=0kW and pH is calculated to -2kW, which is the correct value…
So I am confronted with a fake value of +2kW that never really happened; the value of the difference never got above 0kW.
Note that the first change noticed by HA could also be on p1: @time2 p1=-2kW, p2=4kW and pH is calculated to -6kW, which is also 'very' incorrect.
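To make the failure mode concrete, here is a small sketch (plain Python, all names are mine) that replays the example above: because p1 and p2 update at different moments and pH is recalculated on every single update, the naive difference briefly shows +2 kW even though the real difference never left -2 kW.

```python
# Replay of the example: each event is (which_meter, new_value_kw).
# pH is recalculated on every single update, like a template helper does.
events = [("p1", 2), ("p2", 4),   # time1: pH = 2 - 4 = -2 kW (correct)
          ("p2", 0),              # time2: p1 is still stale -> pH = +2 kW (fake)
          ("p1", -2)]             # time3: pH = -2 - 0 = -2 kW (correct again)

state = {"p1": 0, "p2": 0}
history = []
for meter, value in events:
    state[meter] = value
    history.append(state["p1"] - state["p2"])

print(history)  # [2, -2, 2, -2] -> the +2 at index 2 never happened in reality
```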

My question: is there a way to get rid of these fake values?

Or do i just have to live with them?

A solution, for example, is to apply logic on the helper so that it only reacts when the helper stays below or above some value for a certain time. But the incorrect values still end up in a graph.

Note: when thinking about solutions I always hit the problem that, when doing the calculation, you are just never sure whether the two values are 'in sync'. In this situation you know changes in p2 change p1, but not that changes in p1 imply changes in p2.
So when delta p2 is big, there should be a big delta in p1 (at least the chances are very high). So with this idea there might be some way to drop certain calculated values. But that would also mean I have to hold back the calculated value of pH for a short time until I know it is a correct value. That extra 2-3 seconds is maybe not a problem.
And for averaging over some time I don't see a valid solution, as the wrong value can be way off and have a big impact on the average.
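The hold-back idea can be sketched as a small "settle" filter (plain Python, all names are mine, not a Home Assistant API): when one input jumps by a lot, hold the published difference until the other input has also updated, or until a short timeout expires.

```python
class SettleFilter:
    """Only publish p1-p2 once both inputs have updated since the last
    big change, or after `timeout` seconds. Purely illustrative sketch."""
    def __init__(self, timeout=3.0, big_delta=1.0):
        self.timeout = timeout
        self.big_delta = big_delta
        self.values = {"p1": None, "p2": None}
        self.pending_since = None   # time of the big change we are waiting out
        self.updated = set()        # which meters updated since that change
        self.published = None

    def update(self, meter, value, now):
        old = self.values[meter]
        self.values[meter] = value
        if None in self.values.values():
            return self.published          # not enough data yet
        if old is not None and abs(value - old) >= self.big_delta \
                and self.pending_since is None:
            # A big jump on one meter: hold back until the other reacts too.
            self.pending_since = now
            self.updated = {meter}
        elif self.pending_since is not None:
            self.updated.add(meter)
        settled = (self.pending_since is None
                   or len(self.updated) == 2
                   or now - self.pending_since >= self.timeout)
        if settled:
            self.pending_since = None
            self.published = self.values["p1"] - self.values["p2"]
        return self.published

f = SettleFilter()
f.update("p1", 2, 0.0)
f.update("p2", 4, 0.1)             # publishes -2
print(f.update("p2", 0, 5.0))      # big drop: held back, still -2 (no fake +2)
print(f.update("p1", -2, 5.8))     # other meter caught up: publishes -2
```

The price is exactly the delay mentioned above: the output lags by up to the timeout when only one meter changes.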

Note: even when I can force a reading, I think there is still a risk of the issue.
Let me see: when I start the calculation from a change on P2 and then read P1, knowing P2 influences P1, the impact of P2 is added to P1.
There is just the risk that between starting the P2 update and reading P1, P1 changed because in the meantime P2 changed again…
As P1 and P2 give values about every second, I doubt there will be really a lot of difference in the result. I would be surprised if forcing a read on the integration can do sub-second round trips. But this is an assumption…
To really solve the problem, the round trips should be in the ms range. Maybe, as these devices have no low-power requirement, they can actually react quickly. I am also not sure if this integration allows forcing a read. I need to read up.

The problem most likely also lies in the fact that the (k)Watt readings are sampled, and not averaged between samples. Combine that with fluctuating values from solar and home use, and you get weird readings. You could flatten those out a bit by averaging over time. But that would require the errors to be uniform, which they probably aren't. So unless you can get a higher sample rate, it will probably never be great.
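A minimal sketch of what "averaging over time" could look like (plain Python, my own names): a time-weighted average holds each sample for the duration it was valid, so a stale spike that only lasts a second barely moves the result.

```python
def time_weighted_average(samples):
    """samples: list of (timestamp_s, value). Average each value weighted
    by how long it was held; the last open-ended interval is ignored."""
    total = 0.0
    duration = 0.0
    for (t0, v), (t1, _) in zip(samples, samples[1:]):
        total += v * (t1 - t0)
        duration += t1 - t0
    return total / duration if duration else None

# The fake +2 kW spike from the example only lasts ~1 s out of 18 s:
samples = [(0, -2), (8, 2), (9, -2), (18, -2)]
print(round(time_weighted_average(samples), 2))  # -1.78, close to the true -2
```

Note the average is still pulled away from the true -2 kW by the spike, which illustrates the concern that one wrong value can have a big impact on an average.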

Subtracting (k)Wh values, especially over time, is more likely to produce proper results (assuming they are calculated internally at high sample rates). You could try to take derivatives of those and average them to get a more reliable wattage output, but it will be quite a hassle.
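The kWh-derivative idea could be sketched like this (plain Python, my naming): take successive cumulative energy readings and divide the delta by the elapsed time to recover the average power over each interval.

```python
def power_from_energy(readings):
    """readings: list of (timestamp_s, energy_kwh) from a cumulative meter.
    Returns a list of (timestamp_s, average_power_kw) per interval."""
    out = []
    for (t0, e0), (t1, e1) in zip(readings, readings[1:]):
        hours = (t1 - t0) / 3600.0
        out.append((t1, (e1 - e0) / hours))
    return out

# Meter advances 0.5 kWh in 15 min -> 2 kW average power over that interval.
readings = [(0, 100.0), (900, 100.5)]
print(power_from_energy(readings))  # [(900, 2.0)]
```

Because the meter integrates internally, this average is immune to the sampling spikes, at the cost of only seeing power smoothed over each interval.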

You'd still see a minor effect from them updating at different times, so some averaging over a few previous samples should help. If you don't, you'll likely see a zigzag line in the difference, because the add and the subtract happen at different times. What @PeteRage suggested would eliminate it fully.

For the individual entities the values seem very correct and consistent. Adding averages would, I would say, just make an individual entity 'wrong'.

At the moment I would say there is at most 1 inconsistent value in the subtraction value. Adding averages would make the subtraction value less accurate over some x seconds, versus potentially 1 incorrect value.

If the wattages are constant between samples, then indeed, the problem isn't worth the trouble. For me, that certainly isn't the case.

And I was thinking of a rolling average over the difference, not the individual sensors. If it is only one peak in multiple samples, maybe an outlier filter might help?
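An outlier filter over the difference can be as simple as a rolling median (plain Python sketch, names mine): a single fake sample in a window of several samples can never become the output.

```python
from collections import deque
from statistics import median

class RollingMedian:
    """Emit the median of the last `n` difference samples; one outlier
    inside the window cannot reach the output."""
    def __init__(self, n=5):
        self.window = deque(maxlen=n)

    def update(self, value):
        self.window.append(value)
        return median(self.window)

f = RollingMedian(n=5)
print([f.update(v) for v in [-2, -2, 2, -2, -2]])  # the +2 spike never appears
```

The trade-off matches the hold-back idea discussed earlier: the median of n samples lags the real signal by roughly n/2 update intervals.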