I have a 120 Ah 24 V battery, and I want an indicator in HA that tells me how many Ah I have charged or discharged at a specific time.
If I start the day with 10 Ah in the battery and my solar panels put in 50 Ah over 5 hours, then at hour 5 I should have 60 Ah (assuming I'm only putting power into the battery).
I have an EPEver solar charge controller that reports to my HA how many watts are going into the battery as a positive float, and a negative value when I'm discharging the battery through the "load" terminal on the charge controller. It can also report the amps sent to the battery, but that measurement is all over the place, so I don't trust it; I do trust the watts, because I compared them against an external meter and they're almost identical.
This is what I'm currently using, but I don't know how to set it to 100% or 0%, because if I do it from Developer Tools, it stays for a second and then reverts to whatever it was before.
I'm calculating the amps by taking the watts from the EPEver and dividing them by the voltage the EPEver reports on the battery leads.
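For reference, the template sensor doing that division looks roughly like this (a minimal sketch; the EPEver power and voltage entity IDs are examples, adjust them to your own sensors):

    sensor:
      - platform: template
        sensors:
          calculated_instant_battery_charging_current:
            friendly_name: "Calculated instant battery charging current"
            unit_of_measurement: "A"
            value_template: >
              {# current = power / voltage, guarding against a zero or unavailable voltage reading #}
              {% set p = states('sensor.epever_battery_power') | float(0) %}
              {% set v = states('sensor.epever_battery_voltage') | float(0) %}
              {{ (p / v) | round(2) if v > 0 else 0 }}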
Then I take the output of this sensor and feed it into the integration sensor:
    - platform: integration
      source: sensor.calculated_instant_battery_charging_current
      unit_time: h   # integrate over hours, so amps become Ah
      round: 3
      name: Battery_Ah_Balance
      method: right
Then it gives me the Ah the battery should have, but with a small problem:
It gives me a measurement, but I currently don't have any way to set it to 120 Ah or 0 Ah, so the number drifts completely away from the real value. I have an external device that counts the Ah flowing to/from the battery, but it has no way to export this information to HA.
So my questions are:
Is there a better way to calculate Ah on a battery?
Is there a device that can measure the Ah sent to/from the battery and report it to HA over Modbus/MQTT or some other method?
Is it possible to keep my formulas but add a way to calibrate the result?
Any ideas would be appreciated!
From what I understand from the utility meter sensor page, it needs a source feeding kWh. I don't have a source for that; instead I can feed it with amps.
Will it work with a measurement unit other than kWh?
I'll give this a try. I'm still confused because the utility meter sensor docs don't mention Ah anywhere, but if you say it works, I'll try it. Thanks
The source is in amps, because that's how many amps are going to/from the battery, whether it's going to my load (negative) or coming from my solar charge controller (positive).
The source is in amps, which is integrated to Ah, just like watts are integrated to Wh (or kW to kWh). Integrate power and you get energy. Integrate current and you get capacity.
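As a quick sanity check on the units (a generic worked example, not a reading from any controller): charge $= \int I\,dt$, so a constant $10\,\mathrm{A}$ flowing into the battery for $2\,\mathrm{h}$ integrates to $10\,\mathrm{A} \times 2\,\mathrm{h} = 20\,\mathrm{Ah}$, and the same current flowing out gives $-20\,\mathrm{Ah}$.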
This is the history I'm getting from the sensor, which (from my limited understanding) is incorrect based on my amps sensor: right now I'm drawing electricity from the battery, so the values are negative, but somehow the utility meter goes positive, then negative, then positive again.
OK, I think I finally understood that I have to use the integration sensor as the input for the utility_meter! (I'm not a native English speaker, so there was a bit of confusion due to language interpretation.)
So I did it and I have a number now!
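For anyone following along, the chain ends up looking roughly like this (these are my entity names; adjust to yours):

    utility_meter:
      energy_on_battery:
        source: sensor.battery_ah_balance

The integration sensor turns the amps into Ah, and the utility_meter on top of it gives you something you can reset and calibrate.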
I'll be testing and calibrating it and will report back so anybody else can be confident using this. I searched online for this and found nothing, incredible…
@tom_l I really appreciate your patience and help. I've seen you replying to lots of posts, you really know how to use Home Assistant!
PS: I'm still looking for a device that can measure this for me, because I'd like a point of comparison. I don't trust my solar controller's numbers, so if someone knows of a DC current measuring device with MQTT or Modbus, it would be really appreciated!
Then use Developer Tools > Services > utility_meter.calibrate to set the amount of Ah you have in your battery. You can determine this by charging your battery to full and taking your battery's rated capacity as the maximum. For example, I have a brand new 120 Ah battery, so when my battery is fully charged and 0 amps are going into it, I know I have 120 Ah of balance, so I call utility_meter.calibrate like this:
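Something along these lines (sensor.energy_on_battery is the entity my utility_meter created; use your own):

    service: utility_meter.calibrate
    target:
      entity_id: sensor.energy_on_battery
    data:
      value: "120"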
Then, as soon as you start discharging your battery, the utility_meter will decrease, and when charging it, it will increase.
Any suggestions on how to improve this would be totally appreciated.
Also, if someone finds a good device to count amp-hours with RS-485, please post a link!
Thanks
This works nicely, thanks.
I was wondering if there is a way to limit the maximum "energy_on_battery" value, so it can't go above 120 Ah.
The charge/discharge value is polled from the inverter and it's not very accurate, so at full charge it ends up above or below the maximum Ah. After a few weeks this stacks up to an unrealistic value (e.g. you get 187 Ah on a 120 Ah battery).
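One idea I've been toying with (untested; the entity name and the 120 Ah limit below are just from my setup) is a template sensor on top of the meter that clamps the reading between 0 and the battery capacity:

    sensor:
      - platform: template
        sensors:
          energy_on_battery_clamped:
            friendly_name: "Energy on battery (clamped)"
            unit_of_measurement: "Ah"
            value_template: >
              {# clamp the meter reading to the 0..120 Ah range #}
              {% set ah = states('sensor.energy_on_battery') | float(0) %}
              {{ [0, [ah, 120] | min] | max }}

That only hides the drift, though, it doesn't remove it, so the meter itself would still need recalibrating now and then.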
The PZEM-017 does not differentiate between charge and discharge current, unfortunately. I.e. the current value reported by the device is always positive, regardless of whether the battery is charging or discharging.
A workaround, however, could be to create a template sensor that converts the positive current to a negative current based on the voltage (generally the charge voltage is higher than the discharge voltage), e.g. if the voltage is > 13 V the battery is charging (positive current), otherwise it's discharging (negative current).
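A rough sketch of that idea (the PZEM entity IDs are placeholders, and the 13 V threshold would need tuning to your battery and charge profile):

    sensor:
      - platform: template
        sensors:
          battery_signed_current:
            friendly_name: "Battery signed current"
            unit_of_measurement: "A"
            value_template: >
              {# above ~13 V assume charging (positive current), otherwise discharging (negative) #}
              {% set i = states('sensor.pzem017_current') | float(0) %}
              {% set v = states('sensor.pzem017_voltage') | float(0) %}
              {{ i if v > 13 else -i }}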