It is really important to understand the sources of error in the coulomb counting method, as it forms the backbone of most SOC estimation algorithms. SOC estimation by coulomb counting is based on measuring the current and integrating that current over time.

There are five main sources of error with the coulomb counting method [1]:

- Initial SOC
- Current measurement error
- Current integration error
- Uncertainty in the knowledge of battery capacity
- Timing oscillator error

Typical errors in SOC estimation using coulomb counting are reported as 3 to 4%. Improved coulomb counting algorithms [3] suggest errors of less than 2% in SOC estimation over a full cycle can be achieved.
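The method itself reduces to a running integral of measured current, scaled by the cell capacity. A minimal sketch (the function name, sign convention and numbers are my own, for illustration only):

```python
def coulomb_count(soc_init, currents_a, dt_s, capacity_ah):
    """Update SOC by integrating measured current over fixed time steps.

    Assumed sign convention: positive current = discharge.
    """
    soc = soc_init
    for i_a in currents_a:
        # Charge moved this step (Ah) as a fraction of total capacity
        soc -= (i_a * dt_s) / (capacity_ah * 3600.0)
    return soc

# 1C discharge (10 A) for 36 s on a 10 Ah cell removes 0.1 Ah -> 1% SOC
soc = coulomb_count(soc_init=0.80, currents_a=[10.0] * 36, dt_s=1.0,
                    capacity_ah=10.0)
print(round(soc, 3))  # 0.79
```

Every error source listed above attacks one of the terms in this update: the starting value, the current samples, the integration step, the capacity, or the time base.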

### Initial SOC

The assumption with coulomb counting is that you know the initial SOC: a reference point from which you can count coulombs into or out of the cell based on a measured current and a known time step.

Any error or uncertainty in the initial SOC will persist throughout the coulomb counting process until you reach a known reference point.

The initial SOC is often defined by an Open Circuit Voltage (OCV) to SOC lookup table. OCV relies on the cell having been rested long enough that no transient voltage drop remains, and it is also affected by hysteresis. The accuracy of the hardware voltage measurement also plays a part [2].
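The lookup itself is usually a simple interpolation. A minimal sketch, with a purely hypothetical OCV-SOC table (real tables are measured per cell chemistry, after a long rest and at a known temperature):

```python
import numpy as np

# Hypothetical OCV-SOC points, for illustration only
soc_pts = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
ocv_pts = np.array([3.00, 3.45, 3.60, 3.70, 3.90, 4.20])  # volts

def soc_from_ocv(ocv_v):
    """Estimate initial SOC by linear interpolation of the OCV table."""
    return float(np.interp(ocv_v, ocv_pts, soc_pts))

print(soc_from_ocv(3.65))  # midway between 3.60 V and 3.70 V -> 0.5
```

Note how flat the curve is in the mid-SOC region: a few millivolts of measurement error or hysteresis can translate into several percent of initial SOC error.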

### Current Measurement Error

Current sensors are corrupted by varying measurement noise; simple, inexpensive current sensors tend to be noisier.


Current sensors can have an accuracy of better than 1%, but that depends on the type and on the measurement range. A battery pack designed for very high peak currents will have a current sensor sized to measure these peaks. However, that current sensor could be very inaccurate when measuring small currents.
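Even a small offset on a sensor sized for high peak currents is integrated straight into the SOC estimate. A back-of-the-envelope sketch (all numbers invented for illustration):

```python
# A 500 A sensor with a 0.5% full-scale offset reads 2.5 A even at zero
# current. During a long low-current period that phantom current is
# integrated directly into the SOC estimate.
capacity_ah = 50.0  # assumed pack capacity
offset_a = 2.5      # sensor offset error
hours = 4.0         # duration of the low-current period

soc_drift = (offset_a * hours) / capacity_ah  # phantom Ah / capacity
print(f"SOC drift after {hours} h: {soc_drift:.1%}")  # 20.0%
```

This is why coulomb counting on its own degrades over time and is usually corrected against another reference, such as an OCV measurement after a rest.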

*Shunt Current Sensor*

Shunt sensors are typically robust and highly accurate. Their simple structure makes them resilient to thermal, mechanical, and electrical overloads.

*Magnetic Field-based Current Sensors*

The other type of battery current sensor uses electromagnetic elements to measure the magnetic field generated by the current. By their nature, these sensors are isolated. However, because they measure the current indirectly, their accuracy can be lower than that of shunt sensors.

### Current Integration Error

Coulomb counting methods employ a simple, rectangular or trapezoidal approximation for current integration. Such an approximation results in errors that increase with slower sampling rates or faster fluctuations in the current profile.
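The gap between the two approximations can be seen numerically. A minimal sketch, with an invented current profile and a deliberately coarse sample rate:

```python
import math

def integrate_rect(samples, dt):
    """Left-rectangle rule: hold each sample constant for the whole step."""
    return sum(s * dt for s in samples[:-1])

def integrate_trap(samples, dt):
    """Trapezoidal rule: average adjacent samples across each step."""
    return sum((a + b) / 2.0 * dt for a, b in zip(samples, samples[1:]))

# Fluctuating current, sampled coarsely at dt = 0.5 s over 10 s
dt = 0.5
t = [k * dt for k in range(21)]
i = [10.0 + 5.0 * math.sin(tt) for tt in t]  # amps

# Analytic integral of 10 + 5*sin(t) over [0, 10], in amp-seconds
exact = 10.0 * 10.0 + 5.0 * (1.0 - math.cos(10.0))

print(abs(integrate_rect(i, dt) - exact))  # rectangle error is larger
print(abs(integrate_trap(i, dt) - exact))  # trapezoid tracks more closely
```

Both errors shrink with faster sampling; the rectangle rule error scales roughly with the step size, the trapezoidal rule with its square.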

[2] shows a generic illustration of integration sampling error.

### Uncertainty in Known Battery Capacity

Coulomb counting assumes a single value for the battery capacity, defined at the beginning of life and under known conditions. However, the capacity of a cell varies from cell to cell, and with charge/discharge rate, temperature and age.
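The effect of a wrong capacity value is a simple scaling error on every integrated amp-hour. A sketch with invented numbers:

```python
# If the BMS assumes the nominal beginning-of-life capacity but the cell
# has faded, every integrated Ah is converted to SOC with the wrong scale.
nominal_ah = 50.0     # capacity assumed by the BMS
actual_ah = 45.0      # true capacity after fade (10% loss)
discharged_ah = 22.5  # charge actually removed

soc_drop_assumed = discharged_ah / nominal_ah  # what the BMS reports: 0.45
soc_drop_actual = discharged_ah / actual_ah    # what really happened: 0.50
print(f"SOC error: {abs(soc_drop_actual - soc_drop_assumed):.1%}")  # 5.0%
```

The error grows with the depth of discharge, which is why capacity estimation (or periodic re-learning of capacity) is a companion problem to SOC estimation.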

### Timing Oscillator Error

The timing oscillator provides the clock for the (recursive) SOC update, i.e., the measure of time comes from the timing oscillator. Any error or drift in the oscillator will affect the counted coulombs. This error should be well controlled, but it needs to be considered when designing the BMS hardware.
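Clock error enters as a multiplicative error on every time step, so its effect on the counted charge is proportional. A sketch with illustrative numbers (a 100 ppm figure is chosen arbitrarily here):

```python
# A clock that runs 100 ppm fast stretches every "1 s" step by 100 us,
# so the charge attributed to each step is off by the same fraction.
ppm = 100.0
true_ah_moved = 40.0  # Ah actually transferred over some period

measured_ah = true_ah_moved * (1.0 + ppm * 1e-6)
print(f"Ah error: {measured_ah - true_ah_moved:.4f} Ah")  # 0.0040 Ah
```

Compared with sensor offset or capacity fade this contribution is tiny for any reasonable crystal, which is consistent with it being "well controlled" in practice.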

#### References

- [1] Kiarash Movassagh, Performance Analysis of Coulomb Counting Approach for State of Charge Estimation in Li-Ion Batteries, University of Windsor
- [2] Kiarash Movassagh, Sheikh Arif Raihan, Balakumar Balasingam and Krishna Pattipati, A Critical Look at Coulomb Counting Towards Improving the Kalman Filter Based State of Charge Tracking Algorithms in Rechargeable Batteries, IEEE Transactions on Control Systems Technology, Jan 2021
- [3] Ines Baccouche, Sabeur Jemmali, Asma Mlayah, Bilal Manai, Najoua Essoukri Ben Amara, Implementation of an Improved Coulomb-Counting Algorithm Based on a Piecewise SOC-OCV Relationship for SOC Estimation of Li-Ion Battery, International Journal of Renewable Energy Research

Nigel, Did you ever notice that the recharged Ah is often less than the discharged Ah on a LIB?

It seems related to the fact that the nominal voltage over a recharge is higher than during a discharge (Ri and other factors), so this could explain why. However, the Wh are about the same, maybe less the charge efficiency losses.

I guess comparing a lookup table of Ah vs SOC with a second lookup table of Wh vs SOC could be interesting, and even more so comparing these at different C rates too.

Hi Doctorbass, The Ah required to charge a cell will always be higher than the Ah discharged; that is true based on fundamental laws of physics.

The problem will be down to measurement accuracy and ensuring that the conditions are the same. If you charge the cell cold and discharge it hot you might get slightly more energy out, as the DCIR will have reduced at higher temperatures. However, if you then recharge the cell you will have to put more Ah back in than you discharged.

Hence I think you need to ensure that the charging conditions are always the same.

Best regards, Nigel

Yes, I totally agree about the laws of physics, and yes this could be related to cell temperature and DCIR changes when comparing both charge and discharge. In any case, the discharged Wh will always be lower than the recharged Wh; that is a fact for me too, just to make it clear.

However, at a certain point I was wondering if some volt-to-amp conversion could happen, just like a DC-DC converter or motor controller does, but chemically instead. I would be very surprised if my instruments were out of calibration, but to make sure I might have to re-check the last calibration of both my DC electronic load and power supply and make sure the negative and positive currents have the appropriate gain and offset. I will conduct some tests to confirm everything.

This observation of lower charged Ah vs discharged Ah began during my long trips on my ZERO motorcycle. I was quite surprised, but I also noticed the same on other EVs that I have. And if it were related to DCIR, then it looks like the opposite was happening, because during the 1C charging the battery temperature reached up to 48C, but during the discharging it never went above 35C.

So in fact while recharging the battery, every single Ah will be at a higher voltage due to the DCIR than during the discharge, so every Ah during recharge is worth more Wh, right? Or is it only related to charge efficiency, and are the delta V from the charging DCIR and the delta V during the discharge only related to pure resistive losses? (Electrical and non-electrical losses, and maybe the voltage diffusion, are different, probably partially explaining that.)