“I just bought a 6000mAh battery, but my charger shows it only charged 5800mAh. Is the capacity falsely labeled?”
As a lithium battery manufacturer, we hear this question all the time from customers. Today, let’s set the record straight: why the number flashing on your charger screen is not the final verdict on whether a battery’s capacity is “real” or not.
The Core Misunderstanding: What Is Your Charger Actually Counting?
First, understand how a charger works. When it charges a completely depleted battery, it acts like a “gas pump,” calculating how much electrical charge it pumps into the battery.
This number is called the “Charge Capacity” (measured in mAh). For example, if it shows 5800mAh, that’s how much charge it delivered during that specific charging session.
Here’s the crucial part: The charger measures what goes IN. The battery label states what can come OUT. These are two different things!
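Under the hood, most chargers and fuel gauges do this "gas pump" accounting by coulomb counting: sampling the charging current at regular intervals and integrating it over time. Here is a minimal illustrative sketch in Python (the function name and sample data are our own, not any particular charger's firmware):

```python
def coulomb_count_mah(current_samples_ma, interval_s):
    """Integrate current samples (in mA), taken every `interval_s` seconds,
    into accumulated charge in mAh (1 mAh = 1 mA flowing for 1 hour)."""
    total_ma_seconds = sum(current_samples_ma) * interval_s  # mA·s
    return total_ma_seconds / 3600.0  # convert mA·s to mA·h

# One hour of charging at a steady 2000 mA, sampled once per second:
samples = [2000.0] * 3600
print(coulomb_count_mah(samples, 1.0))  # 2000.0 mAh delivered
```

Note that this counts only what flows *into* the battery during that session; it says nothing about what the cell can later deliver.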
Why Does “Going IN” Not Equal “Coming OUT”?
Imagine filling a slightly porous balloon with air. You might pump in 1000 liters (charge capacity), but due to absorption and tiny leaks, only about 950 liters come out when you release it (discharge capacity). Similarly, batteries have inherent “invisible losses” during charging and discharging, mainly for three reasons:
- Natural Energy Conversion Loss: The chemical reactions inside a battery aren’t 100% efficient. Some energy is always lost as heat. It’s a law of physics: what comes out will always be less than what went in.
- The Direct Impact of Discharge Rate – How Capacity Is Rated: This is key! The capacity printed on a battery label (e.g., 6000mAh) is measured under very specific, standardized lab conditions. It uses a very small, steady discharge current (like 0.2C, which is 1200mA for a 6000mAh battery) down to a set cutoff voltage.
If you use a higher discharge current in real life (like in a high-power device), losses inside the battery increase, voltage drops faster, and the actual usable capacity will be lower than the label. Generally, the higher the current, the bigger the gap.
- Temperature & Battery Health: Cold temperatures slow down battery chemistry, drastically reducing usable capacity. Also, as a battery ages through charge cycles, its maximum capacity naturally decreases.
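The C-rate arithmetic above is easy to check yourself: a C-rate is just a fraction of the rated capacity drawn per hour. A quick sketch (the example currents are illustrative, not test specifications):

```python
def discharge_current_ma(rated_capacity_mah, c_rate):
    """Current (in mA) corresponding to a given C-rate.
    1C would fully discharge the rated capacity in exactly one hour."""
    return rated_capacity_mah * c_rate

print(discharge_current_ma(6000, 0.2))  # 1200.0 mA, the gentle lab test current
print(discharge_current_ma(6000, 2.0))  # 12000.0 mA, a high-drain device
```

At the 2C rate in the second line, internal losses are far larger than at 0.2C, which is why the usable capacity you measure at high current falls short of the label.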
The True Judge: Standard Discharge Testing
In an industry lab, finding a battery’s true capacity follows a strict, objective process:
- Standard Charge: The battery is fully charged under specific, controlled conditions.
- Standard Discharge: It is then immediately discharged at a specified, steady current (like the 0.2C mentioned above) until it reaches the cutoff voltage.
- Precise Calculation: Only the total charge released during this discharge process counts as the battery’s true, usable capacity.
So, the complete picture is: Label Capacity ≈ Standard Discharge Test Result > Your Charger’s “Charge Capacity” Reading.
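Putting this together, you can sanity-check your own charger reading against the label. The 95% minimum round-trip figure below is a loose illustrative assumption, not a measured specification:

```python
def reading_is_plausible(charger_mah, label_mah, min_efficiency=0.95):
    """A charge reading somewhat below the label is expected; only flag it
    if it falls below what typical charging losses would explain.
    The 0.95 floor is an illustrative assumption, not a spec."""
    return charger_mah >= label_mah * min_efficiency

print(reading_is_plausible(5800, 6000))  # True: 5800/6000 ≈ 97%, within normal losses
```

By this rough check, the 5800mAh reading from the opening question is entirely consistent with a genuine 6000mAh label.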
Conclusion
Your charger shows the charge delivered during charging, which is naturally lower than the rated output capacity due to inevitable efficiency losses. It cannot determine whether a battery’s label is accurate.
The labeled capacity is the “output” measured under ideal, controlled conditions, which are hard to perfectly replicate in everyday use.
Your real-world usable capacity is what you actually get from your specific device, influenced by discharge rate, temperature, and battery health.
At Keeppower, with 17 years in the lithium-ion battery industry, we are committed to providing high-quality, high-capacity products. We share this knowledge because we believe an informed choice is the best choice. We strive to cut through the technical complexity, building trust through “real capacity” and “reliable quality.”
Got more questions? Leave a comment below! Keeppower is here to keep the explanations coming.