Mobile devices have become an integral part of our daily lives. Take the smartphone as an example: instead of simply making phone calls, smartphones are now packed with social networking, web browsing, messaging and gaming features, large HD screens and much more. All of these features are turning the phone into a power-hungry device. Battery capacity and energy density have increased substantially to meet the higher power requirements. A 10-minute charge that powers a device for a full day, or a one-hour charge that reaches an 80 percent state of charge, is becoming the trend for high-end user experiences. When rapid-charge requirements are combined with a large battery capacity, the charge current in a portable device can reach 4 A or higher. This demand for high power brings many new challenges to battery-powered system design.
USB Power
A 5 V USB power source is commonly used in portable devices. A traditional USB port has a maximum output current of 500 mA for USB 2.0, or 900 mA for USB 3.0, which is insufficient to rapidly charge a portable device. A USB adapter (dedicated charging port, or DCP) can increase the output current up to 1.8 A with a micro-USB connector. Unfortunately, a typical 5 V/2 A power adapter provides only 10 W of total power. Using such an adapter as the charger's power source, with roughly 10 W available and a battery voltage near 4 V, the battery charger can deliver only about 2.5 A of charge current. This is not high enough to fast-charge a battery pack of 4,000 mAh or more. Can we keep increasing the output current of a 5 V power adapter in order to increase power? In theory, yes, if we accept higher cost and a special cable. However, there are some limitations:
• A higher adapter current (for example, 2 A or higher) requires a thicker cable and a special USB connector, which increases the system solution cost. Additionally, a traditional USB cable is not adequate because of its power loss and safety concerns.
• The typical resistance of an adapter cable is around 150 mOhm to 300 mOhm, depending on the cable's length and thickness. A high adapter output current causes a larger voltage drop across the cable, which reduces the effective voltage at the charger input. When the charger input voltage gets close to the battery charge voltage, the charge current decreases significantly, which increases charge time.
Using a 5 V/3 A adapter with 180 mOhm of cable resistance, for example, the voltage drop across the cable is 540 mV, so the charger's input voltage is only 4.46 V. Let's assume that the total resistance from the charger input to the battery pack is 150 mOhm, which includes the charger power MOSFET's on-resistance and the inductor's DC resistance. The maximum charge current is then only about 730 mA for charging a 4.35 V lithium-ion (Li-Ion) battery cell, even if the charger itself is capable of delivering 3 A. Less than 1 A of charge current is definitely not enough to achieve a fast charge.
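The arithmetic above is easy to reproduce. The short Python sketch below recalculates the cable drop and the dropout-limited charge current using the example values from the text (5 V/3 A adapter, 180 mOhm cable, 150 mOhm charge path, 4.35 V cell); the numbers are illustrative, not measurements of any particular design.

```python
# Worst-case dropout estimate, using the example values from the text.
# These component values are illustrative assumptions.

V_ADAPTER = 5.0     # adapter output voltage (V)
I_ADAPTER = 3.0     # adapter rated output current (A)
R_CABLE   = 0.180   # cable resistance (ohm)
R_PATH    = 0.150   # charger-input-to-battery resistance: FET Rds(on) + inductor DCR (ohm)
V_BATTERY = 4.35    # Li-Ion cell charge voltage (V)

# Cable drop with the adapter delivering its full rated current
v_drop_cable = I_ADAPTER * R_CABLE              # 0.54 V
v_charger_in = V_ADAPTER - v_drop_cable         # 4.46 V

# With the charger in dropout (buck converter at ~100 percent duty cycle),
# the charge current is limited by the remaining headroom across the
# charge-path resistance.
i_charge_max = (v_charger_in - V_BATTERY) / R_PATH

print(f"Charger input voltage: {v_charger_in:.2f} V")
print(f"Dropout-limited charge current: {i_charge_max * 1000:.0f} mA")  # ~730 mA
```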
Based on the above analysis, the power-source voltage must increase to provide enough headroom to keep the charger out of dropout mode. Because of these limitations, when a system requires more than 10 W or 15 W of power, a high-voltage adapter such as 9 V or 12 V is preferred. A high-voltage adapter requires less input current for the same power and provides more input-voltage margin to fully charge the battery. The only limitation of a high-voltage adapter is backwards compatibility: if it is plugged into a portable device designed for a 5 V input, the system could shut down (due to overvoltage protection) or even be damaged (due to insufficient high-voltage protection).
Because of these limitations, many new hybrid high-voltage adapters, such as USB Power Delivery adapters, are coming to market. A common feature of these hybrid-voltage adapters is the ability to recognize the system's voltage requirement through a handshake between the adapter and the system controller. The adapter starts at a default output of 5 V and raises its output to 9 V or 12 V only after the system confirms that it can support the higher voltage for fast charging. The system-to-adapter communication can take place over either VBUS or the D+ and D- lines using a special handshaking algorithm or signal. This hybrid adjustable-voltage adapter can therefore serve as a universal power source, supporting both traditional 5 V systems and high-input-voltage systems for fast charging.
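As a rough illustration of the handshake behavior described above (and not an implementation of the USB Power Delivery protocol or any vendor-specific scheme), the sketch below models an adapter that defaults to 5 V and only steps up to 9 V or 12 V after the system confirms it can accept the higher level. The voltage list and function name are hypothetical.

```python
# Simplified model of the adapter/system voltage handshake described above.
# Real adapters negotiate over VBUS or the D+/D- lines; the profile list and
# logic here are illustrative assumptions only.

SUPPORTED_ADAPTER_VOLTAGES = (5.0, 9.0, 12.0)   # candidate output levels (V)

def negotiate_input_voltage(system_max_input_v: float) -> float:
    """Return the highest adapter voltage the system confirms it can accept.

    The adapter always starts at the 5 V default and only steps up once the
    system has requested, and can tolerate, a higher level.
    """
    selected = 5.0  # safe default before any handshake completes
    for level in SUPPORTED_ADAPTER_VOLTAGES:
        if level <= system_max_input_v:
            selected = level
    return selected

print(negotiate_input_voltage(12.0))  # fast-charge-capable device -> 12.0
print(negotiate_input_voltage(5.5))   # legacy 5 V-only device stays at 5.0
```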
Fast Battery Charging
Can we further reduce the charge time with a smarter battery-charging approach, without increasing the input power or charge current? To find out, let's take a look at the battery-charging cycle.
There are two operating modes in a battery-charging cycle: constant-current (CC) mode and constant-voltage (CV) mode. A charger operates in CC mode while the battery voltage is below the regulated charge voltage. Once the sensed battery-pack terminal voltage reaches the predetermined regulation voltage, the charger enters CV mode. Charging terminates when the battery current falls to the termination current, usually about 5 to 10 percent of the full fast-charge current.
In an ideal charging system with no resistance in the battery pack, only constant-current mode exists, and the charge time is the shortest possible because there is no CV phase: the moment the battery voltage reaches the regulated charge voltage, the charge current drops immediately to zero, passing through the termination threshold.
However, in a real battery-charging system there is a series resistance between the battery voltage-sensing point and the battery cell. This resistance includes: 1) the PCB trace; 2) the on-resistance of the two battery charge- and discharge-protection MOSFETs; 3) the current-sense resistor used by the fuel gauge to measure the charge and discharge current for overcurrent protection; and 4) the battery cell's internal resistance, which is a function of cell aging, temperature and state of charge.
With a 1C charge rate and a new battery cell, the charger spends about 30 percent of the charge time in CC mode to deliver about 70 percent of the battery capacity. Conversely, it takes about 70 percent of the total charge time to deliver the remaining 30 percent of the capacity in CV mode. The higher the battery pack's internal resistance, the longer the time required in CV mode. The battery is fully charged only when its open-circuit voltage reaches the maximum charge voltage. With higher resistance between the voltage-sensing point and the actual cell, even when the sensed pack voltage reaches the regulated voltage, the cell's open-circuit voltage is still lower than the desired regulation voltage.
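The effect of pack resistance on the CC/CV split can be illustrated with a toy simulation. The sketch below uses a deliberately simplified battery model (a linear open-circuit-voltage curve, a fixed pack resistance and ideal coulomb counting), so the absolute times are not meaningful; it only shows how a larger series resistance pushes the charger into CV mode earlier and stretches out the CV tail.

```python
# Toy CC/CV charge simulation. The linear OCV curve, capacity and resistance
# values are illustrative assumptions, not data for any real cell.

def simulate_charge(r_pack_ohm, capacity_ah=4.0, i_fast_a=4.0,
                    v_reg=4.35, term_frac=0.1, dt_s=1.0):
    """Return (total_minutes, cc_minutes) for a simple CC/CV charge cycle."""
    def ocv(soc):
        # Assumed linear open-circuit voltage: 3.5 V empty -> 4.35 V full
        return 3.5 + 0.85 * soc

    soc, t_s, t_cc_end_s = 0.0, 0.0, None
    i_term = term_frac * i_fast_a
    while soc < 1.0:
        # The charger regulates the sensed pack voltage (OCV + I * R_pack),
        # so the allowed current in CV mode is (v_reg - OCV) / R_pack.
        i = min(i_fast_a, (v_reg - ocv(soc)) / r_pack_ohm)
        if i < i_fast_a and t_cc_end_s is None:
            t_cc_end_s = t_s          # charger just left CC mode
        if i <= i_term:
            break                     # termination current reached
        soc += i * dt_s / 3600.0 / capacity_ah
        t_s += dt_s
    cc_s = t_cc_end_s if t_cc_end_s is not None else t_s
    return t_s / 60.0, cc_s / 60.0

for r in (0.05, 0.10, 0.15):          # pack resistance in ohms
    total_min, cc_min = simulate_charge(r)
    print(f"R_pack = {r * 1000:.0f} mOhm: "
          f"total {total_min:.0f} min, CC phase {cc_min:.0f} min")
```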
Things become even more challenging with charge currents of 4 A or higher in smartphone and tablet applications. At such a high charge current, the voltage drop across the PCB trace and the battery pack's internal resistance increases significantly. This causes the charger to enter CV mode earlier, which slows down charging. How do we recover the charge time lost to this high voltage drop?
By closely monitoring the charge current, the voltage drop in the charge path can be estimated accurately in real time. A resistance-compensation technique called IR compensation raises the battery regulation voltage to compensate for the additional voltage drop in the charge path. The charger then stays in constant-current mode longer, until the actual cell open-circuit voltage is very close to the desired value. In this way, the time spent in CV mode can be reduced significantly, cutting the total charge time by as much as 20 percent.
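A minimal sketch of the idea follows; the parameter names, the clamp value and the 70 mOhm compensation resistance (the value used in the measured example later in this article) are assumptions for illustration, not the register map of any particular charger IC.

```python
# Hedged sketch of IR compensation: raise the pack-terminal regulation
# setpoint by the estimated I*R drop in the charge path, with a clamp so the
# compensation cannot run away at high current. Values are illustrative.

V_REG_NOMINAL = 4.35    # desired cell regulation voltage (V)
R_COMP        = 0.070   # compensated charge-path resistance (ohm)
V_BOOST_MAX   = 0.32    # assumed clamp on the setpoint boost (V)

def compensated_setpoint(i_charge_a: float) -> float:
    """Return the voltage-loop setpoint at the pack-sense point."""
    boost = min(i_charge_a * R_COMP, V_BOOST_MAX)
    return V_REG_NOMINAL + boost

# At 4.5 A the sense-point setpoint is raised by about 315 mV, so the charger
# keeps regulating current until the cell itself nears 4.35 V.
print(f"{compensated_setpoint(4.5):.3f} V")   # -> 4.665 V
print(f"{compensated_setpoint(0.4):.3f} V")   # near termination -> 4.378 V
```

Because the boost shrinks as the charge current tapers, the setpoint converges back toward the nominal 4.35 V by the time the termination current is reached, so the cell is not over-charged, provided the compensated resistance does not exceed the real charge-path resistance.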
System Thermal Optimization
Achieving fast charging requires a higher-power adapter, such as 9 V/1.8 A or 12 V/2 A. In addition to charging the battery, the battery charger also powers the system, making it one of the hottest spots in a portable device. For a good end-user experience, the maximum temperature rise of the device case above ambient should not exceed 15°C. This is why the battery charger's power-conversion efficiency and the system's thermal performance become even more critical. How do we achieve the best thermal performance and efficiency?
Figure 1. This block diagram represents a 4.5 A I2C high-efficiency switching charger.
Figure 1 shows the simplified application circuit of a 4.5 A high-efficiency switch-mode charger. This charger supports both USB and AC adapters, and all of the MOSFETs are integrated. MOSFETs Q2 and Q3 and inductor L form a synchronous buck-based battery charger. This combination achieves the highest possible battery-charging efficiency and makes full use of the adapter power to charge as fast as possible. MOSFET Q1 serves as a reverse-blocking FET that prevents battery leakage to the input through the body diode of MOSFET Q2; it also serves as an input current-sensing element to monitor the adapter current. MOSFET Q4 actively monitors and controls the battery-charging current. All of the FETs must be designed with sufficiently low on-resistance to achieve high efficiency. To further improve thermal performance, a thermal regulation loop is introduced: it limits the maximum junction temperature by reducing the charge current once the junction reaches a predefined temperature.
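As a rough illustration of how such a thermal regulation loop behaves, the sketch below folds back the charge current whenever the measured junction temperature exceeds a regulation threshold and lets it recover once the die cools. The threshold, step size and limits are assumptions for illustration, not the behavior of any specific charger IC.

```python
# Hedged sketch of a thermal-regulation (fold-back) loop: reduce the charge
# current when the junction temperature exceeds a programmed threshold, and
# restore it as the die cools. All values are illustrative assumptions.

T_REG_C      = 120.0   # assumed junction-temperature regulation threshold (deg C)
I_STEP_A     = 0.25    # assumed fold-back step per control cycle (A)
I_CHARGE_MAX = 4.5     # programmed fast-charge current (A)
I_CHARGE_MIN = 0.5     # floor below which charging would be suspended (A)

def thermal_regulation_step(t_junction_c: float, i_charge_a: float) -> float:
    """One control-loop iteration: fold the charge current back above T_REG_C,
    and step it back up (never past the programmed limit) once below it."""
    if t_junction_c > T_REG_C:
        return max(i_charge_a - I_STEP_A, I_CHARGE_MIN)
    return min(i_charge_a + I_STEP_A, I_CHARGE_MAX)

# Example: die running hot, current steps down; die cooled, current recovers.
print(thermal_regulation_step(128.0, 4.5))   # -> 4.25
print(thermal_regulation_step(110.0, 4.25))  # -> 4.5
```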
Figure 2. Charge time comparison with different charge currents: 2.5 A vs. 4.5 A.
Experimental Test Results
Figure 2 shows the relationship between charge current and charge time. It is easy to see that a higher charge current gives a faster charge, as long as the charge rate does not exceed the maximum rate specified by the battery cell manufacturer. As shown in Figure 2, the charge time drops from 269 minutes to 206 minutes, a reduction of roughly 23 percent, when the charge current increases from 2.5 A to 4.5 A.
Figure 3. Fast charge comparison with IR compensation, driving charge time down from 234 to 200 minutes.
Figure 3 shows the charge-time benefit of the IR compensation technique in a practical charger design. With a 4.5 A charge current, the charge time is reduced from 234 minutes to 200 minutes, a reduction of roughly 15 percent. This is accomplished by compensating for 70 mOhm of resistance while charging a single-cell 8,000 mAh battery, with no added cost and no thermal penalty.
Summary
Rapid charging is becoming ever more important in many portable devices. It requires new design considerations in a practical charging system, including new types of high-voltage adapters, charge-current and thermal optimization, and an advanced charging profile that optimizes charge time and improves battery life. The experimental results presented here demonstrate the effectiveness of these design approaches for rapid charging.
Michelle Qiong Li is a systems and applications engineering manager and leads the engineering team in TI’s Battery Charger Management group. She holds 14 U.S. patents.
Jinrong Qian is a product line manager for battery charge management and an Emeritus Distinguished Member of TI's Technical Staff for Battery Management Solutions. He has published many peer-reviewed papers on power electronics and power management, and holds 28 U.S. patents.
Source: Batterypoweronline.com