Qualcomm has officially announced Quick Charge 3+, its next-generation fast charging technology for the "classic" USB Type-A interface.
Qualcomm's goal is to deliver charging speeds comparable to Quick Charge 4+ while remaining compatible with USB Type-A cables, giving owners of mid-range and budget smartphones and tablets access to today's top fast charging technology.
Quick Charge 3+ on USB Type-A will soon be supported on Qualcomm's two leading chips in the upper mid-range segment, the Snapdragon 765 and 765G, and the first device to ship with the technology will be the newly launched Xiaomi Mi 10 Lite Zoom.
In fact, both the Snapdragon 765 and 765G already support Quick Charge 4+, so devices running these chips will offer both of Qualcomm's current fast charging technologies; which standard is used depends on the connector: Quick Charge 3+ over USB Type-A and Quick Charge 4+ over USB Type-C.
What is different about Quick Charge 3+ on USB Type-A?
According to Qualcomm, Quick Charge 3+ can charge a battery from 0 to 50% in just 15 minutes, charging 35% faster and running up to 9°C cooler than its predecessor while maintaining a high level of performance and safety. In terms of output power, the new technology supports 20V / 3A, or 60W.
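As a rough sanity check on the "0-50% in 15 minutes" claim, the sketch below works through the implied average charging power. The 4,000 mAh / 3.85 V battery is an assumed, typical mid-range spec, not a figure from Qualcomm's announcement:

```python
# Back-of-the-envelope check on the "0-50% in 15 minutes" claim.
# The battery capacity and nominal voltage are assumed typical values,
# not figures from Qualcomm's announcement.
capacity_ah = 4.0          # assumed battery capacity in amp-hours
nominal_voltage_v = 3.85   # assumed nominal cell voltage

energy_wh = capacity_ah * nominal_voltage_v   # ~15.4 Wh total capacity
half_charge_wh = energy_wh / 2                # energy needed for 0-50%
avg_power_w = half_charge_wh / 0.25           # delivered over 15 minutes

print(f"Implied average charging power: {avg_power_w:.1f} W")
# ~30.8 W -- comfortably within the stated 20 V / 3 A (60 W) ceiling.
```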
In addition, Quick Charge 3+ works with standard USB Type-A to USB Type-C cables and inherits the fine-grained 20mV voltage stepping introduced with Quick Charge 4. Compared with Quick Charge 4/4+, the key factor that makes Quick Charge 3+ stand out is its support for the low-cost USB Type-A connector, which is still used on many new devices today.
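To give a sense of what 20mV stepping means in practice, here is a minimal sketch of a device's requested voltage being snapped to the nearest 20mV step. The function name and the 3.6-20 V range are illustrative assumptions, not Qualcomm's actual protocol or API:

```python
# Minimal sketch of 20 mV voltage stepping as described for Quick Charge 3+/4.
# STEP_MV comes from Qualcomm's announcement; the voltage range and helper
# name are illustrative assumptions.
STEP_MV = 20
MIN_MV = 3600
MAX_MV = 20000

def nearest_step(requested_mv: int) -> int:
    """Snap a requested voltage to the nearest 20 mV step within range."""
    clamped = max(MIN_MV, min(MAX_MV, requested_mv))
    return round(clamped / STEP_MV) * STEP_MV

# A device asking for 8.473 V would be granted the nearest 20 mV step:
print(nearest_step(8473))  # -> 8480 (i.e. 8.48 V)
```

The fine granularity lets the charger track the battery's optimal voltage closely, which is part of how the technology reduces wasted heat compared with coarser-stepped predecessors.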
Overall, Quick Charge 3+ and Quick Charge 4/4+ can coexist on the same device, letting it fall back to whichever standard the connected charger supports, making the combination suitable for a wide range of customers. Going forward, Qualcomm plans to bring this fast charging technology to all of its current and upcoming chipsets.