Do USB-C PD wall chargers regulate their current based on input?

Thread Starter

SethB

Joined Mar 30, 2021
31
For example:
If I have a 100W USB-C wall charger plugged into a 75W AC power inverter, will the USB-C charger's output be limited to 75W, or will it be 0 because there isn't enough input power?

My assumption is 0 - the internal electronics require a specific amount of power to operate. I do know that the device being charged determines the output current of the wall charger, but I do not know if the wall charger is "smart" enough to regulate its own power based on its input.

Application:
I want a USB-C wall charger that will work on planes and with a power inverter during emergencies/outages. Buying a single 100W charger (to take advantage of the added current) that can regulate its own power based on its input would be ideal, instead of buying 2 different chargers.

Thanks all!
 

ronsimpson

Joined Oct 7, 2019
3,052
If you pull 10W from the charger, the charger will pull 11W from the inverter, which will pull 13W from the batteries.
If you get close to 65W, you will probably start having problems.
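
To put rough numbers on that chain (the 90% charger and 85% inverter efficiencies below are my assumptions for illustration, not measured figures), a quick Python sketch:

# Rough sketch of the power chain described above.
# Efficiency figures are assumptions for illustration only.
CHARGER_EFF = 0.90   # assumed USB-C charger efficiency
INVERTER_EFF = 0.85  # assumed DC-to-AC inverter efficiency

def draw_from_battery(load_w):
    """Return (watts pulled from the inverter, watts pulled from the battery)."""
    from_inverter = load_w / CHARGER_EFF
    from_battery = from_inverter / INVERTER_EFF
    return from_inverter, from_battery

print(draw_from_battery(10))   # ~(11.1, 13.1) W -- close to the 11 W / 13 W figures above
print(draw_from_battery(65))   # ~(72.2, 85.0) W -- already more than a 75 W inverter can supply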
 

DickCappels

Joined Aug 21, 2008
10,187
After ronsimpson,

The charger supplies a given voltage. The number of watts is determined by the characteristics of the load. The more current the load needs, the more current (and therefore watts at that voltage) the charger supplies.
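
In other words, the charger holds its negotiated voltage and the load decides the current, so the power is simply P = V × I. A minimal sketch, assuming the common 20 V / 5 A profile of a 100 W USB-C PD charger as the example:

# The charger fixes the voltage; the device draws whatever current it wants.
# Power follows from P = V * I, up to the charger's rating.
def power_drawn(bus_voltage_v, load_current_a):
    """Power the charger supplies: fixed voltage times the current the load takes."""
    return bus_voltage_v * load_current_a

print(power_drawn(20.0, 5.0))   # 100 W -- a device pulling the full 5 A at 20 V
print(power_drawn(20.0, 1.5))   # 30 W -- a lighter load simply draws less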
 

Thread Starter

SethB

Joined Mar 30, 2021
31
Ahh. So if the charger is rated at 100W, and the load is asking for 100W, the charger will try to give that many watts, regardless of the source. I see. Is there any way to limit the load?
 

DickCappels

Joined Aug 21, 2008
10,187
Using a resistive load as an example, power = E² / R (voltage squared over the resistance of the load).

A 10 volt power supply would cause a:
100 ohm resistor to dissipate 1 watt,
10 ohm resistor to dissipate 10 watts,
1 ohm resistor to dissipate 100 watts,
0.1 ohm resistor to dissipate 1,000 watts.
but in no case more power than the power supply is capable of delivering. So in the example above, the most power you could deliver to the load would be 100 watts, because that is the rating of the power supply.

A good example is a typical AC power outlet in a home in North America. A typical outlet can supply 120 VRMS at 15 amps, which works out to 1,800 watts. Above that, a fuse will blow or a circuit breaker will trip. A 1,000 watt clothes iron would draw 1,000 watts, and a 2 watt night light would draw 2 watts.
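
The same idea in a short Python sketch, using the 10 V supply from the examples above and assuming a 100 W supply rating as the ceiling:

# P = E^2 / R for a resistive load, capped at what the supply can deliver.
SUPPLY_V = 10.0       # the 10 V supply from the examples above
SUPPLY_MAX_W = 100.0  # assumed rating of the supply (the ceiling)

for r_ohms in (100, 10, 1, 0.1):
    demanded = SUPPLY_V ** 2 / r_ohms        # P = E^2 / R
    delivered = min(demanded, SUPPLY_MAX_W)  # in practice the voltage sags or protection trips
    print(f"{r_ohms} ohm load asks for {demanded:g} W, supply can give at most {delivered:g} W")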
 

Thread Starter

SethB

Joined Mar 30, 2021
31
Thank you!

Now if I put a resistor in series in the cable that goes from the charger to the device being charged, would that be a way to limit the current draw on the charger?

This way, the 100W wall charger (being supplied only 75W) will be limited by the device plugged into it. Is this theory/approach correct?
 

DickCappels

Joined Aug 21, 2008
10,187
The resistor in series with the input to the charger would limit the power available to the charger and thus limit the amount of current that could be drawn on the charger's output. However, the resistor would also reduce the voltage to the charger and some chargers would not work properly if that were the case. You should use a charger designed for the kind (chemistry) and size of the battery you want to charge.

The resistor probably would not be dangerous to an old-fashioned charger - the kind made with a heavy iron transformer, some diodes and a capacitor. Modern lightweight chargers should withstand a drooping input voltage without damage, but I worry about a poorly designed charger failing if the input voltage drops by too much. There is some pretty flaky stuff out there.

Recommendation: Get the right charger for your battery and be safe.
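
To put rough numbers on the voltage problem: here is a sketch of a resistor in the output cable, as proposed above, assuming a 20 V USB-PD bus and a 1 ohm resistor (both illustrative values, not anything a real PD charger would negotiate):

# Voltage lost across a series resistor in the charging cable: V_drop = I * R.
# A device expecting roughly 20 V would see noticeably less under load.
BUS_V = 20.0   # assumed negotiated USB-PD voltage

def voltage_at_device(resistor_ohms, current_a):
    """Voltage left at the device after the series resistor drops I * R."""
    return BUS_V - current_a * resistor_ohms

print(voltage_at_device(1.0, 3.0))   # 17.0 V at the device -- 3 V lost in the resistor
print(voltage_at_device(1.0, 5.0))   # 15.0 V -- the harder the device pulls, the worse it gets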
 

Irving

Joined Jan 30, 2016
3,898
Now if I put a resistor in series in the cable that goes from the charger to the device being charged, would that be a way to limit the current draw on the charger?
Maybe, probably not, and the resistor will get hot too... The charger is matched to the battery technology - unless it is a very dumb charger, the resistor is going to screw up the charging algorithm, so you might not get a charge.

To give a better answer, we would need to know the battery technology and the type of charger.
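
The heat is just P = I² × R; even an assumed 1 ohm series resistor adds up quickly at charging currents:

# Heat dissipated in the series resistor: P = I^2 * R.
R_OHMS = 1.0   # assumed 1 ohm series resistor, for illustration only
for current_a in (1.0, 3.0, 5.0):
    heat_w = current_a ** 2 * R_OHMS
    print(f"{current_a} A through {R_OHMS} ohm -> {heat_w:g} W of heat in the resistor")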
 