Buck converter as a charger question

Thread Starter

mike _Jacobs

Joined Jun 9, 2021
223
So I am trying to get my head around some of the buck converter nuances.

If I have a theoretical buck converter that I set up to run at a constant PWM and a constant output voltage of, say, 70V,

and I connect the converter to a hypothetical battery that is, say, at 50V and something less than 100% SOC... what is the output terminal voltage of the buck converter?

Is it 70V, or is it 50V?

I imagine the converter has no choice but to clamp to the battery voltage of 50V.
If that is true, some amount of current will flow into the battery and charge it up...
But how much?


What dictates how much current will flow into the battery? The only guess I have is...

(regulation voltage - battery voltage) / (battery ISR)

So in this case let's say the battery has an ISR of 1 ohm:

(70 - 50) / 1 = 20 amps??

Are any of these assumptions correct?
This is totally hypothetical; I am just wanting to consider the ideal case here.
 
Last edited:

Ian0

Joined Aug 7, 2020
10,002
So I am trying to get my head around some of the buck converter nuances.

If I have a theoretical buck converter that I set up to run at a constant PWM and a constant output voltage of, say, 70V,

and I connect the converter to a hypothetical battery that is, say, at 50V and something less than 100% SOC... what is the output terminal voltage of the buck converter?

Is it 70V, or is it 50V?

I imagine the converter has no choice but to clamp to the battery voltage of 50V.
If that is true, some amount of current will flow into the battery and charge it up...
But how much?


What dictates how much current will flow into the battery? The only guess I have is...

(regulation voltage - battery voltage) / (battery ISR)

So in this case let's say the battery has an ISR of 1 Amp:

(70 - 50) / 1 = 20 amps??

Are any of these assumptions correct?
This is totally hypothetical; I am just wanting to consider the ideal case here.
The current will increase as much as it can, usually until some over-current protection circuit cuts in or something fails (usually a power semiconductor).
If the battery had a series resistance of 1Ω (which I presume is what you meant to write) then the battery terminal voltage will rise to 70V and 20A will flow.
If nothing else limits the current, and nothing fails, then the output impedance of the source will be the limiting factor, and the output voltage of the source will be dragged down so that the output voltage divided by the input voltage is equal to the duty cycle.
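
To put numbers on the ideal case, here is a minimal Python sketch, assuming an ideal regulator with no current limit and a battery modelled as a 50V EMF behind the 1 ohm series resistance from the question:

```python
# Minimal sketch of the ideal steady state: the regulator holds its output
# at V_REG, so the battery's series resistance sets the charging current.
# Values are the hypothetical ones from the question.
V_REG = 70.0   # buck regulation voltage (V)
V_BATT = 50.0  # battery EMF (V)
R_ISR = 1.0    # battery internal series resistance (ohms)

i_charge = (V_REG - V_BATT) / R_ISR     # current forced into the battery
v_terminal = V_BATT + i_charge * R_ISR  # terminal voltage seen by the buck

print(f"charging current: {i_charge:.1f} A")    # 20.0 A
print(f"terminal voltage: {v_terminal:.1f} V")  # 70.0 V
```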

I'll just add, even though you didn't ask. . .
If you had set the duty cycle so that the output should be 40V and then connected the 50V battery, then the input current would become discontinuous i.e. it will be a triangle wave with a gap in it.
 

Thread Starter

mike _Jacobs

Joined Jun 9, 2021
223
So the regulator actually forces the battery terminal voltage to 70V. I'm sure that makes for a heck of a transient, connecting the regulator to a battery with a large difference in voltage to start off.
I have never seen battery terminals change like that. In a boost converter the output clamps to the battery voltage.
Why is it the opposite in a buck?

thanks for your answer
 
Last edited:

Thread Starter

mike _Jacobs

Joined Jun 9, 2021
223
If you had set the duty cycle so that the output should be 40V and then connected the 50V battery, then the input current would become discontinuous i.e. it will be a triangle wave with a gap in it.
Sorry, I should not have made the other post, because now we are talking about this here.

I thought you told me in the other post that the input current of a buck is always discontinuous?
 

Ian0

Joined Aug 7, 2020
10,002
Sorry, I should not have made the other post, because now we are talking about this here.

I thought you told me in the other post that the input current of a buck is always discontinuous?
Yes. That's confusing. . .
I meant the current into the battery, which is the output current of the buck regulator. Sorry.
 

Thread Starter

mike _Jacobs

Joined Jun 9, 2021
223
The current will increase as much as it can, usually until some over-current protection circuit cuts in or something fails (usually a power semiconductor).
If the battery had a series resistance of 1Ω (which I presume is what you meant to write) then the battery terminal voltage will rise to 70V and 20A will flow.
If nothing else limits the current, and nothing fails, then the output impedance of the source will be the limiting factor, and the output voltage of the source will be dragged down so that the output voltage divided by the input voltage is equal to the duty cycle.

I'll just add, even though you didn't ask. . .
If you had set the duty cycle so that the output should be 40V and then connected the 50V battery, then the input current would become discontinuous i.e. it will be a triangle wave with a gap in it.
Next part to this:

If you have a little tiny buck converter, like a single IC, there has to be a certain point where, when you connect it to a sufficiently large battery (or a similarly arbitrarily large load), the converter cannot actually regulate the terminals.

What is the limiting factor to what the converter can and cannot regulate?
How strong is the regulator, so to speak, and what factors influence how strong it is?

How do you prove system stability at that point?
 

Ian0

Joined Aug 7, 2020
10,002
So the regulator actually forces the battery terminal voltage to 70V. I'm sure that makes for a heck of a transient, connecting the regulator to a battery with a large difference in voltage to start off.
I have never seen battery terminals change like that. In a boost converter the output clamps to the battery voltage.
Why is it the opposite in a buck?

thanks for your answer
Actually it's the same: you're thinking of them on different timescales.
In a boost converter, thinking of it on a cycle-by-cycle basis, when the transistor switches off, the only thing that is definite is the amount of current that is going to flow, and that's the current that is already flowing in the inductor. It has to remain the same, and it can do regardless of the voltage. So if it is connected to a battery, current flows into the battery, and the battery gets more charged and perhaps its voltage might go up a little bit.

Now, thinking of it on a more long-term basis.
If the battery voltage (this is still a boost converter) is lower than it really should be for the duty cycle, then the cycle will end before the current in the inductor has got back down to zero.
So the next cycle starts and the transistor switches on, but there is still current in the inductor.
During the transistor-on phase the current increases by ΔI. ΔI = Vt/L so ΔI is the same regardless of the actual value of I when the cycle started.
So the inductor current when the transistor switches off is higher than it was on the previous cycle.
And every cycle it keeps getting higher, until the ratio of input to output voltage is what it should be for the duty cycle.

That's the same as what happens in the buck regulator.
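
If it helps to see that build-up numerically, here is a rough cycle-by-cycle sketch in Python; the converter is idealised and the values (12V in, duty cycle set for 24V out, battery clamping the output at 15V) are made up for illustration:

```python
# Cycle-by-cycle inductor current in an ideal boost converter whose output
# is clamped by a battery below the voltage the duty cycle "wants".
V_IN = 12.0    # input voltage (V)
V_BATT = 15.0  # battery clamps the output (V)
L = 100e-6     # inductance (H)
T = 10e-6      # switching period (s), i.e. 100 kHz
D = 0.5        # duty cycle -> wants V_out = V_IN / (1 - D) = 24 V

i_L = 0.0
for cycle in range(10):
    i_L += V_IN * D * T / L                   # on-phase: current ramps up
    i_L += (V_IN - V_BATT) * (1 - D) * T / L  # off-phase: current into battery
    i_L = max(i_L, 0.0)                       # diode blocks reverse current
    print(f"cycle {cycle + 1}: i_L = {i_L:.2f} A")
```

Because the battery holds the output below the 24V the duty cycle wants, the on-phase gains more current than the off-phase gives up, so i_L climbs every cycle, exactly as described above.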
 

crutschow

Joined Mar 14, 2008
34,679
What is the limiting factor to what the converter can and cannot regulate?
The converter is designed for a maximum current output which depends mainly on the size of the switching transistor and the magnetics.
Beyond that, the circuit will either limit the current or fail, depending upon the design.
If you are charging a battery, then there needs to be some provision for limiting the current to a safe value for both the converter and the battery.
 

Ian0

Joined Aug 7, 2020
10,002
Next part to this:

If you have a little tiny buck converter, like a single IC, there has to be a certain point where, when you connect it to a sufficiently large battery (or a similarly arbitrarily large load), the converter cannot actually regulate the terminals.

What is the limiting factor to what the converter can and cannot regulate?
How strong is the regulator, so to speak, and what factors influence how strong it is?

How do you prove system stability at that point?
Most tiny little buck converters have a current limit circuit, which switches off the power transistor when the current gets too high. The effect is that the duty cycle is reduced to what it should be for the actual ratio of the input and output voltages, rather than the ratio which you wanted it to be.
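
As a rough sketch of that mechanism (ideal buck, made-up values): a cycle-by-cycle peak current limit cuts the on-time short whenever the inductor current reaches the limit, so the effective duty cycle settles around the actual Vout/Vin instead of the commanded value.

```python
# Cycle-by-cycle peak-current limiting in an ideal buck converter. The
# commanded duty cycle asks for 70 V out, but the battery clamps the output
# at 50 V; the limiter ends each on-phase at I_LIM, so the average effective
# duty cycle collapses toward V_BATT / V_IN = 0.5.
V_IN = 100.0   # input voltage (V)
V_BATT = 50.0  # output clamped by the battery (V)
L = 100e-6     # inductance (H)
T = 10e-6      # switching period (s)
D_CMD = 0.7    # commanded duty cycle (wants 70 V out)
I_LIM = 5.0    # peak current limit (A)

i_L = 0.0
for cycle in range(8):
    di_dt_on = (V_IN - V_BATT) / L  # current slope while the switch is on
    # Limiter may end the on-phase before the commanded on-time expires:
    t_on = min(D_CMD * T, max(I_LIM - i_L, 0.0) / di_dt_on)
    i_L += di_dt_on * t_on
    # Off-phase: inductor freewheels and ramps down into the battery.
    i_L = max(i_L - (V_BATT / L) * (T - t_on), 0.0)
    print(f"cycle {cycle + 1}: effective duty = {t_on / T:.2f}, i_L = {i_L:.2f} A")
```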
 

Thread Starter

mike _Jacobs

Joined Jun 9, 2021
223
Actually it's the same: you're thinking of them on different timescales.
In a boost converter, thinking of it on a cycle-by-cycle basis, when the transistor switches off, the only thing that is definite is the amount of current that is going to flow, and that's the current that is already flowing in the inductor. It has to remain the same, and it can do regardless of the voltage. So if it is connected to a battery, current flows into the battery, and the battery gets more charged and perhaps its voltage might go up a little bit.

Now, thinking of it on a more long-term basis.
If the battery voltage (this is still a boost converter) is lower than it really should be for the duty cycle, then the cycle will end before the current in the inductor has got back down to zero.
So the next cycle starts and the transistor switches on, but there is still current in the inductor.
During the transistor-on phase the current increases by ΔI. ΔI = Vt/L so ΔI is the same regardless of the actual value of I when the cycle started.
So the inductor current when the transistor switches off is higher than it was on the previous cycle.
And every cycle it keeps getting higher, until the ratio of input to output voltage is what it should be for the duty cycle.

That's the same as what happens in the buck regulator.
Hmmm. Not sure I fully understand this.

So I have built MPPTs before from a boost topology.
When I hook it up to a battery, the output terminals of the boost converter are always pinned to the battery voltage. As the battery becomes charged, the output follows the battery voltage up, until I give it 100% PWM and fully shunt the MPPT.

When I build a model of this in simulation I can see the behavior.


When I build a model of the buck and connect it to a battery, all of a sudden the battery terminals are at the regulation voltage of the buck, regardless of what the battery voltage was before connection.

The obvious difference here is that in the case of the MPPT, it gets pinned to the battery, and in the case of the buck, the battery gets pinned to the regulation voltage somehow...

Not sure why or how.
 

Ian0

Joined Aug 7, 2020
10,002
Hmmm. Not sure I fully understand this.

So I have built MPPTs before from a boost topology.
When I hook it up to a battery, the output terminals of the boost converter are always pinned to the battery voltage. As the battery becomes charged, the output follows the battery voltage up, until I give it 100% PWM and fully shunt the MPPT.

When I build a model of this in simulation I can see the behavior.


When I build a model of the buck and connect it to a battery, all of a sudden the battery terminals are at the regulation voltage of the buck, regardless of what the battery voltage was before connection.

The obvious difference here is that in the case of the MPPT, it gets pinned to the battery, and in the case of the buck, the battery gets pinned to the regulation voltage somehow...

Not sure why or how.
Because all buck or boost regulators include a current limiter, so the duty cycle will be limited to the value that equates to Vout/Vin at the current limit condition.
 

Thread Starter

mike _Jacobs

Joined Jun 9, 2021
223
Because all buck or boost regulators include a current limiter, so the duty cycle will be limited to the value that equates to Vout/Vin at the current limit condition.
I appreciate your efforts to explain. I get that the duty cycle controls the Vout/Vin ratio.

But they seem to be fundamentally different.
One clamps to the battery voltage and the other forces the battery terminal voltage to the regulated voltage.
I don't get why. Or maybe the clamping only applies to an MPPT boost design.
 

Ian0

Joined Aug 7, 2020
10,002
Maybe you underestimate the amount of current it takes to raise the voltage on a battery. Impedances are in the milliohm region.
It would like to increase the output voltage to match the duty cycle, but it can’t. The current limiting prevents it.
In the case of an MPPT, the amount of sunshine prevents it. The input current is inherently limited to the Isc of the solar panel.
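
To put rough numbers on that (the 5 mOhm figure is just an assumed example):

```python
# With battery impedance in the milliohm region, dragging the terminal
# voltage even 1 V above the cell EMF takes an enormous current.
R_BATT = 0.005  # assumed battery impedance (ohms)
DELTA_V = 1.0   # desired rise in terminal voltage (V)
print(f"{DELTA_V / R_BATT:.0f} A needed")  # 200 A needed
```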
 

ronsimpson

Joined Oct 7, 2019
3,131
A good battery charger has multiple modes of operation.
At first the charger should be in CC or constant current mode. It is not voltage regulating at all.
As the voltage gets close to a set point it moves over to constant voltage (CV) mode, and the current drops back.
There may be a trickle charge mode where the charger sends out a very small amount of current.
All of this can come from any type of power supply: buck, boost, buck/boost, etc.

Over voltage and over current shortens the life of the battery.

MPPT really complicates things. Its job is to get the most power out of something (solar, wind, a fuel cell, or even a gas generator). It really does not care what the load is. Combining MPPT and charging (CC/CV) is a challenge.

If the power source can make more power than the battery can take, we are in CC or CV charging mode. If the battery can take more power than the source can deliver, we are in MPPT mode, and the battery will be charged at a lesser current.
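
Here is a minimal sketch of that mode selection in Python; the setpoints and the 95% threshold are made up, and a real charger would run this inside its control loop:

```python
# Pick the charging mode and current command from the battery voltage.
# Setpoints are hypothetical; real values depend on the battery chemistry.
I_CC = 2.0        # constant-current (bulk) setpoint (A)
V_CV = 70.0       # constant-voltage setpoint (V)
I_TRICKLE = 0.05  # trickle current once the battery is nearly full (A)

def charge_mode(v_batt: float) -> tuple[str, float]:
    if v_batt < 0.95 * V_CV:
        return ("CC", I_CC)    # bulk: regulate current, not voltage
    elif v_batt < V_CV:
        return ("CV", I_CC)    # voltage loop takes over; current tapers
    else:
        return ("trickle", I_TRICKLE)

for v in (50.0, 68.0, 70.0):
    print(f"{v} V -> {charge_mode(v)}")
```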
 

Thread Starter

mike _Jacobs

Joined Jun 9, 2021
223
Maybe you underestimate the amount of current it takes to raise the voltage on a battery. Impedances are in the milliohm region.
It would like to increase the output voltage to match the duty cycle, but it can’t. The current limiting prevents it.
In the case of an MPPT, the amount of sunshine prevents it. The input current is inherently limited to the Isc of the solar panel.
I'm not sure what you mean...

So, following your suggestion, I made a simulation to try and help understand a bit more.
I made a buck converter in simulation. Then I used a current-limiting block that is available in the simulation software to limit the current on the output of the converter to 1 amp.

The regulation is set to 70V, which it does nicely. Then in simulation I connect a battery that is at 50V with 1 ohm of ISR.

Meaning at best I should get a 1V rise from the battery ISR and the 1 amp current limit.
However, regardless of all this, the battery terminals still jumped up to 70V, even with 1A and 1 ohm of ISR and a starting voltage of 50V.

I don't get it... How is the regulator making the battery terminal rise 20V with 1A of current?
 

Thread Starter

mike _Jacobs

Joined Jun 9, 2021
223
OK, I finally figured it out. I was able to put a fictional current limiter in there and was able to see the difference in voltage. The battery voltage was pinned to the regulation voltage just by V = IR and the series resistance of the battery.
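
For anyone following along, the arithmetic of that resolution (same hypothetical values as before) is just:

```python
# Terminal voltage is EMF + I*R, so whatever caps the charging current also
# caps how far the terminals can rise toward the regulation voltage.
V_REG, V_BATT, R_ISR = 70.0, 50.0, 1.0

def terminal_voltage(i_limit: float) -> float:
    i = min((V_REG - V_BATT) / R_ISR, i_limit)  # current up to the cap
    return V_BATT + i * R_ISR

print(terminal_voltage(i_limit=1.0))  # 51.0 V with a working 1 A limit
print(terminal_voltage(i_limit=1e9))  # 70.0 V with no effective limit
```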
 