MPPT and DC Conversion

Thread Starter

InPhase277

Joined Feb 15, 2018
23
Hello! First post, thanks for looking.

I have a couple of questions in the area of DC-DC converters, specifically buck style, and some crossover for maximum power point tracking in solar charge controllers.

1) Is the standard buck converter topology scalable for large (>50 amps) currents by just a suitable selection of components? I mean, are high current buck converters the same as low current buck converters, just bigger?

2) Solar MPPT. Everything I read about MPPT shows a single DC-DC conversion, but I can't figure out how that could allow for the impedance matching necessary for MPPT. It seems to me that you would need two conversions: one for controlling the solar input and one for battery charging. I say this because batteries have different charge modes depending on their state of charge, and I can't see how a single converter could give the batteries what they need while simultaneously altering the solar input side for tracking. It looks like a single converter would allow you to track the maximum power point, but then the output would not always be what a battery needs.

For example, a single converter could take 120 VDC from a solar array and convert it to 12 volts to charge a battery at 20 amps. So 240 watts going into the battery. How is the input side controlled to allow for MPPT?

Clear as mud questions I know. Thanks.
 

crutschow

Joined Mar 14, 2008
34,464
1) Basically, yes.

2) You can either do MPPT to get maximum power from the solar panel, in which case the battery must absorb all the power, or you can reduce the current to the battery if it's near full charge, in which case you are not doing MPPT and getting the maximum solar power.
You obviously can't do both at the same time.
Both these functions are typically done by one converter, performing MPPT when the battery can accept the charge, and limiting the power when the battery is full.
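That mode hand-off can be sketched in a few lines of logic. This is an illustrative sketch, not any particular controller's firmware; the 14.4 V threshold is an assumed absorption limit for a 12 V lead-acid bank:

```python
ABSORB_V = 14.4   # assumed absorption-limit voltage for a 12 V lead-acid bank

def controller_mode(battery_voltage):
    """Pick the single converter's operating mode from battery voltage alone."""
    if battery_voltage < ABSORB_V:
        # Battery can still absorb everything the panel gives: track the MPP.
        return "MPPT"
    # Battery near full: regulate battery voltage and abandon the MPP.
    return "VOLTAGE_LIMIT"
```

One converter runs both modes; only the control objective changes.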
 

Thread Starter

InPhase277

Joined Feb 15, 2018
23
Thanks. So would there be any benefit to using two converters back-to-back? Going backwards, let's say you have a converter dedicated to optimizing the battery's charge state. It would swing as high as, say, 15 V but never go below 13.8 V except when current limiting. This converter is fed from another converter that is always optimizing the V-I relationship of the solar panels. Obviously there would be feedback from the output to the input, attempting to drive the solar watts as close to the output watts as possible.

Is there any gain there?
 

crutschow

Joined Mar 14, 2008
34,464
No.
If the battery controller is limiting the power to the battery, then there's no purpose to optimizing the power from the solar panel.
 

Thread Starter

InPhase277

Joined Feb 15, 2018
23
I'm missing the point to MPPT obviously. Since the battery charger will always be limiting the power to the battery, when does tracking the maximum power point come into play?
 

crutschow

Joined Mar 14, 2008
34,464
The point you seem to be missing is that getting the maximum power from the panel and limiting the power to the battery at the same time are a contradiction and not possible.
MPPT is used when the battery can absorb all the power from the panel and you thus want to maximize the power from the panel.
That's the desired situation until the battery becomes charged.
If the battery can't absorb all the power from the solar panel, then you need a bigger battery.
Otherwise you are not utilizing the full power capability of the panel.
 

Thread Starter

InPhase277

Joined Feb 15, 2018
23
You're doing great. I understand a little better now. Don't take my incessant questions as anything but trying to understand.

So, if the battery needs a bulk charge and can take all that's given to it, how can the solar voltage be altered in order to not put too much voltage across the battery and at the same time find the MPP? Is it just a matter of setting a battery parameter in the controller and having that limit the voltage applied to the battery?
 
There is never too much voltage across the battery. When you are doing MPPT you are only stepping the voltage down to the battery voltage. You need not worry about the "output voltage" of the buck converter (until the battery voltage rises toward 13.8 V); just worry about the input voltage and current (hence power). When the battery/output voltage gets near 13.8 V, reduce the pulse width so the input power drops. Hope this helps!
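The usual way the input side finds the MPP is perturb-and-observe: nudge the duty cycle, measure panel power, and keep moving in whichever direction power rose. A minimal sketch of one iteration (the step size and the +1/-1 direction convention are assumptions for illustration, not from this thread):

```python
def p_and_o_step(duty, power, prev_power, direction, step=0.01):
    """One perturb-and-observe iteration.
    duty: present duty cycle; power: latest measured panel power (V * I);
    prev_power: power measured before the last perturbation;
    direction: +1 or -1, the sign of the last perturbation.
    Returns (new_duty, new_direction)."""
    if power < prev_power:
        direction = -direction          # last nudge reduced power: reverse
    new_duty = min(max(duty + direction * step, 0.0), 1.0)
    return new_duty, direction
```

Called every control tick, this hunts around the MPP; a real controller would also clamp duty whenever the battery voltage limit is reached.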
 

crutschow

Joined Mar 14, 2008
34,464
how can the solar voltage be altered in order to not put too much voltage across the battery and at the same time find the MPP?
You seem to have a misconception about how a battery acts when being charged.
When a battery is charging it exhibits a very low impedance, so if you try to raise the voltage it will just take more current without much change in its voltage. Thus there's not normally a problem of too much charging voltage to a battery that's not fully charged, only too much current.

So if the system is correctly designed, it will have a battery large enough to take all the current (power) the solar panel can deliver while the converter is extracting the maximum power from the panel.
Thus the MPP is used and the maximum energy is stored by the battery.

It's only when the battery approaches full charge and its voltage starts to rise that you need to reduce the current to avoid too high a battery voltage and battery overcharge. At that point the converter obviously can no longer operate in the MPP mode.
 

Thread Starter

InPhase277

Joined Feb 15, 2018
23
OK, I think I'm putting it together now. I suspected something along those lines because battery info always talks about charging current but only mentions voltage when it rises at the end of a charge.

So, just to be clear: If you had a 12 V battery that liked to be charged at 20 amps max, you could connect it to a 40 V source because the internal battery impedance will pull the source voltage down, and all that needs controlling is the current. As the battery voltage rises, you scale back the current until it reaches its rated voltage. Sound about right?
 

crutschow

Joined Mar 14, 2008
34,464
you could connect it to a 40 V source because the internal battery impedance will pull the source voltage down and all that needs controlling is the current.
Technically if it's a pure voltage source, then its voltage can't be pulled down.
It's the current regulating circuit that provides the voltage drop between the voltage source and the battery.
As the battery voltage rises, you scale back the current until it reaches its rated voltage. Sound about right?
Right.
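Putting numbers on that distinction, with the 40 V source and 20 A example from above: a series-pass (linear) current regulator burns the voltage difference as heat, while a buck converter instead draws proportionally less current from the source. A sketch (the 95% efficiency is an assumed round number):

```python
def linear_reg_loss(source_v, batt_v, charge_a):
    """Watts a series-pass (linear) current regulator must burn to drop
    the source voltage down to the battery terminals."""
    return (source_v - batt_v) * charge_a

def buck_input_current(batt_v, charge_a, source_v, eff=0.95):
    """Input current a buck converter draws for the same charge: it
    trades voltage for current instead of burning the difference.
    The 95% efficiency figure is an assumed round number."""
    return batt_v * charge_a / (source_v * eff)
```

At 40 V in and 12.6 V / 20 A out, the linear regulator dissipates over 500 W, which is why switching conversion is the only practical choice at these power levels.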
 

ebp

Joined Feb 8, 2018
2,332
I have actually designed several MPP switchers. They had three "error amplifiers" - that is, three interacting "closed loops" all controlling the same power converter.

One loop, the most basic for a conventional buck voltage regulator, controlled the output voltage. Providing nothing else was limiting, this established the maximum charging voltage for the battery.
One loop worked similarly, but established the maximum charging current for the battery.
The third was actually an input voltage regulator for the buck converter, and managed MPP when not limited by either of the other control loops. It established the minimum voltage at the input, so that the array voltage was close to MPP.
Typically, with full illumination on the array and a discharged battery, the current limiting would control conversion at first, then the input voltage regulator and finally the output voltage regulator.
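Those three loops contend for a single PWM, and whichever demands the least duty cycle wins (in analog controllers this is often a diode-OR of error-amplifier outputs; that arrangement is my assumption here, not stated in the post). A toy sketch of the arbitration:

```python
def converter_command(v_out_loop, i_out_loop, v_in_loop):
    """Each loop proposes a duty cycle (0..1): output-voltage loop,
    output-current loop, and input-voltage (MPP) loop. The most
    restrictive (lowest) demand controls the converter, mimicking
    diode-OR'd error amplifiers sharing one PWM comparator."""
    return min(v_out_loop, i_out_loop, v_in_loop)
```

With a discharged battery the current loop demands the least duty and wins; as the battery fills, control passes to the input-voltage loop and finally to the output-voltage loop, matching the sequence described above.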

MPP of a PV array varies primarily with temperature.
[edit to clarify - I said that poorly: MPP of course varies primarily with the amount of illumination on the array. However, the voltage at MPP does not vary a great deal with illumination until you get down to quite a low light level. The voltage at MPP does vary with temperature (in the same manner as the forward voltage of any PN junction diode does). The MPP voltage is generally a reasonably accurately predictable fraction of the open-circuit voltage at any particular temperature for a particular cell technology. I can elaborate on why this can be useful if the discussion goes in that direction.]

Except when you get to extremes, the MPP voltage is not dramatically changed with load current. If you want the very best MPP tracking, everything needs to be considered, but temperature is usually adequate, and even that can sometimes be ignored. If you want maximum power at -40°C, which often corresponds with very short solar day and substantial reduction in battery performance, then you often want every milliwatt you can scrounge.

With a power limited source, if you don't take special action a buck converter will simply "collapse" the input voltage to be equal to the output voltage. This is the "negative input resistance" nature of the beast. A buck will go to maximum duty cycle in trying to raise the output voltage. The Vout = Vin x duty cycle equation still applies, and since Vout is, in the short term, fixed by the battery, the input collapses.
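Rearranging Vout = Vin x duty cycle shows the collapse numerically. A sketch, using the 120 V array / 12 V battery figures from earlier in the thread:

```python
def buck_input_v(v_out, duty):
    """Rearranged Vout = Vin * D: the input voltage a power-limited
    source is dragged down to when the battery fixes Vout and the
    converter runs at the given duty cycle."""
    return v_out / duty

# With a 12.6 V battery and the converter railed near 100% duty, the
# panel is dragged down to ~12.6 V, nowhere near a 120 V array's MPP.
# Holding duty back to ~10.5% instead keeps the input up at 120 V,
# which is exactly what the input-voltage (MPP) loop does.
```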

A 50 amp buck converter is possible, but not something I would recommend to anyone without lots of experience, especially if you are talking about 120 volts input and 12 volts out. I've done 30A 30V buck converters, with somewhat lower input voltage. The switch must deal with the worst case of the output current and the input voltage, so for 50 A out and 120 V in, you'd probably need a 150 V 60-70 A switch - not impossible, but not cheap (for switchers, I virtually never looked at current rating, but ON resistance for FETs). Heat management in the power path can be quite a challenge.
At that power level a forward converter, a push-pull or a half-bridge is better suited. All of these are essentially buck converters with transformers for current step-up and voltage step-down. There are some significant issues with a half-bridge from a DC supply. It can be done, but it's more complex than when AC is the input. There are other topologies that can be applied, all with tradeoffs. I often say of switchers that everything is in conflict with everything else - and that isn't much of an exaggeration.

I should add that I used custom transformers and inductors for just about everything. There are now lots of really nice inductors you can buy off the shelf, but most are best suited for low-voltage outputs (i.e. they are low inductance and therefore suited for low volt·second product). As far as I know, there is nearly nothing available for off-the-shelf transformers for anything other than low power. You buy some cores, bobbins, magnet wire, foils, nomex, transformer tape, etc. etc. and have at it.
 

ebp

Joined Feb 8, 2018
2,332
If I were trying to do a really high current buck these days, I would consider an interleaved design. You have multiple copies of part of the power path working together, allowing each copy to be more manageable.
For example, you might use 3 buck circuits, all running at the same frequency but with the timing staggered so each reaches peak inductor current at a different time. Capacitors are a significant life-limiting factor in switchers, and interleaved designs can be much kinder to the caps - worth it for that reason alone in some cases.
 

ebp

Joined Feb 8, 2018
2,332
How you charge batteries in a PV system may differ considerably from how you would do it with an AC line powered charger. A great deal depends on what the system does and what compromises you are willing (or forced) to make.

I'm assuming lead acid batteries. They are best suited for many PV systems.

The battery can be "float charged" indefinitely. This means that the charger maintains some accurately regulated voltage across the battery. The current into the battery as it approaches full charge will drop without the charger taking any action. For best system performance and battery longevity, the float voltage should be temperature compensated. The exact (negative) temperature coefficient to use and the limits beyond which the coefficient should go to zero are very elusive and different sources will give different numbers.
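A sketch of that temperature compensation follows. As the post says, the exact coefficient and its limits are elusive; the 13.5 V base and -4 mV/degC per cell used here are typical round numbers from battery literature, not figures from this thread, so treat them as assumptions:

```python
def float_voltage(temp_c, v_float_25=13.5, tc_mv_per_c_cell=-4.0,
                  cells=6, t_min=0.0, t_max=50.0):
    """Temperature-compensated float voltage for a 6-cell (12 V)
    lead-acid battery. The base voltage and negative coefficient are
    assumed typical values; compensation is clamped outside
    [t_min, t_max], per the post's note that the coefficient should
    go to zero beyond certain limits."""
    t = min(max(temp_c, t_min), t_max)
    return v_float_25 + cells * tc_mv_per_c_cell / 1000.0 * (t - 25.0)
```

So a cold battery floats higher (about 14.1 V at 0 degC with these numbers) and a hot one lower, which is why chargers without compensation tend to undercharge in winter and gas in summer.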

If you want to bash the maximum amount of charge into a lead acid battery in the least amount of time, you must raise the terminal voltage above the float voltage unless the battery is in a deeply discharged state. This can also be useful to "equalize" the charge on the individual cells. But there is risk. Once the voltage gets above that which drives the desired oxidation-reduction processes of charging, it will begin to electrolyze water in the electrolyte. Sealed cells usually have a mechanism by which the hydrogen and oxygen are recombined, as long as the rate of generation is low enough. If the rate is too high, the pressure in the battery will rise high enough to open the safety vents, the gases will escape, and water is lost forever from the battery. With conventional open cells, the lost water can be replaced manually. Again, if the voltages are managed properly, the current can probably be left unregulated, but that is not necessarily true. You can be pretty mean to a lead-acid battery in terms of current as long as the duration of overcurrent is short.

In a PV system, it is often the case that the battery can safely take far higher charge current than what the array can deliver, but it depends on the intent of the system. I was involved with one system where the battery output cables were, iirc, increased to 000 AWG to handle the load, which hints at the battery size. The charger delivered 15 A maximum. I designed the charger with fast current regulation, but to protect the charger. Those batteries could probably have taken 200 A or more of charge current.

If performance in a PV system is important, use of a microcontroller to manage everything is generally much superior to other methods. In PV systems, the load is very often permanently connected to the battery, with implications in management of voltages and currents.
 

Thread Starter

InPhase277

Joined Feb 15, 2018
23
There's a lot to take in there, but good info!

My initial question was brought about by a lightning-damaged solar charge controller. In my day job I'm an electrician, so I was the one called to replace it. I took the damaged controller apart to look at the internals. The circuit board was vaporized so I couldn't analyze much of it, but there is a huge inductor (could be a toroidal transformer) and some big capacitors, so I figured buck converter. This unit was rated 40 amps at 12 or 24 volts, so I was curious how a single converter could regulate the voltage and current at the battery and simultaneously alter the voltage and current of the PV array. Now I have a better understanding of battery charging thanks to this thread and a lot of reading over the last week.

But I am curious to try to build one for my own battery system at my camp. Currently I just use a 5 amp CC/CV buck circuit that I cobbled together dead bug style with a PWM IC.
 