DC/DC Buck Converter

Discussion in 'The Projects Forum' started by eyybro, Dec 5, 2012.

1. eyybro Thread Starter New Member

Dec 5, 2012
Hello all,

I am currently doing a DC/DC buck converter application circuit project using a ROHM Semiconductor BD9778F.

My parameters can be viewed in the photo attachment.

My circuit is complete, and when I probe the output with no load I get 5 V, which is what I want.

I am currently in the process of generating a conversion-efficiency graph over 0 mA ~ 500 mA. When I hook up an 82 Ω resistor and a standard 3.3 V/20 mA LED in series with the output, the LED lights up and draws about 20 mA ~ 21 mA, as expected. However, the expected input current draw only occurs when Vin = 5 V. Every 1 V incremental increase in Vin decreases the input current; Vout stays at 5 V and the LED retains its original brightness as if nothing changed. This is problematic, because my graph needs to be taken at a test condition of Vin = 12 V, and the current draw decreases with every voltage increase.

Thank you!

Regards,
Andy Park

2. ErnieM AAC Fanatic!

Apr 24, 2011
That's exactly how a buck converter works. It is basically a power converter, not a voltage limiter.

A voltage limiter would have a constant input current over the input voltage range, since its input current is very close to Vout / Rload.

A buck converter adjusts the fraction of each switching period that Vin is connected to the inductor L (the duty cycle, roughly D = Vout / Vin). At a higher input voltage this on-time gets shorter and shorter, so less average input current is drawn.
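That relationship can be sketched numerically. This is a minimal sketch assuming an ideal, lossless converter regulating 5 V into a fixed 20 mA load (the values the original poster described); a real BD9778F circuit will draw somewhat more input current because of losses:

```python
# Ideal (lossless) buck converter: Vout is regulated at 5 V, the load draws 20 mA.
# Duty cycle D = Vout / Vin; average input current Iin = D * Iout.
VOUT = 5.0      # regulated output voltage, volts
IOUT = 0.020    # load current, amps

for vin in [5.0, 6.0, 8.0, 10.0, 12.0]:
    duty = VOUT / vin        # fraction of the period the switch is on
    iin = duty * IOUT        # average input current (lossless assumption)
    print(f"Vin = {vin:4.1f} V   D = {duty:.2f}   Iin = {iin * 1000:.1f} mA")
```

At Vin = 12 V the duty cycle is about 0.42, so the ideal input current drops to roughly 8 mA even though the load still draws 20 mA, which matches the behavior described in the first post.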

Try looking up buck converter on Wikipedia.

eyybro likes this.
3. thatoneguy AAC Fanatic!

Feb 19, 2009
Efficiency is measured by Power Out / Power In.

Power is voltage * current

Measure the voltage and current at the output and multiply them; do the same at the input. Divide output power by input power and you have the efficiency (typically in the 80%–95% range). If you end up with a number greater than 100%, your measurements are off.
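As a concrete sketch of that calculation (the measured values below are illustrative placeholders, not numbers from this thread):

```python
# Efficiency from four meter readings: eff = (Vout * Iout) / (Vin * Iin).
vin, iin = 12.0, 0.0095      # measured input voltage (V) and current (A)
vout, iout = 5.0, 0.0205     # measured output voltage (V) and current (A)

p_in = vin * iin             # input power, watts
p_out = vout * iout          # output power, watts
efficiency = p_out / p_in    # dimensionless ratio, typically 0.80-0.95

print(f"Pin = {p_in:.3f} W, Pout = {p_out:.3f} W, efficiency = {efficiency:.1%}")
if efficiency > 1.0:
    print("Efficiency above 100% -- recheck the measurements.")
```

Repeating those four measurements at each load point from 0 mA to 500 mA gives the data for the conversion-efficiency graph.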

Modern super-bright LEDs will "look" just as bright over a fair range of currents, so visual brightness is not an accurate way of judging current or efficiency.

eyybro likes this.
4. eyybro Thread Starter New Member

Dec 5, 2012
Shouldn't a buck converter do this:

Input - 12V / 20mA --> Output - 5V / 20mA

If a buck converter keeps decreasing the input current with every subsequent input voltage increase, wouldn't that make the buck converter useless since the appliance hooked up to the output of the converter wouldn't consume much power?

5. ErnieM AAC Fanatic!

Apr 24, 2011
No. The current is not constant through the circuit.

eyybro likes this.
6. eyybro Thread Starter New Member

Dec 5, 2012
This has been very helpful. After using the equation given by "thatoneguy" and evaluating the comments from "ErnieM", I was able to create a conversion-efficiency chart that looks similar to the one posted on the ROHM website, which confirms that I did my project correctly.

Thank you all and happy holidays!

-Andy Park

7. crutschow Expert

Mar 14, 2008
No. But a linear regulator would do that (with the efficiency decreasing as the input voltage increases).

As previously stated, the purpose of a switching converter is to do the voltage conversion with high efficiency. Thus, to keep the input power proportional to the output power, the input current must go down as the input voltage goes up (and vice versa). If the output current increases at a constant input voltage, the input current will also increase; once again the input power stays proportional to the output power. So the converter will always deliver its rated output current, independent of the input voltage (within its designed input-voltage range).

Remember, we are talking about conservation of energy (power) here, not conservation of current.
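That energy-balance argument can be sketched in a few lines. The 85% efficiency below is an illustrative assumption (a plausible figure for a converter like this), not a measured value:

```python
# Conservation of power: Pin = Pout / eff, so Iin = Pout / (eff * Vin).
EFF = 0.85                  # assumed conversion efficiency (illustrative)
VOUT, IOUT = 5.0, 0.020     # regulated output: 5 V at 20 mA
p_out = VOUT * IOUT         # 0.1 W delivered to the load

for vin in [5.0, 8.0, 12.0]:
    iin = p_out / (EFF * vin)   # input current falls as Vin rises
    print(f"Vin = {vin:4.1f} V -> Iin = {iin * 1000:.1f} mA")
```

The product Vin * Iin stays (nearly) constant while Iin itself drops, which is exactly why the original poster saw the input current fall as Vin was raised from 5 V to 12 V.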

Is that clear now?

spinnaker and eyybro like this.