# Buck Converter Strange Behaviour

#### NikoTek

Joined Sep 8, 2020
6
I am using a TLV62568 (https://www.ti.com/lit/gpn/tlv62568) synchronous buck converter. The TLV62568's output can be adjusted using two resistors, and I have configured it per the datasheet so that my output voltage is 2.8 V. I then attach a 400 ohm resistor as a load and vary the input voltage from 5 V down to 2.8 V using a DC supply. I expected my current draw from the supply to be greater than 7 mA (i.e. 2.8 V / 400 ohm). But I see some strange behavior: my current is 4.5 mA when the input voltage is 5 V, and it gradually increases up to 7 mA as I decrease the input voltage to 2.8 V, while my output voltage stays at 2.8 V the whole time. Could someone explain what is happening here?

#### Papabravo

Joined Feb 24, 2006
18,403
What you are seeing is the effect of energy storage in the inductor and energy release from the capacitor. The switch in the buck converter is only on part of the time; the rest of the time, the current in the load comes from the output filter capacitor. When the input voltage is at 5 volts, the implied duty cycle is 56%, and when the input approaches 2.8 volts it is nearly 100%. I'll have to check the datasheet to see if this particular part is happy with such duty cycle values. Some buck converters are not happy with duty cycles in excess of 50%.

I'm assuming here that your current numbers are coming from a display on your DC power supply. You should be able to verify that you are not getting something for nothing by measuring the actual current in the load. What you should observe is that power out will always be less than power in; the ratio is the efficiency of the power conversion process.
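Plugging in the numbers from the original post, the power-balance check looks like this (a rough sketch; the 4.5 mA figure is the supply reading reported at 5 V input):

```python
# Power-balance sanity check with the values reported in this thread.
v_out = 2.8        # regulated output voltage, V
r_load = 400.0     # load resistor, ohms
v_in = 5.0         # input voltage, V
i_in = 4.5e-3      # supply current reported at 5 V input, A

p_out = v_out**2 / r_load   # 19.6 mW delivered to the load
p_in = v_in * i_in          # 22.5 mW drawn from the supply
efficiency = p_out / p_in   # power out is less than power in, as expected

print(f"P_out = {p_out*1e3:.1f} mW, P_in = {p_in*1e3:.1f} mW, "
      f"efficiency = {efficiency:.0%}")
# P_out = 19.6 mW, P_in = 22.5 mW, efficiency = 87%
```

So the 4.5 mA reading is entirely consistent with an ~87% efficient converter, not with anything being lost or gained mysteriously.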

EDIT: Ah..yes. The datasheet says you can go to 100% duty cycle for achieving the lowest possible dropout voltage.


#### crutschow

Joined Mar 14, 2008
29,776
> But I see some strange behavior, where my current is 4.5mA when input voltage is 5v and then gradually increases and goes upto 7mA as I decrease the input voltage to 2.8v while my output voltage is 2.8v at all times of varying input voltage. If someone could explain what is happening here.
Not strange at all.
For a switching converter, the input power equals the output power plus a little extra, depending on the converter's efficiency.
So with a constant output power, the input current goes up as the input voltage goes down to keep the input power relatively constant.
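That constant-power behavior can be sketched numerically. The 90% efficiency below is an assumed round number for illustration; the real efficiency varies with the operating point:

```python
# Input current for a buck regulator at constant output power:
#   P_in = P_out / eta   =>   I_in = P_out / (eta * V_in)
p_out = 2.8**2 / 400.0   # 19.6 mW into the 400-ohm load
eta = 0.90               # assumed efficiency, for illustration only

for v_in in (5.0, 4.0, 3.3, 2.8):
    i_in = p_out / (eta * v_in)
    print(f"Vin = {v_in:.1f} V -> Iin = {i_in*1e3:.2f} mA")
# Vin = 5.0 V -> Iin = 4.36 mA  (close to the 4.5 mA observed)
```

Note that as Vin approaches 2.8 V the converter runs at ~100% duty cycle and essentially passes the input straight through, so the efficiency rises toward 100% and the input current converges on the 7 mA load current, exactly as observed.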