Hello all,
I am currently doing a DC/DC buck converter application circuit project using a ROHM Semiconductor BD9778F.
My parameters can be viewed in the photo attachment.
My circuit is complete, and when I probe the output with no load I get 5V, which is what I want.
I am currently generating a conversion efficiency graph over 0mA ~ 500mA. When I hook up an 82Ω resistor and a standard 3.3V/20mA LED in series with the output, the LED lights up and draws about 20mA ~ 21mA, as expected. However, the expected current draw only occurs when Vin = 5V. Each 1V increase of Vin decreases the current draw, while Vout stays at 5V and the LED retains its original brightness as if nothing changed. This is a problem because my graph needs to be taken at a test condition of Vin = 12V, and the current draw keeps decreasing with every voltage increase.
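For reference, the ~20mA figure above follows from Ohm's law across the series resistor. A minimal sketch of that check, assuming a fixed ~3.3V LED forward drop (the nominal value from the post):

```python
# Expected LED current with the regulated 5V output, an 82-ohm series
# resistor, and an assumed LED forward drop of ~3.3V.
v_out = 5.0      # regulated converter output (V)
v_led = 3.3      # nominal LED forward voltage (V)
r_series = 82.0  # series resistor (ohms)

i_led_ma = (v_out - v_led) / r_series * 1000.0  # current in mA
print(f"Expected LED current: {i_led_ma:.1f} mA")  # ~20.7 mA
```

This lands in the 20mA ~ 21mA range measured at the output, so the load side behaves as expected.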
Can anyone help me, please?
Thank you!
Regards,
Andy Park