Hi
I'm working on a power supply based on an SMPS followed by a linear regulator. The idea is to use a buck converter to drop the high input voltage to a value 3 V above the required output, then use a linear regulator to produce the final output. I'm using an LM2596 as the buck converter and an LM317 as the linear regulator.
I wanted to measure the output voltage using a PIC16F877A and display it on an LCD, so I used a voltage divider at the LM317's output to reduce the output voltage to a safe value.
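For context, this is roughly how I read the divider tap on the PIC side (a simplified sketch, not my exact code; the register values, the 4 MHz crystal, and the 4.25 ratio are assumptions):

```c
#include <xc.h>

#define _XTAL_FREQ    4000000UL  // assumed 4 MHz crystal, needed by __delay_us()
#define VREF          5.0f       // ADC reference = 5 V supply
#define ADC_MAX       1023.0f    // 10-bit converter
#define DIVIDER_RATIO 4.25f      // trimpot set so 21.25 V in gives 5 V at the tap

static void adc_init(void)
{
    TRISAbits.TRISA0 = 1;   // RA0/AN0 as input (divider tap)
    ADCON1 = 0x8E;          // right-justified result, AN0 analog, rest digital
    ADCON0 = 0x41;          // Fosc/8 ADC clock, channel 0, ADC on
}

static unsigned int adc_read(void)
{
    __delay_us(20);                 // acquisition time
    ADCON0bits.GO_nDONE = 1;        // start conversion
    while (ADCON0bits.GO_nDONE);    // wait for it to finish
    return ((unsigned int)ADRESH << 8) | ADRESL;
}

static float read_output_volts(void)
{
    // scale the 0-5 V tap reading back up to the real regulator output
    return adc_read() * (VREF / ADC_MAX) * DIVIDER_RATIO;
}
```

The value from read_output_volts() is what goes to the LCD after the usual float-to-string formatting.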
Now comes the question. The power supply is designed to give a variable voltage, and 21.25 V is the maximum. The circuit is fine and the voltages regulate correctly. I used a dummy load to test the supply's ability to withstand the current requirement, and so far I have tried up to 1 A; the regulator works fine. However, there is a problem at the voltage divider. I'm using a 10k trimpot as the divider to set the divide-point voltage, and I have tuned it so that when the power supply gives 21.25 V, the divide-point output is 5 V.
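For reference, the ratio that setting implies, assuming an ideal unloaded divider (the 2.35 k figure is my arithmetic, not a measurement):

\[
\frac{V_\text{div}}{V_\text{out}} = \frac{R_\text{bottom}}{R_\text{top} + R_\text{bottom}} = \frac{5}{21.25} = \frac{1}{4.25}
\]

so on the 10k trimpot the wiper sits roughly 10k / 4.25 ≈ 2.35 kΩ above ground.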
Under no-load conditions the divide-point voltage is correct and varies as I change the output voltage. But when I load the circuit and begin to draw current, the voltage at the divide point increases. Actually, when I load the regulator, the regulator's output drops slightly; I can see this on the multimeter. But the voltage at the dividing point increases rather than decreasing. It's funny, just like Ohm's law got inverted: as the voltage across the trimpot drops, the voltage at the divide point increases. (It should be decreasing, right?) Huh.
Got any idea what the heck is going on?
I have attached the linear regulator circuit here.
PS: I tried changing the trimpot but the results were the same.