# Need to drop voltage, requesting assistance with the calculation

Discussion in 'The Projects Forum' started by jellytot, Jun 9, 2016.

1. ### jellytot Thread Starter Member

May 20, 2014
72
0
Ahhhgh, my voltage regulator is overheating. After considering several different options (including switching power adapters), I've decided to drop the voltage a little before it gets fed into the regulator. I need help with one of the steps:
R = V / I, Where :
R = Resistance in Ohms
V = Voltage in Volts
I = Current in Amps
Let's say my wall adapter outputs 12V. I'm going to drop it to 10V, so V=10 in the equation above.

My questions:
1. I'm stuck on I (current). For this value, do I use the "normal" current draw of the project (approx 1 amp), the max rating of the power adapter (2 amps), or the "spike" rating (3.5 amps)? Explanation of "spike": the project periodically draws 3.5A, but only for extremely short periods (milliseconds).
2. When I use a resistor to drop the voltage going into the voltage regulator, will that also drop (decrease?) the current going into the voltage regulator? If so, how do I calculate this drop?
Thanks!

2. ### thumb2 Member

Oct 4, 2015
66
4
Your voltage regulator is overheating because the load current is high enough to make its power dissipation significant...
As long as your regulator's maximum output current is greater than the load current, the best solution is to add an appropriate heat sink.

BTW if you want to drop the voltage by 2 V, Ohm's law says R = (Vi - Vo)/I.

3. ### dannyf Well-Known Member

Sep 13, 2015
1,767
357
You cannot, because it is not possible within the confines of your topology.

There is no solution as long as your current approach stays.

4. ### GopherT AAC Fanatic!

Nov 23, 2012
5,982
3,713

If you are using a 5V regulator and a 12V supply, you need to waste 7V at your nominal 1 amp (7 W), with higher peaks on top of that.

So, you want to get the input of the regulator down closer to 7.5V (a regulator needs some headroom). So, at 1 amp, dropping 4.5V you need a 4.5 ohm resistor. Since those are rare, you can use a big (5 to 10 watt) 4.7 ohm resistor, or you can put three of those very common 10 ohm resistors in parallel to get 3.3 ohms. All should be good.
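The arithmetic above can be sanity-checked with a quick sketch (values taken from this post; the two helper functions are just for illustration):

```python
def dropper_resistance(v_in, v_target, i_load):
    """Resistance needed to drop the supply down to the target voltage at a given load current."""
    return (v_in - v_target) / i_load

def parallel(*resistors):
    """Equivalent resistance of resistors connected in parallel."""
    return 1 / sum(1 / r for r in resistors)

# 12 V supply, aiming for ~7.5 V at the regulator input, 1 A nominal load
r = dropper_resistance(12, 7.5, 1)   # 4.5 ohms
r_combo = parallel(10, 10, 10)       # three common 10-ohm parts -> ~3.33 ohms
print(r, round(r_combo, 2))
```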

Also, add a heat sink to the regulator - it helps more than you would think.

5. ### crutschow Expert

Mar 14, 2008
12,977
3,221
V in the equation is 2V, not 10V.
If you want the regulator input voltage to be 10V @ 1A then R would be (12-10) / 1A = 2Ω and the resistor would dissipate 2W (use a 3W or higher power resistor).
But if you draw more than 1A, the voltage will drop below 10V.
The short 3.5A spike could be handled by a large capacitor on the regulator output.
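Those numbers can be verified in a couple of lines (a sketch using the 12 V in, 10 V target, 1 A figures from this post; I² × R gives the power the dropping resistor must dissipate, which is why a 3 W or higher part is suggested):

```python
v_in, v_target, i_load = 12.0, 10.0, 1.0

r = (v_in - v_target) / i_load   # 2 ohms
p = i_load ** 2 * r              # 2 W dissipated in the resistor
print(f"R = {r} ohm, P = {p} W")
```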

What is the output voltage of your regulator?

6. ### thumb2 Member

Oct 4, 2015
66
4
Well, a schematic would be very useful.

7. ### jellytot Thread Starter Member

May 20, 2014
72
0
Hello all, thanks so far for the comments. So here are some details:
• 12V power adapter, feeding into a 5V regulator. I can't switch to a lower-voltage adapter; some components require 12V (they draw power upstream of the 5V regulator). I tried a 9V adapter and, sure enough, those components don't work.
• Yep. I've got the "best" heat sink on my voltage regulator. By "best", I mean it's got the lowest thermal resistance (natural convection) I could get for the space I have available. I even tried "hacking" the heat sink to maximize heat dissipation for its orientation. I may try another "hack" later.
• RE: "V in the equation is 2V, not 10V" - thanks for that! I was using an example I got off the net, and misread the instructions.
• RE: "But if you draw more than 1A, the voltage will drop below 10V." - noted. Thank you! The voltage regulator should be able to handle that, but I'll double check the specs. Plus, if I draw more current and the voltage drops, there's the added benefit of less heat from the regulator.