# Amperage vs. Voltage

Discussion in 'General Electronics Chat' started by EJacson, May 11, 2005.

1. ### EJacson Thread Starter New Member

May 11, 2005
2
0
Hello,

Here is a fairly basic question that I think I know the answer to, but I need some verification. If I take a 12 volt battery rated at 5 amps and put a resistor in line to drop the voltage to 6 volts (indicated by a VOM), will the amperage double to 10 amps or stay at 5 amps? Are there any considerations that I need to be aware of if I am trying to power a 6 volt device?

Thank you.

2. ### rukrazy? Member

Mar 5, 2005
21
0
There is a lot more to it than a simple answer; however, let's give it a try using Ohm's law.

E/I = R: 12 volts / 5 amps = 2.4 ohms
This is the maximum internal resistance of this battery.
Now you want 6 volts?
You have to use Kirchhoff's law: in a series circuit the current is the same at every point in the circuit.
You can reduce the voltage, but you will need two resistors to accomplish this; otherwise you have a parallel circuit with no reference to ground.

So if you want 6 volts you need two resistors of equal value. If you want 6 volts at 5 amps, use Ohm's law again: E/I = R, 6 volts / 5 amps = 1.2 ohms.
Compare this to the internal battery resistance: it is exactly half.
That means if you take two resistors of 0.6 ohms each (half of the 1.2 ohms) in series across the battery, the center connection between them gives you a tap at 6 volts, because each resistor drops half the battery voltage.
Since you are going to draw your load from the center tap, the load resistance will of course pull the total resistance across the battery down. That changes the total current, and the tap voltage changes along with it.
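
That loading effect is easy to see with numbers. A minimal sketch, using the 0.6 ohm divider values from above and a hypothetical 10 ohm load (the load value is just an illustration, not part of the original question):

```python
# Unloaded vs. loaded voltage divider: two equal resistors across a 12 V source.
V_IN = 12.0       # battery voltage
R_TOP = 0.6       # upper divider resistor (ohms)
R_BOTTOM = 0.6    # lower divider resistor (ohms)
R_LOAD = 10.0     # hypothetical load connected at the center tap (ohms)

# Unloaded tap voltage: simple divider ratio.
v_unloaded = V_IN * R_BOTTOM / (R_TOP + R_BOTTOM)

# Loaded: the load sits in parallel with the bottom resistor.
r_parallel = (R_BOTTOM * R_LOAD) / (R_BOTTOM + R_LOAD)
v_loaded = V_IN * r_parallel / (R_TOP + r_parallel)

print(f"unloaded tap: {v_unloaded:.2f} V")   # 6.00 V
print(f"loaded tap:   {v_loaded:.2f} V")     # sags below 6 V once loaded
```

The heavier the load (the lower its resistance), the further the tap voltage sags below 6 volts.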

One additional thought: if you use a load equal to the internal resistance of the battery, you get maximum power transfer to the load.
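
A rough numerical check of that matched-load idea, assuming an illustrative 2.4 ohm source resistance (the sweep range is arbitrary):

```python
# Maximum power transfer: sweep the load resistance against a source with
# internal resistance R_INT and see where the delivered power peaks.
V = 12.0      # source voltage
R_INT = 2.4   # assumed internal resistance (ohms), for illustration only

best = max(
    (r / 10 for r in range(1, 101)),  # candidate loads, 0.1 .. 10.0 ohms
    key=lambda r_load: (V / (R_INT + r_load)) ** 2 * r_load,  # P = I^2 * R
)
print(f"power into the load peaks near R_load = {best:.1f} ohms")  # 2.4 ohms
```

The peak lands right at the load that matches the source resistance.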

For almost all batteries you cannot reduce the voltage and get an increase in current. If you increase the source voltage, the current increases in direct proportion to the voltage. You can increase the resistance of the circuit and therefore increase the voltage dropped across it, but the current will decrease in direct proportion.
Hope this helps. :huh:

If your circuit requires 6 volts, you can find its resistance with an ohmmeter, then use a second resistor of equal value in series. That may solve your problem, because half the voltage would then be dropped by your resistor and half by your load. That's assuming, of course, you're using 12 volts as the source.
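
Worked through with numbers, assuming a hypothetical measured load of 24 ohms (any equal pair works the same way):

```python
# Series dropping resistor sketch: measure the load, match it with an equal
# series resistor, and half the source voltage appears across each.
V_SOURCE = 12.0   # battery voltage
R_LOAD = 24.0     # hypothetical load resistance measured with an ohmmeter

r_series = R_LOAD                       # equal value drops equal voltage
i = V_SOURCE / (r_series + R_LOAD)      # circuit current (amps)
v_load = i * R_LOAD                     # voltage across the load

print(f"current: {i:.3f} A")            # 0.250 A
print(f"load voltage: {v_load:.1f} V")  # 6.0 V
# The resistor burns off the other half as heat: I^2 * R watts.
print(f"resistor dissipation: {i**2 * r_series:.2f} W")  # 1.50 W
```

Note the dissipation line: the dropping resistor wastes as much power as the load uses, so pick its wattage rating accordingly.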

You can never increase the current by reducing the voltage, unless the load approaches a dead short. If you were to short the battery, you would overheat it and potentially cause it to explode.

3. ### EJacson Thread Starter New Member

May 11, 2005
2
0
Thank you for taking the time to answer my question. So, if I wanted 6 volts at 5 amps, I would have to use two resistors? In series? Would the value of each resistor be 1.2 ohms? Would that change after placing a load on the circuit?

I am trying to hook up an external battery for a CD player, and I was trying to calculate whether it would last longer if I used a larger battery (higher voltage) and reduced the voltage down to an amount that the CD player can use.

5. ### beenthere Retired Moderator

Apr 20, 2004
15,815
282
Hi,

There's a real problem in trying to use a fixed resistor to drop a voltage for an electronic device. Something simple like an LED would be fine, as its current draw does not vary, so the calculated value/wattage of the resistor would always be correct.

In a more active device, the load current will vary, and a fixed resistor will not be a good solution. A fixed voltage regulator is more likely the right approach. A 7806 regulator will cost about \$.78, and will supply the correct voltage over a much greater range of load currents than a fixed dropping resistor.
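
A quick sketch of why the fixed resistor fails once the load current varies, compared against an idealized 6 V regulator. The resistor is sized for a nominal 0.5 A draw, and the three load currents are made-up illustrations of a CD player's varying demand:

```python
# Compare a fixed dropping resistor against an ideal 6 V regulator as the
# load current varies. Resistor sized for a nominal 0.5 A load.
V_IN = 12.0
V_OUT = 6.0
R_DROP = (V_IN - V_OUT) / 0.5   # 12 ohms, correct only at exactly 0.5 A

for i_load in (0.1, 0.5, 1.0):  # amps; hypothetical varying CD-player draw
    v_resistor_supply = V_IN - i_load * R_DROP   # swings with the current
    v_regulator_supply = V_OUT                   # ideal regulator holds 6 V
    print(f"{i_load:.1f} A: resistor gives {v_resistor_supply:.1f} V, "
          f"regulator gives {v_regulator_supply:.1f} V")
```

At 0.1 A the resistor delivers 10.8 V, and at 1.0 A it delivers nothing at all; only at the one design current does it hit 6 V. (A real 7806 also needs some input headroom above 6 V and dissipates the excess as heat, so heatsink accordingly.)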