Here's my little bit for this thread: Ohm's law says that voltage equals current times resistance; resistance equals voltage divided by current; and current equals voltage divided by resistance. You probably already know that much.
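If it helps to see those three rearrangements spelled out, here's a minimal Python sketch (the function names are just my own labels, not anything standard):

```python
# Ohm's law in its three rearrangements.
def voltage(current_a, resistance_ohm):
    return current_a * resistance_ohm   # V = I * R

def resistance(voltage_v, current_a):
    return voltage_v / current_a        # R = V / I

def current(voltage_v, resistance_ohm):
    return voltage_v / resistance_ohm   # I = V / R

print(current(12.0, 1.0))  # 12 volts across 1 ohm -> 12.0 amps
```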
So "IN THEORY" if you have a 12 volt battery and a 1 ohm resistor comprising a circuit then you have 12 amps flowing through it. Let's bring this closer to real world: You have a 12 volt car battery capable of 100 cold cranking amps. You put a 1 ohm resistor across the terminals (and let's ignore a couple things - wattage and the absolute real world reactions to these theoretical circuits). Since the capability of the car battery is great, the 12 volts (let's assume 12 volts exactly) will hardly drop when you draw heavy current through it. The 12 volt battery (let's say) drops to 11.9 volts with the 1 ohm resistor in the circuit. The current is now 11.9 amps, not 12 amps. Now lets do the same test again but use a small 12 volt battery, something capable of 50 amps (ok, that's not so small) (stick with me here). When the 1 ohm load is placed in the circuit now the battery voltage drops to 11.8 volts and the current drops to 11.8 amps. Here's the important detail - the resistance doesn't change (in theory). It continues to be 1 ohm. So as the voltage drops so does the current. If you had a 12 volt battery capable of 10 amps and you put the 1 ohm resistor in circuit the battery voltage will drop significantly. Let's assume it drops to 9.8 volts. The current is going to be - yes, you guessed it - 9.8 amps. Real numbers will definitely be lower, but I use these as an example to explain how supply voltage depends on the ability to deliver current at that voltage. Exceed the capacity and the voltage drops significantly - until it reaches some equilibrium. Remember, the voltage changes and so does the current, but the load remains constant. In theory. Resistors DO change value with heat, so the notion that resistance doesn't change is not entirely accurate. But for the sake of understanding, assume a 1 ohm resistor is going to always be 1 ohm. Any voltage applied across it is going to result in the same current as the voltage.
What happens if we introduce ZERO ohms? In theory we have infinite amperage, since current is voltage divided by resistance and 12 ÷ 0 is undefined - the math blows up. In practical terms that doesn't happen. What you usually get is a failed circuit. Or a lot of heat.
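The same internal-resistance sketch shows why the current stays finite in practice: with a dead short (R_load = 0), the battery's own internal resistance is the only thing left limiting the current (again, the resistance value here is a made-up illustration):

```python
# With a dead short (R_load = 0), only the internal resistance
# limits the current: I = EMF / R_internal.
def short_circuit_current(emf_v, r_internal_ohm):
    return emf_v / r_internal_ohm

print(short_circuit_current(12.0, 0.0084))  # ~1429 A from the big battery
```

That current is huge but finite, and all 12 volts are dropped inside the battery and across the wiring - which is exactly where the heat (and the failed circuit) comes from.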
So "IN THEORY" if you have a 12 volt battery and a 1 ohm resistor comprising a circuit then you have 12 amps flowing through it. Let's bring this closer to real world: You have a 12 volt car battery capable of 100 cold cranking amps. You put a 1 ohm resistor across the terminals (and let's ignore a couple things - wattage and the absolute real world reactions to these theoretical circuits). Since the capability of the car battery is great, the 12 volts (let's assume 12 volts exactly) will hardly drop when you draw heavy current through it. The 12 volt battery (let's say) drops to 11.9 volts with the 1 ohm resistor in the circuit. The current is now 11.9 amps, not 12 amps. Now lets do the same test again but use a small 12 volt battery, something capable of 50 amps (ok, that's not so small) (stick with me here). When the 1 ohm load is placed in the circuit now the battery voltage drops to 11.8 volts and the current drops to 11.8 amps. Here's the important detail - the resistance doesn't change (in theory). It continues to be 1 ohm. So as the voltage drops so does the current. If you had a 12 volt battery capable of 10 amps and you put the 1 ohm resistor in circuit the battery voltage will drop significantly. Let's assume it drops to 9.8 volts. The current is going to be - yes, you guessed it - 9.8 amps. Real numbers will definitely be lower, but I use these as an example to explain how supply voltage depends on the ability to deliver current at that voltage. Exceed the capacity and the voltage drops significantly - until it reaches some equilibrium. Remember, the voltage changes and so does the current, but the load remains constant. In theory. Resistors DO change value with heat, so the notion that resistance doesn't change is not entirely accurate. But for the sake of understanding, assume a 1 ohm resistor is going to always be 1 ohm. Any voltage applied across it is going to result in the same current as the voltage.
What happens if we introduce ZERO ohms? In theory we have infinite amperage. Which in practical terms doesn't happen. 12 x 0 does not equal infinity. It usually equals a failed circuit. Or a lot of heat.