Now, just to let you guys know, I am a beginner relative to you all. I have read up on Ohm's law and a lot of other things. I've even gotten to the point of making a three-phase AC wind turbine generator (though I don't understand it all the way). However, something is really bothering me, and I hope it doesn't sound dumb (keep in mind that I'm only 15).

Ohm's law states that V = I × R, and the power law says that P = V × I (power in watts, voltage in volts, current in amps). However, when I apply these to a circuit, something strange happens. Let's say we have a 6 V battery, and for convenience's sake we're going to use it on a load (a resistor) with a constant resistance, not taking into account the wire's resistance. Now, a 6 V battery across a resistor of 100 ohms gives us 0.06 A, and 0.06 × 6 = 0.36 W. A circuit with the same battery and a load of 30 ohms gives us 0.2 A, and 0.2 × 6 = 1.2 W.

Now, tell me why the circuit with less resistance is consuming more power. I've always thought that big power-hungry machines in the home, like air conditioners, have a higher resistance, because a higher resistance means more energy output, right? I know that a short circuit will draw an unlimited amount of current (at least as much as the battery is capable of supplying), and that the power consumption must be pretty high. But there's no resistance (apart from the internal resistance of the battery)! Tell me why, particularly with my 6 V circuit example, please!
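To make sure I'm not fumbling the arithmetic, here is the calculation as a quick Python sketch. It assumes an ideal battery (no internal resistance, no wire resistance), and the `power_drawn` helper is just a name I made up for this sketch:

```python
# Current through and power drawn by a fixed resistor across an
# ideal battery, using Ohm's law (I = V / R) and P = V * I.
def power_drawn(volts: float, ohms: float) -> tuple[float, float]:
    current = volts / ohms   # amps
    power = volts * current  # watts; equivalently V**2 / R
    return current, power

for r in (100, 30):
    i, p = power_drawn(6.0, r)
    print(f"{r:>3} ohm load: I = {i:.2f} A, P = {p:.2f} W")
# → 100 ohm load: I = 0.06 A, P = 0.36 W
# →  30 ohm load: I = 0.20 A, P = 1.20 W
```

So the smaller resistor really does draw more power from the same battery, which is exactly the part that confuses me.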