A few days ago, I measured the current draw of a load powered by a 1.5 V AA alkaline battery. It was around 100 mA (and in a retest today with a fresh battery, the load drew 150 mA).
Today, I measured the same load, but this time powered by a 1.5 V (1 A max) wall adapter. This time, the current draw was 450 mA.
Why the difference? It's my understanding that a load will draw as much current as it needs, so if it really needs 450 mA, why didn't it draw that much from the battery (which I believe can output a maximum of around 2 A)? Is there an error in my understanding of how current works?
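To put my measurements into numbers, here's a quick sketch of how I'm thinking about it. The model is a pure assumption on my part: I'm treating the load as a fixed resistance and each supply as an ideal 1.5 V source with some series resistance, which may well be exactly where my mental model is wrong.

```python
# Rough arithmetic, assuming (my guess) the load acts like a fixed resistance
# and each supply is an ideal 1.5 V source plus some series resistance.

V_NOMINAL = 1.5  # volts, open-circuit

# If the adapter really holds 1.5 V while supplying 450 mA, the load looks like:
r_load = V_NOMINAL / 0.450           # ~3.33 ohm

# What series resistance would explain only 150 mA from the battery?
r_total = V_NOMINAL / 0.150          # 10 ohm total around the loop
r_series = r_total - r_load          # ~6.67 ohm on the battery/meter side

print(f"apparent load resistance  : {r_load:.2f} ohm")
print(f"implied series resistance : {r_series:.2f} ohm")
print(f"voltage left at the load  : {0.150 * r_load:.2f} V")  # ~0.5 V
```

If that model is right, the battery side would need several ohms of series resistance to explain the lower reading, which seems high for a fresh AA, so I'm not sure it's the whole story.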
It's actually better for me if the load uses less current (it's OK if the load doesn't "work" as well). I was going to switch the load from battery power to a 1.5 V regulator, but there aren't many 1.5 V regulators that can handle 0.5 A, plus 0.5 A would create a lot of heat in the regulator. Anyway, can anyone shed some light on what's going on?
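For context on the heat concern: a linear regulator burns (V_in - V_out) x I_load as heat. The input voltages below are placeholders, since I haven't chosen a supply yet.

```python
# Linear regulator dissipation: everything above the output voltage is burned
# as heat. Input voltages here are hypothetical -- I haven't picked a supply.

def linear_reg_dissipation(v_in: float, v_out: float, i_load: float) -> float:
    """Power dissipated in a linear regulator, in watts."""
    return (v_in - v_out) * i_load

for v_in in (3.3, 5.0):
    p = linear_reg_dissipation(v_in, v_out=1.5, i_load=0.5)
    print(f"{v_in:.1f} V in -> 1.5 V out at 0.5 A: {p:.2f} W in the regulator")
```

Even from 3.3 V, that's close to a watt in a small package, which is why I'd rather the load just drew less.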