Ohm's law and voltage sources question.

Thread Starter

Iamma

Joined May 13, 2022
8
I can't wrap my head around a concept regarding Ohm's law. If a voltage source is meant to introduce a constant voltage into a circuit, how does Ohm's law work? V = I*R, which to my understanding means the voltage changes depending on the current. These two concepts together don't make sense in my head.
 

crutschow

Joined Mar 14, 2008
34,044
I can't wrap my head around a concept regarding Ohm's law. If a voltage source is meant to introduce a constant voltage into a circuit, how does Ohm's law work? V = I*R, which to my understanding means the voltage changes depending on the current. These two concepts together don't make sense in my head.
Ohm's law applies for any resistance in the circuit.
An ideal voltage source has zero resistance so the voltage is not affected by the current.
 

Hymie

Joined Mar 30, 2018
1,272
Another way of looking at this is to consider a 12 V battery with nothing connected to the terminals. Since the resistance between the terminals is infinite, the current flow is zero. Now if you connect a 6-ohm resistor between the terminals, according to Ohm's law a current of 2 A will flow.
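The battery example above can be sketched in a few lines of illustrative Python (not from the post); the function name is just a label for Ohm's law solved for current:

```python
def load_current(volts, ohms):
    """Ohm's law solved for current: I = V / R.
    An open circuit is R = infinity, which gives I = 0."""
    return volts / ohms

open_circuit = load_current(12.0, float("inf"))  # nothing connected: 0 A
loaded = load_current(12.0, 6.0)                 # 6-ohm resistor: 2 A
print(open_circuit, loaded)
```

The source voltage stays at 12 V in both cases; only the current changes with the load.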
 

WBahn

Joined Mar 31, 2012
29,857
I can't wrap my head around a concept regarding Ohm's law. If a voltage source is meant to introduce a constant voltage into a circuit, how does Ohm's law work? V = I*R, which to my understanding means the voltage changes depending on the current. These two concepts together don't make sense in my head.
Ohm's Law is a mathematical description of the relationship between the voltage across a component and the current through it, and it only applies to components that obey Ohm's Law (which, not surprisingly, are known as ohmic components).

An ideal voltage source is not an ohmic component -- it will provide ANY current, be it positive, negative, or zero, that is needed to establish and maintain the nominal voltage across its terminals.
 

LowQCab

Joined Nov 6, 2012
3,937
A voltage source is generally not considered a "component" in a circuit, and is often not used as a factor in an Ohm's law calculation. The exception is that nothing is perfect: a real-world power supply or battery will have a small amount of internal resistance (or a substantial amount, as with very small batteries or single cells), which sometimes must be taken into consideration.

"Regulated" power supplies are generally considered to have virtually zero internal resistance, but of course everything has its limits and compromises.

A regulated power supply rated for, let's say, 5 A maximum is rated at 5 A because attempting to draw more than 5 A may cause the voltage to go out of regulation and start to sag to a lower output voltage (amongst other problems like overheating and shutting down, or increased noise and ripple).
There is always a limit on the maximum amount of current that can be drawn from ANY voltage source.
A true voltage source, by the usual assumption/definition, does not actually exist in real life.
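The internal-resistance point above can be made concrete with a small sketch (illustrative Python, values are made up): a real source behaves like an ideal EMF in series with its internal resistance, so the terminal voltage sags under load.

```python
def terminal_volts(emf, r_internal, r_load):
    """Loaded terminal voltage of a real source: the EMF divides
    across the internal resistance and the load resistance."""
    current = emf / (r_internal + r_load)  # Ohm's law on the whole loop
    return emf - current * r_internal      # drop across r_internal subtracts

# Hypothetical 1.5 V cell with 0.5 ohm internal resistance:
heavy = terminal_volts(1.5, 0.5, 1.0)    # heavy 1-ohm load: sags to 1.0 V
light = terminal_volts(1.5, 0.5, 100.0)  # light 100-ohm load: barely moves
print(heavy, light)
```

With zero internal resistance the function returns the EMF for any load, which is exactly the ideal-source case.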
 

BobTPH

Joined Jun 5, 2013
8,661
V = I R gives the voltage across a resistor when the resistance and current are known. If a voltage source is put across a resistor, the current that flows obeys that equation.
 

MisterBill2

Joined Jan 23, 2018
17,793
You are starting at chapter 2 of the basic DC theory course. Go back to page one and develop the initial understanding; then it will all make perfect sense to you. I am serious. Electricity is an area where new understanding is built on earlier understanding, and when built on the solid foundation of the most basic insights, it all makes perfect sense.
 

dl324

Joined Mar 30, 2015
16,677
If a voltage source is meant to introduce a constant voltage into a circuit, how does Ohm's law work? V = I*R, which to my understanding means the voltage changes depending on the current. These two concepts together don't make sense in my head.
What doesn't make sense?

If the power supply is operating within its specifications, Ohm's Law will hold.

If too much current is being drawn from the power supply and it starts to current limit, its output voltage will drop and Ohm's Law will still hold.
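The current-limiting behavior described above can be modeled with a short sketch (illustrative Python; the two-mode model and the numbers are assumptions, not from the post): in regulation the supply holds its set voltage, and at the limit it acts as a current source, so the output voltage follows Ohm's law at the limited current.

```python
def supply_output(v_set, i_limit, r_load):
    """Idealized regulated supply with a current limit.
    Returns (output volts, output amps). In regulation it holds v_set;
    once the load demands more than i_limit, it delivers i_limit and
    the voltage sags to i_limit * r_load (Ohm's law still holds)."""
    i_demanded = v_set / r_load
    if i_demanded <= i_limit:
        return v_set, i_demanded          # constant-voltage mode
    return i_limit * r_load, i_limit      # current-limited mode

# Hypothetical 5 V / 5 A supply: a 2-ohm load draws 2.5 A and stays
# in regulation, but a 0.5-ohm load demands 10 A, so the output
# current-limits and the voltage sags to 2.5 V.
print(supply_output(5.0, 5.0, 2.0))
print(supply_output(5.0, 5.0, 0.5))
```

In both modes the voltage across the load equals the current through it times its resistance, which is the point: Ohm's law never stops holding, only the supply's operating mode changes.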
 