If you have a power supply with its current limited, and you wire two resistors of different values in parallel across it, would the limited current make the voltage across the larger resistor 0V, because current tends to take the path of least resistance?
I'm curious and don't have this kind of power supply to test but here are 3 cases and what I'm thinking:
Schematic provided in attached pictures.
Voltage: 12V in each case.
I'm thinking like this: the smaller resistor is the path of least resistance, so current would tend to go through the smaller resistor before going through the larger one.
Case 1) Current is limited to 0.8 Amps (Less than enough current to get a 12V drop across smaller resistor)
My answers: Voltage across 15 Ohm resistor is 0V, voltage across 10 Ohm resistor is 8V.
Case 2) Current is limited to 1.2 Amps (Enough current to get a 12V drop across smaller resistor)
My answer: Voltage across 15 Ohm resistor is 0V, voltage across 10 Ohm resistor is 12V.
Case 3) Current is limited to 1.6 Amps (More than enough current to get a 12V drop across smaller resistor)
My answer: Voltage across 15 Ohm resistor is 6V, voltage across 10 Ohm resistor is 12V.
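To make the three cases concrete, here's a quick Python sketch of what I'd expect from the standard constant-voltage/constant-current (CV/CC) bench-supply model, where the supply folds its voltage back so the total current never exceeds the limit. I'm assuming that model is the right one, which may be exactly where my intuition breaks down, so please correct the model too if it's wrong:

```python
def supply_output(v_set, i_limit, resistors):
    """Voltage across a parallel resistor bank fed by a CV/CC bench supply.

    Assumes the textbook model: if delivering v_set would exceed i_limit,
    the supply drops its output so that total current equals i_limit.
    """
    # Equivalent resistance of the parallel bank (all branches share one voltage).
    r_par = 1 / sum(1 / r for r in resistors)
    # In current-limit (CC) mode, V = i_limit * r_par; otherwise V = v_set.
    v = min(v_set, i_limit * r_par)
    branch_currents = {r: v / r for r in resistors}
    return v, branch_currents

for i_lim in (0.8, 1.2, 1.6):
    v, currents = supply_output(12.0, i_lim, [10, 15])
    print(f"limit {i_lim} A -> {v:.1f} V across BOTH resistors, "
          f"branch currents (A): { {r: round(i, 3) for r, i in currents.items()} }")
```

Note this model predicts the same voltage across both resistors in every case (they're in parallel), which doesn't match my answers above; that disagreement is part of what I'm asking about.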
Are these answers correct theoretically? If any is wrong, please give me the correct answer for each case, numbered so I know which case you're correcting.
On another note, unrelated to the questions above: I'm thinking that when there is a short circuit, the reason the voltage across a resistor in parallel with the short is 0V is that essentially all of the (in theory infinite) current flows through the zero-resistance path, so no current remains to flow through the resistor. There is no "stop" in that path, so the current prefers it as the path of least resistance. Is this correct?
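Here's my short-circuit reasoning in numbers, assuming an ideal zero-ohm wire in parallel with the resistor (again, tell me if this is the wrong way to model it):

```python
def parallel(*resistances):
    """Equivalent resistance of resistors in parallel, handling an ideal short."""
    # An ideal 0-ohm branch dominates: the combination is 0 ohms,
    # so V = I * 0 = 0 across every branch no matter how much current flows.
    if any(r == 0 for r in resistances):
        return 0.0
    return 1 / sum(1 / r for r in resistances)

print(parallel(10, 0))   # resistor shorted by an ideal wire
print(parallel(10, 15))  # the two resistors from my cases above
```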
I hope this thread can be answered in simple terms; I just want my curiosity satisfied.
Thanks!