very stupid question on voltage and current

Thread Starter

nearownkira

Joined Feb 19, 2008
20
Let's say, for example:

Why is a 10V source that delivers 100mA better than a 10V source that delivers 50mA? Isn't it that the latter has more power dissipation, since P = IV?
 

Caveman

Joined Apr 15, 2008
471
I'm gonna take a swag at this horrendously vague question.

What I think you are trying to ask is:
"Why is a 10V source that is capable of delivering 100mA better than a 10V source that is capable of delivering 50mA?"

Because the former can do something the latter cannot. This is assuming everything else is the same, of course.

The second part of the question about power dissipation isn't really correct. The question is why the second one can't supply more than 50mA, and it is not necessarily because it has more power dissipation. For example, maybe its parts can't handle more current than that due to small-package thermal limitations.
 

Zak

Joined Jun 23, 2008
2
I would just like to add that, if the two voltages are the same (i.e. 10V and 10V), the one with the lower current will be better. Nowadays, power plants increase the voltage and decrease the current to allow the electricity to travel longer distances.
This is something you might like to know, since you're talking about power dissipation.
 

zamansabbir

Joined May 27, 2008
15
I think both of you are wrong, because every source has its own internal resistance, which is in series with the source. It delivers power, but if the current is high, the I^2*R loss will be high; that is why the resistance is kept high, to produce a low current and thus a small loss in supplying the power.
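
To put rough numbers on that, here is a minimal Python sketch. The 10V comes from the example above; the 1 ohm internal resistance is just an assumed value for illustration.

Code:
V_SOURCE = 10.0      # source EMF in volts (from the example above)
R_INTERNAL = 1.0     # internal resistance in ohms (assumed value)

for current in (0.05, 0.10):                 # 50 mA and 100 mA
    loss = current ** 2 * R_INTERNAL         # I^2 * R burned inside the source
    delivered = V_SOURCE * current - loss    # what actually reaches the load
    print(f"I = {current * 1000:.0f} mA: internal loss = {loss * 1000:.1f} mW, "
          f"power to load = {delivered * 1000:.1f} mW")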
 

rvb53

Joined Apr 21, 2008
28
Hi everyone,
I agree with Zak. When I was doing my summer internship at a power plant, they followed the same concept of increasing the voltage and decreasing the current.
 

Audioguru

Joined Dec 20, 2007
11,248
The power supply is supposed to efficiently deliver its power to the load. The load dissipates the power, not the power supply.

In ancient times a huge resistor in series with a huge zener diode would burn any extra power that the load did not use.
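
As a rough illustration of how much power that approach wastes, here is a minimal Python sketch. The 15V input, 50 ohm series resistor and 10V zener are assumed values, not anything specified above.

Code:
V_IN = 15.0      # volts into the regulator (assumed)
V_Z = 10.0       # zener voltage in volts (assumed)
R_SERIES = 50.0  # series resistor in ohms (assumed)

i_total = (V_IN - V_Z) / R_SERIES        # roughly constant current through the resistor
for i_load in (0.0, 0.05, 0.10):         # amps drawn by the load
    i_zener = i_total - i_load           # the zener soaks up whatever the load doesn't use
    p_zener = V_Z * i_zener              # power burned in the zener
    p_resistor = (V_IN - V_Z) * i_total  # power burned in the series resistor
    print(f"load {i_load * 1000:5.1f} mA -> zener burns {p_zener:4.2f} W, "
          f"resistor burns {p_resistor:4.2f} W")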
 

Ratch

Joined Mar 20, 2007
1,070
To the Ineffable All,

This is a power transmission question. The electric utilities try to crank the voltage on their transmission lines as high as practicable to reduce their I^2*R losses. A higher voltage means the current is smaller for the same power delivered, and the wires can be of lesser diameter, thereby saving weight and cost. The relative ease with which AC can be boosted to a higher voltage (transformer) and then lowered at the point of use (another transformer) gave it such a head start that T. Edison's DC scheme could never compete with it. Ratch
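
A quick Python sketch of that point, with assumed numbers (10 kW delivered over a line with 5 ohms of total resistance) just to show the scale of the effect:

Code:
P_LOAD = 10_000.0   # watts delivered to the load (assumed)
R_LINE = 5.0        # total line resistance in ohms (assumed)

for v_line in (1_000.0, 10_000.0):    # transmission voltage in volts
    i_line = P_LOAD / v_line          # current needed for the same delivered power
    p_loss = i_line ** 2 * R_LINE     # I^2 * R dissipated in the wires
    print(f"{v_line:8.0f} V -> {i_line:5.1f} A, line loss {p_loss:6.1f} W")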
 

veritas

Joined Feb 7, 2008
167
It seems to me as though at least one person in the reply chain is confused about what the maximum current rating means.

The current drawn is totally dependent on the load, not the power supply. If the load will draw 50mA or less at 10V, both power supplies will function exactly the same.

If the load draws more current (e.g. a smaller resistance for a linear circuit), then the 100mA supply will continue to function normally, but the supply rated for 50mA will start to lose voltage and/or overheat.
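
A minimal Python sketch of that idea, using assumed load resistances to show when each supply's current rating comes into play:

Code:
V_SUPPLY = 10.0   # volts, from the thread
RATINGS = {"100mA supply": 0.100, "50mA supply": 0.050}   # max amps each can deliver

for r_load in (500.0, 150.0, 80.0):   # load resistance in ohms (assumed values)
    i_load = V_SUPPLY / r_load        # Ohm's law: the load decides the current
    for name, i_max in RATINGS.items():
        ok = "OK" if i_load <= i_max else "over rating (voltage sags / overheats)"
        print(f"R = {r_load:5.0f} ohm -> {i_load * 1000:5.1f} mA from the {name}: {ok}")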
 