very stupid question on voltage and current

Discussion in 'Homework Help' started by nearownkira, May 6, 2008.

  1. nearownkira

    Thread Starter Member

    Feb 19, 2008
Let's say, for example:

    Why is a 10V source that delivers 100mA better than a 10V source that delivers 50mA? Isn't it that the latter one has more power dissipation, since P = IV?
  2. mik3

    Senior Member

    Feb 4, 2008
I think this is a stupid question :) !!! What exactly do you mean?
  3. Caveman

    Senior Member

    Apr 15, 2008
    I'm gonna take a swag at this horrendously vague question.

    What I think you are trying to ask is:
    "Why is a 10V source that is capable of delivering 100mA better than a 10V source that is capable of delivering 50mA?"

Because the former can do something the latter cannot: it can supply loads that draw up to 100mA. This is assuming everything else is the same, of course.

The second part of the question, about power dissipation, isn't really correct. The real question is why the second one can't supply more than 50mA. It is not necessarily because it has more power dissipation; for example, maybe its parts can't handle more current than that due to small-package thermal limitations.
  4. Zak

    New Member

    Jun 23, 2008
I would just like to add that if the two voltages are the same (i.e. 10V and 10V), the one with the lower current will be better. Nowadays, power plants increase the voltage and decrease the current to allow the electricity to travel longer distances.
    This is something you might like to know, since you're talking about power dissipation.
  5. zamansabbir


    May 27, 2008
I think both of you are wrong, because every source has its own internal resistance, which is in series with the source. The source delivers power, but if the current is high, the I^2*R loss in that internal resistance will be high; keeping the current low keeps that loss small while supplying the power.
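    The I^2*R point above can be checked with a quick sketch. The 0.5-ohm internal resistance and the two currents are assumed example values, not figures from the thread:

    ```python
    # Power dissipated inside a source's own internal resistance, P = I^2 * R.
    # The 0.5-ohm resistance is a hypothetical example value.

    R_INTERNAL = 0.5  # ohms, assumed internal resistance of the source

    def internal_loss(current_a: float) -> float:
        """Power (watts) lost inside the source itself: P = I^2 * R."""
        return current_a ** 2 * R_INTERNAL

    for i in (0.050, 0.100):  # 50 mA vs 100 mA
        print(f"{i * 1000:.0f} mA -> {internal_loss(i) * 1000:.2f} mW lost internally")
    ```

    Doubling the current quadruples the internal loss, which is the heart of the I^2*R argument.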
  6. rvb53


    Apr 21, 2008
    hi everyone,
    I too agree with Zak. When I was doing my summer internship at a power plant, they followed the same concept of increasing voltage and decreasing current.
  7. Audioguru


    Dec 20, 2007
    The power supply is supposed to efficiently deliver its power to the load. The load dissipates the power, not the power supply.

    In ancient times, a huge resistor in series with a huge zener diode (a shunt regulator) would burn off any extra power that the load did not use.
  8. Ratch

    New Member

    Mar 20, 2007
    To the Ineffable All,

    This is a power transmission question. The electric utilities try to crank the voltage on their transmission lines as high as practicable to reduce their I^2*R losses. A higher voltage means the current is smaller for the same power delivered, and the wires can be of smaller diameter, thereby saving weight and cost. The relative ease with which AC can be boosted to a higher voltage (with a transformer) and then lowered at the point of use (with another transformer) gave it such a head start that T. Edison's DC scheme could never compete with it. Ratch
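    The transmission-line point can be sketched with assumed numbers (the 10 kW load and 1-ohm line resistance below are hypothetical example figures):

    ```python
    # Compare I^2*R line loss when the same power is delivered at two voltages.
    # The 10 kW load and 1-ohm line resistance are assumed example values.

    R_LINE = 1.0       # ohms, assumed resistance of the transmission line
    P_LOAD = 10_000.0  # watts delivered to the load

    def line_loss(volts: float) -> float:
        """I^2*R loss (watts) for the current needed to deliver P_LOAD at `volts`."""
        current = P_LOAD / volts  # I = P / V
        return current ** 2 * R_LINE

    print(line_loss(1_000.0))   # at 1 kV:  I = 10 A, loss = 100 W
    print(line_loss(10_000.0))  # at 10 kV: I = 1 A,  loss = 1 W
    ```

    Raising the voltage tenfold cuts the line loss by a factor of one hundred, since the loss scales with the square of the current.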
  9. veritas

    Active Member

    Feb 7, 2008
    It seems to me as though at least one person in the reply chain is confused about what the maximum current rating means.

    The current drawn is totally dependent on the load, not the power supply. If the load draws 50mA or less at 10V, both power supplies will function exactly the same.

    If the load draws more current (e.g. a smaller resistance for a linear circuit), then the 100mA supply will continue to function normally, but the supply rated for 50mA will start to lose voltage and/or overheat.
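    The point that the load sets the current follows directly from Ohm's law. A small sketch, with assumed example resistor values:

    ```python
    # The load, not the supply, determines how much current flows: I = V / R.
    # The resistor values below are assumed examples.

    V_SUPPLY = 10.0  # volts

    def load_current_ma(r_load_ohms: float) -> float:
        """Current (mA) a resistive load draws from the 10 V supply."""
        return V_SUPPLY / r_load_ohms * 1000

    print(load_current_ma(500))  # 20 mA: within both the 50 mA and 100 mA ratings
    print(load_current_ma(125))  # 80 mA: only the 100 mA-rated supply can deliver this
    ```

    With the 500-ohm load, both supplies behave identically; with the 125-ohm load, only the supply rated for 100mA stays in regulation.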