A question about amperes - Can you help?

Discussion in 'General Electronics Chat' started by mycal, Aug 4, 2008.

  1. mycal

    Thread Starter New Member

    Aug 4, 2008
    3
    0
    I have a question:

    Do I understand correctly that the amperage draw of a device is determined by the internal resistance and "current needs" of the device?

    Therefore, it does not matter whether I use a 15 VDC 1.5 A power supply or a 15 VDC 5 A power supply, because the device will determine the actual amperage taken from the power supply?

    Thus, if the device actually needs/uses 1 A, both of these supplies would work without harming the device? Or does excess available amperage put the device in danger of over-current?

    The only way the amperage to the device would increase is by raising the output voltage or lowering the internal resistance of the device?
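
    To make my question concrete, here is the rough Ohm's-law sketch I have in mind (the 15 ohm load is just a made-up figure, not a real device):

        # Ohm's law: the load, not the supply's current rating, sets the current drawn.
        V_SUPPLY = 15.0                 # supply voltage in volts
        R_DEVICE = 15.0                 # assumed effective load resistance in ohms
        current = V_SUPPLY / R_DEVICE   # amps actually drawn by the device
        print(current)                  # 1.0 A, whether the supply can source 1.5 A or 5 A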

    My email is psycan@gmail.com -- Tom.
     
  2. tibbles

    Active Member

    Jun 27, 2008
    249
    3
    Hi Tom,
    I'm sure the experts here will be getting back to you on this, but I'm 99% sure you are correct.
    I'd like to tax them a little more: those old enough to remember dynamos will recall that they had a current regulator, while alternator systems do not?
    Regards,
    Dougal
     
  3. hgmjr

    Moderator

    Jan 28, 2005
    9,030
    214
    You have the right idea.

    hgmjr
     
  4. Ratch

    New Member

    Mar 20, 2007
    1,068
    3
    mycal,

    It is like drinking from a water fountain. You only swallow what you want or need.

    Ratch
     
  5. SgtWookie

    Expert

    Jul 17, 2007
    22,182
    1,728
    That depends on the supply. If you are talking about VOLTAGE REGULATED power supplies, then you would be correct.

    However, if you were talking about unregulated "wall wart" type supplies, then "your mileage may vary".

    An unregulated supply will usually put out within 10% of the voltage specified when the specified current is drawn. However, if a much lower current is being drawn, the voltage could be considerably higher.

    Therefore, if a wall wart rated 15 VDC 1.5 A were used to power a device rated 15 VDC 1.5 A, it would be fine. However, if a 15 VDC 5 A wall wart were used to power the same device, the actual input voltage could be quite a bit higher, and the device could be damaged.
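
    As a rough illustration of why that happens (the no-load voltage below is an invented figure, not a measurement of any particular wall wart):

        # Crude Thevenin model of an unregulated 15 V / 5 A wall wart: it only
        # sags down to its nameplate 15 V when the full rated current is drawn.
        V_NO_LOAD = 19.0   # assumed open-circuit voltage (hypothetical)
        V_RATED = 15.0     # nameplate voltage at rated load
        I_RATED = 5.0      # nameplate current in amps
        r_internal = (V_NO_LOAD - V_RATED) / I_RATED   # 0.8 ohm equivalent resistance

        for i_load in (5.0, 1.5, 0.1):
            v_out = V_NO_LOAD - i_load * r_internal
            print(f"{i_load:.1f} A load -> about {v_out:.1f} V at the device")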
     
  6. mycal

    Thread Starter New Member

    Aug 4, 2008
    3
    0
    Thank you all very much for the replies.

    I am looking at having to use a computer power supply rated 15 V 5 A for a 15 V 1.5 A device. Would this be a voltage-regulated supply, unlike the wall warts, or should I connect a voltage regulator into the output line?
     
  7. hgmjr

    Moderator

    Jan 28, 2005
    9,030
    214
    The majority of computer power supplies are fairly well regulated, so you should be safe to proceed.
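
    If the supply had turned out to be poorly regulated and you did add a series linear regulator, the rough dissipation math would look like this (the 18 V raw input is just a made-up figure for illustration):

        # Power a series linear regulator would have to burn off while dropping
        # a hypothetical 18 V raw input to the 15 V the device needs.
        V_IN = 18.0     # assumed unregulated input voltage (hypothetical)
        V_OUT = 15.0    # regulated output the device wants
        I_LOAD = 1.5    # worst-case device current in amps
        p_dissipated = (V_IN - V_OUT) * I_LOAD
        print(p_dissipated)   # 4.5 W -- plan on a heatsink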

    hgmjr
     
  8. iamspook

    Member

    Aug 6, 2008
    27
    0
    Don't forget, too, that a source with HUGE amp delivery capability can potentially give SPECTACULAR results under fault conditions (like flames, big sparks, and busted components). So sometimes a lab supply with a short-circuit crowbar on its output and adequate but limited current capability is better.
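
    A back-of-the-envelope sketch of the point (all the figures are invented for illustration):

        # With a short across the output (say 0.1 ohm of wiring and contact
        # resistance), the fault "asks for" far more current than the load ever did.
        V_SUPPLY = 15.0
        R_FAULT = 0.1                                   # assumed fault resistance in ohms
        i_demanded = V_SUPPLY / R_FAULT                 # 150 A demanded by the short
        i_from_big_supply = min(i_demanded, 30.0)       # stiff supply: sparks and smoke
        i_from_limited_supply = min(i_demanded, 1.6)    # current-limited bench supply
        print(i_demanded, i_from_big_supply, i_from_limited_supply)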
     
  9. mycal

    Thread Starter New Member

    Aug 4, 2008
    3
    0
    Thank you!
    I ended up using a 15 V 1300 mA power supply on a 15 V 1500 mA device. It is working fine. I would think the device only gets near the 1500 mA draw at peaks.
     
  10. mindmapper

    Active Member

    Aug 17, 2008
    34
    0
    One thing to watch out for is that switch-mode power supplies often need a minimum load. If loaded too lightly, the risk is that they will not start up, or will deliver noise/garbage on the output.
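
    If that turns out to be an issue, a bleeder resistor can guarantee the minimum load; here is a rough sizing sketch (the 100 mA minimum is an invented figure, check the supply's datasheet):

        # Sizing a bleeder resistor to guarantee a minimum load on a 15 V output.
        V_OUT = 15.0
        I_MIN = 0.1                    # assumed minimum-load spec in amps (hypothetical)
        r_bleeder = V_OUT / I_MIN      # 150 ohms
        p_bleeder = V_OUT * I_MIN      # 1.5 W continuously wasted as heat
        print(r_bleeder, p_bleeder)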
     