I need some serious understanding of this hobby....help? Million questions!

Discussion in 'General Electronics Chat' started by stanman11, Dec 31, 2014.

  1. stanman11

    Thread Starter Member

    Nov 23, 2010
    I love this hobby but I'm a self-learner. I've been told many different things, like "voltage doesn't matter, only amps do." So how can you use an LED on 12 V with a resistor if it's 3 V, but you will burn out a 6 V motor using 12 V?
    If an LED is 3 V 30 mA, why is it V/I = R, even if the PSU is 12 V 300 mA or 12 V 100 mA?
    I've built some circuits before so I get some basic concepts.
    Also, I live in an apartment with bad lighting. I was thinking about buying some LED light strips to hang around. If I use a 12 V PSU (adapter), can I be more frugal on electricity rather than using the 110 V 45 W bulbs?
    Of course I'd have to check the LEDs' data for their mA rating.

    0.03 × 1000 comes out to 30 W, right?
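    (A quick sanity check of that last line, sketched in Python: power is volts times amps, and multiplying 0.03 by 1000 only converts amps to milliamps. The 12 V and 30 mA figures are taken from the post above.)

```python
# Power is voltage times current: P = V * I (current in amps).
# Assumed figures from the post: 12 V supply, 30 mA (0.030 A) of LED current.
p_watts = 12.0 * 0.030
print(p_watts)  # 0.36 W -- not 30 W; 0.03 * 1000 is just the current expressed in mA
```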
  2. MrChips


    Oct 2, 2009
    It's all quite simple actually. Take one step at a time.


    Don't overwhelm yourself with too much all at the same time.

    It all comes down to one thing and that is Ohm's Law.

    I = V/R

    Look at this simple formula. There are three letters which we call variables.

    I = current
    V = voltage
    R = resistance

    These three are inter-related. They are NOT independent.

    If R is fixed, that is, we don't change the load, then I is dependent on V and V is dependent on I. You cannot change one without affecting the other.

    So if V = 12V and R = 600Ω
    then I must be V/R = 12V/600Ω = 0.020A or 20mA

    It doesn't matter if the power supply is rated at 12 V @ 100 mA or 12 V @ 10 A.
    It will still deliver 12 V @ 20 mA into a 600 Ω load.
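    That point can be sketched in a few lines of Python (values from the example above; any supply rated at or above the load current behaves the same):

```python
# Ohm's law: the load resistance and the applied voltage fix the current.
# The supply's current figure is only a *rating* (a maximum), not a demand.
def current_amps(voltage_v, resistance_ohms):
    """I = V / R"""
    return voltage_v / resistance_ohms

i = current_amps(12.0, 600.0)
print(f"{i * 1000:.0f} mA")  # 20 mA, whether the supply is rated 100 mA or 10 A
```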
  3. MrAl

    Well-Known Member

    Jun 17, 2014

    You mean for driving LEDs. Yes, the current is the main thing as long as you have enough voltage too.

    The LED can run on 12 V with a dropping resistor because the resistor drops most of the voltage. If the LED is 3 V, then the resistor drops 9 V.
    The resistor wastes power, though, so it is best to match the supply voltage to the LED voltage as closely as possible, then use a resistor to drop the remaining voltage. The remaining voltage is called the "overhead" voltage. You need some overhead voltage in case the power supply and/or LEDs shift voltage slightly, which they will do during normal operation.

    For your case, if you use 3 LEDs in series, that brings it up to 9 V total, and then a dropping resistor handles the remaining overhead voltage. The resistor is always calculated from:

    R = (E - vLED) / I

    where
    E is the power supply voltage (12 V),
    vLED is the total LED series voltage (9 V),
    and I is the desired current (20 mA).
    So you have:

    R = (12 V - 9 V) / 0.020 A

    You can do the math.

    Note however that if the LEDs are the white kind, they might drop more like 3.5 volts not 3.0 volts.
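    The resistor formula above can be checked with a short sketch (the 3-LED, 20 mA numbers are from the post; the 3.5 V case is the white-LED variation just mentioned):

```python
def dropping_resistor_ohms(supply_v, led_v_total, current_a):
    """R = (E - vLED) / I -- the resistor that drops the overhead voltage."""
    return (supply_v - led_v_total) / current_a

# Three 3.0 V LEDs in series (9 V total) at 20 mA from a 12 V supply:
print(round(dropping_resistor_ohms(12.0, 9.0, 0.020)))   # 150 ohms
# Three 3.5 V white LEDs (10.5 V total) leave only 1.5 V of overhead:
print(round(dropping_resistor_ohms(12.0, 10.5, 0.020)))  # 75 ohms
```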
  4. takao21203

    Distinguished Member

    Apr 28, 2012
    The electric power dissipated across a component depends on the current actually flowing through it,

    which is determined by the component's resistance and the applied voltage.

    It's important to know that voltage sources also have an internal resistance, so the voltage will drop as the current increases. That's why you can use depleted 9 V batteries to test LEDs.
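    A rough sketch of that internal-resistance effect (the resistance values here are illustrative assumptions, not measurements):

```python
def terminal_voltage(emf_v, internal_r_ohms, load_r_ohms):
    """Voltage divider: the battery's internal resistance and the load share the EMF."""
    return emf_v * load_r_ohms / (internal_r_ohms + load_r_ohms)

# A fresh 9 V battery (assumed ~2 ohm internal) barely sags under a 100 ohm load:
print(f"{terminal_voltage(9.0, 2.0, 100.0):.1f} V")    # about 8.8 V
# A depleted one (assumed ~150 ohm internal) sags far enough to be gentler on an LED:
print(f"{terminal_voltage(9.0, 150.0, 100.0):.1f} V")  # 3.6 V
```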
  5. k7elp60

    Active Member

    Nov 4, 2008
    I would like to add to MrAl's comments. The LED current affects its brightness. Many LEDs are clearly bright enough at much less than the specified maximum current. If you have an assortment of resistors, try a higher value than the one calculated in the circuit and see if the LED is still bright enough. In my experience, I use 1.5 kΩ 1/2 W resistors for single LEDs on 12 V DC supplies, even the white ones.
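    A quick sketch of what that 1.5 kΩ choice gives (assuming a 3.5 V white LED, per MrAl's note above):

```python
def led_current_amps(supply_v, led_v, resistor_ohms):
    """I = (Vsupply - Vled) / R"""
    return (supply_v - led_v) / resistor_ohms

i = led_current_amps(12.0, 3.5, 1500.0)
print(f"{i * 1000:.1f} mA")  # about 5.7 mA -- well below a typical 20 mA maximum
```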
  6. Wendy


    Mar 24, 2008