Why does the current get low?

OBW0549

Joined Mar 2, 2015
3,566
LED is a diode, right?
Yes.

it has a voltage drop, doesn't it? (between 0.3 and 0.7 volts)
Yes, it has a voltage drop, but more like two to three volts at normal current levels (a couple of milliamperes) because of the particular semiconductor materials it is made from (e.g., GaAsP instead of Si); generally, the shorter the wavelength of light the LED emits, the higher its voltage drop will be. Blue, violet and ultraviolet LEDs will have the highest voltage drop, and red and infrared LEDs the lowest.

Germanium diodes (no longer used very much) have a voltage drop of around 0.3 volts at normal current levels, and silicon diodes approximately 0.65 volts.

What drops the voltage across a diode? A resistor?
No resistor. It's a matter of semiconductor physics; see here for details.

If so, then why do we use a resistor in series with it?
Because there is no resistor in the LED.
 

Tonyr1084

Joined Sep 24, 2015
7,905
Think of a diode as a one-way gate. Connect a diode directly across a battery and, once it conducts, it's basically a short circuit. If the diode happens to be an LED with a 3 volt forward rating then it's no wonder you saw 3 volts when you measured it. You said the current drops. Well, for a small battery like that, a near dead short will definitely pull it down, because the battery's internal resistance limits how much current it can deliver. Especially if the battery is not a new battery.

I once got hold of an 80 volt battery. It was about twice the size of a 9 volt transistor battery but it was considered to be "dead". When I read the nominal voltage (no load) it read 80 volts. But when I hooked a red LED to it (many years ago) the LED would glow somewhat dimly. Under normal circumstances a red LED on an 80 volt source with no current limiter (resistor) would have flashed like a camera flash bulb and been a done deal. The reason it didn't was that the battery, while it showed 80 volts, could deliver almost no current. It truly was a dead battery. But it could light an LED.

I went on to use it as a continuity tester for Christmas lights. If I hooked a bulb in series with the LED and the bulb was good, the LED would glow, telling me that bulb was good. Understand that these Christmas lights were low voltage bulbs, around 2.4 volts each; a string of 50 would add up to 120 volts. But if any one bulb burned out the whole string would go out, and you'd have to sit there and swap bulb after bulb until you found the dead one. Testing them this way (I didn't have the understanding back then to use an ohmmeter on the bulbs) meant finding the bad bulb without going through the whole string. And sometimes more than one was burned out, so swapping bulb after bulb would have meant nothing unless you changed absolutely every bulb on the string.

OK, back to your LED and why the voltage was 3 volts: the LED was acting like a near dead short, so the voltage across it settled at its forward voltage. But voltage is not the same as current. Voltage is likened to "pressure"; current is likened to the flow of electrons. You can have high current at very low voltage, or vice versa. Or both can be high, or both can be low. How they relate is set by how much resistance is in the circuit.

Ohm's law says that voltage equals current multiplied by resistance. So if you have 1 amp through 100 ohms then the voltage MUST be 100 volts. In your case you had a limited amount of current available from what most of us suspect was a weak battery. That is why your LED didn't go POOF! Someone mentioned a car battery. I 100% agree, you'd have made an LED flash bulb, good for one use.

The right way to light an LED is to know its forward voltage. Subtract that from your supply voltage (assuming a full charge), then use Ohm's law to calculate the resistance needed to give you 10 mA (ten milliamps, or 0.01 amps) of current. In your case I'd start with 9 volts and subtract 3 volts, leaving me 6 volts. Wanting a current of 10 mA, I'd divide 6 (volts) by 0.01 (amps) and come up with 600 ohms. That way I could measure the voltage across the LED and find 3 volts, and measure the voltage across the resistor and get 6 volts. Kirchhoff's voltage law says the sum of all the voltage drops equals the supply voltage. So 6 + 3 = 9, the 9 volts I started with.
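The calculation above can be sketched in a few lines of code. This is just the post's own example numbers (9 V supply, 3 V forward drop, 10 mA target), not the ratings of any particular LED:

```python
# Series-resistor calculation for an LED, per Ohm's law:
# R = (V_supply - V_forward) / I_target

def series_resistor(v_supply, v_forward, i_target):
    """Resistance that sets the LED current to i_target amps."""
    return (v_supply - v_forward) / i_target

r = series_resistor(9.0, 3.0, 0.010)
print(r)  # ~600 ohms

# Kirchhoff's voltage law check: the two drops sum to the supply.
v_resistor = 0.010 * r   # ~6 V across the resistor
v_led = 3.0              # 3 V across the LED
print(v_resistor + v_led)  # ~9 V, the supply voltage
```

In practice you'd round to the nearest standard resistor value (e.g. 560 Ω or 680 Ω); the LED current shifts only slightly.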

Why 10 mA? Because MOST LEDs will work just fine on that current. Remember, LEDs are not voltage devices, they are current devices. If you had 120 volts and wanted 10 mA through it then (120 - 3) / 0.01 = 11,700 ohms (11.7KΩ) for a total of 0.01 amps. The same LED would be just as bright in either circuit. But with such a high voltage, subtracting the forward voltage of the LED isn't going to make much difference. If I ignored the forward voltage of 3 volts I'd use 120 / 0.01 = 12,000Ω (12KΩ), and the actual current would be (120 - 3) / 12,000 = 0.00975 amps, a difference of only 250 micro-amps. Certainly nothing that's going to blow up anybody's circuit.
 

Thread Starter

booboo

Joined Apr 25, 2015
168
@Tonyr1084
Hey Tony
I'm not sure, but I think you're an angel! Your post was great!

I have a DMM (VC9805) and it has a diode mode. Measuring an LED with a 1.6V forward voltage works fine, but it doesn't work (it just shows 1) when I try to measure the forward voltage of an LED with a 3.2V forward voltage (the LED we were talking about in this topic). Is there any limitation for DMMs?
 

dl324

Joined Mar 30, 2015
16,943
I have a DMM (VC9805) and it has a diode mode. Measuring an LED with a 1.6V forward voltage works fine, but it doesn't work (it just shows 1) when I try to measure the forward voltage of an LED with a 3.2V forward voltage (the LED we were talking about in this topic). Is there any limitation for DMMs?
My DMM will only check LEDs with a forward voltage below 1.999V. Above that, the LED lights dimly and the displayed value indicates overrange. The test current is around 1 mA, but it varies with loading.
 