# Regarding diode theory

Discussion in 'General Electronics Chat' started by mentaaal, May 31, 2007.

1. ### mentaaal Thread Starter Senior Member

Oct 17, 2005
451
0
Hey there guys, I have been reading the allaboutcircuits texts on diodes and understand it all, except that they don't really mention what happens when you don't use a series resistor in the circuit at all. For example, I was fiddling with some LEDs the other day and had one connected to two batteries in series, so the output of the batteries was something like 3.2 volts unloaded. I also read the biasing voltage of the LED with the diode-check function on my multimeter; the LED measured something like 1.8 volts. As I said, I connected it all up in series so that the LED was in a circuit with a 3 volt DC supply, and it worked just fine. I am not entirely sure what is happening here. According to the allaboutcircuits texts, if there were a series resistor, 1.8 volts would be dropped across the LED and the remainder would be dropped across the resistor (assuming a perfect battery, cables, etc.).

So, as I don't have a resistor in the circuit, is the remainder of the voltage being dropped across the circuit leads and the batteries themselves, or across the LED?

Cheers!

2. ### lightingman Senior Member

Apr 19, 2007
374
22
LEDs do have a forward voltage of around 1.2 volts or so. The resistor is there to limit the current through the LED... Daniel.

3. ### cumesoftware Senior Member

Apr 27, 2007
1,330
10
Depends on the LED that you use. The shorter the wavelength of the light emitted, the greater the voltage drop at a stated current. New red LEDs may have a drop of 1.9V at 20mA.

4. ### Pootworm Member

May 18, 2007
29
0
I'd think that the full voltage would still drop across the LED, but the current that it draws would be much, much higher than at its 1.8V (or whatever) rating (current is exponentially related to voltage in diodes). Can you measure the current in the circuit, and maybe find the max current/voltage rating for your LED?
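The exponential relationship Pootworm mentions is the Shockley diode equation, I = I_s(e^(V/(nV_T)) - 1). A minimal Python sketch of it follows; the saturation current and ideality factor below are illustrative guesses roughly matched to a red LED, not values for any particular part:

```python
import math

def diode_current(v, i_s=3e-19, n=2.0, v_t=0.02585):
    """Shockley diode equation: I = I_s * (exp(V / (n*V_t)) - 1).

    i_s (saturation current) and n (ideality factor) are assumed
    values chosen for illustration; real LEDs vary widely.
    v_t is the thermal voltage at room temperature (~25.85 mV).
    """
    return i_s * (math.exp(v / (n * v_t)) - 1)

# A small increase in forward voltage multiplies the current:
i1 = diode_current(1.8)
i2 = diode_current(1.9)
print(i2 / i1)  # about 6.9x the current for only 0.1 V more
```

This is why "the full voltage drops across the LED" is dangerous: a supply only a few tenths of a volt above the LED's rated drop can push many times the rated current.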

5. ### cumesoftware Senior Member

Apr 27, 2007
1,330
10
The drop would occur entirely across the LED. New red LEDs can handle 3V, but they will be overdriven (more than 50mA for sure). You should use a resistor, even with 3V, so the LED will last longer.
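Sizing that resistor is just Ohm's law applied to the resistor's share of the supply voltage. A quick sketch, where the 20 mA target is an assumed typical rating rather than a figure from the thread:

```python
def series_resistor(v_supply, v_forward, i_target):
    """R = (V_supply - V_forward) / I: the resistor takes up the
    difference between the supply and the LED's forward drop."""
    return (v_supply - v_forward) / i_target

# e.g. a 3 V supply, a 2 V LED drop, aiming for 20 mA:
r = series_resistor(3.0, 2.0, 0.020)
print(r)  # 50.0 ohms; round up to a nearby standard value (51 or 56 ohms)
```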

6. ### mentaaal Thread Starter Senior Member

Oct 17, 2005
451
0
Thanks a lot for the reply... yeah, that was what I was inclined to think, except that I was always told that the LED's voltage drop doesn't really change (well, a little here or there, per the diode equation), so I was wondering what would happen. I have only ever seen examples of LEDs connected in series with a resistor, where the remaining voltage is always dropped across the resistor.

7. ### cumesoftware Senior Member

Apr 27, 2007
1,330
10
It changes quite a bit, but not too much. It would not change at all in an ideal LED, but it surely does in a real one. To give you an idea, a typical high-efficiency red LED from Kingbright would give:
0mA - 1.7V
5mA - 1.9V
10mA - 1.95V
15mA - 1.98V
20mA - 2.00V
30mA - 2.05V
40mA - 2.1V
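As a rough illustration, the figures above can be tabulated and interpolated to estimate the drop at intermediate currents. Linear interpolation is my simplification here; the real curve is exponential, as noted earlier in the thread:

```python
# (current in A, forward voltage in V) pairs from the table above
curve = [(0.000, 1.70), (0.005, 1.90), (0.010, 1.95),
         (0.015, 1.98), (0.020, 2.00), (0.030, 2.05), (0.040, 2.10)]

def vf_at(i_f, points=curve):
    """Linearly interpolate the forward voltage at current i_f (amps)."""
    for (i0, v0), (i1, v1) in zip(points, points[1:]):
        if i0 <= i_f <= i1:
            return v0 + (v1 - v0) * (i_f - i0) / (i1 - i0)
    raise ValueError("current outside the measured range")

print(vf_at(0.025))  # 2.025 V, halfway between the 20 mA and 30 mA points
```

Notice how flat the curve is: doubling the current from 20 mA to 40 mA only raises the drop by 0.1 V, which is exactly why a bare battery can overdrive an LED so easily.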

An LED is no more than an ordinary diode, made of GaAs or other compound semiconductors instead of silicon, and designed to emit visible light (or infrared) when forward biased.