Hey there guys, I've been reading the All About Circuits texts on diodes and understand it all, except they don't really mention what happens when you don't use a series resistor in the circuit at all. For example, I was fiddling with some LEDs the other day and had one connected to two batteries in series, so the output of the batteries was something like 3.2 volts unloaded. I also measured the LED's forward voltage with the diode-check function on my multimeter, and it read about 1.8 volts. As I said, I connected it all up in series so the LED was in a circuit with a roughly 3 volt DC supply, and it worked just fine. I'm not entirely sure what is happening here. According to the All About Circuits texts, if there were a series resistor, 1.8 volts would be dropped across the LED and the remainder would be dropped across the resistor (assuming a perfect battery, cables, etc.).
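Just to show the textbook case I mean, here's a quick sketch of the numbers with a hypothetical series resistor in the loop (the 150 ohm value is just something I picked for illustration; the supply and LED voltages are the ones from my measurements above):

```python
# Series-resistor case from the All About Circuits explanation.
# Assumed values: 3.0 V supply, 1.8 V LED drop, hypothetical 150 ohm resistor.
V_SUPPLY = 3.0    # battery voltage (V), treated as ideal
V_LED = 1.8       # LED forward voltage (V), from the multimeter diode check
R_SERIES = 150.0  # hypothetical series resistor (ohms)

# Kirchhoff's voltage law: the resistor drops whatever the LED doesn't.
v_resistor = V_SUPPLY - V_LED   # voltage across the resistor
i_led = v_resistor / R_SERIES   # Ohm's law then gives the loop current

print(f"{v_resistor:.1f} V across resistor, {i_led * 1000:.0f} mA through LED")
# -> 1.2 V across resistor, 8 mA through LED
```

So with a resistor the arithmetic all adds up, which is exactly why I'm confused about where the leftover 1.2 V goes when the resistor isn't there.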
So since I don't have a resistor in the circuit, is the remainder of the voltage being dropped across the circuit leads and the batteries themselves, or across the LED?
Cheers!