Calculating Required Voltage Drop In A Circuit

Thread Starter

RAMBO999

Joined Feb 26, 2018
230
Sounds easy enough. Right? But I'm not getting the desired effect.

A simple circuit with a supply and an LED. See attached. (The LED's rated range is 3.2 V to 3.8 V.)

So here's what I have done.

Put a 3.4V supply across the circuit. LED shines. Supply unit tells me the current through the circuit is 51 mA.

So, I deduce that 51 mA makes my LED shine brightly.

Now, if I want to up the supply to 9 V, I need to create a voltage drop of 5.6 volts in the circuit between the supply and the LED.

I do my calculation using V = iR, i.e. V = 5.6 V, i = 0.051 A, so R = 5.6/0.051 ohms ≈ 110 ohms.

So I put my 110 ohms in place, and it has very little effect.
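For reference, the arithmetic above can be written out as a short script (values taken straight from the post; this reproduces the calculation as stated, it doesn't validate the approach):

```python
# Reproducing the calculation from the post above.
# A 3.4 V supply drove the bare LED at 51 mA; the goal is a 9 V supply.
v_supply = 9.0      # new supply voltage (V)
v_led = 3.4         # voltage previously measured across the LED (V)
i_led = 0.051       # current previously measured through the LED (A)

v_drop = v_supply - v_led       # voltage the resistor must drop
r = v_drop / i_led              # Ohm's law, R = V / I
print(f"R = {r:.0f} ohms")      # -> R = 110 ohms
```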

What am I doing wrong here?

Thanks
 


RAMBO999
A bit more information.

With the 110 ohm resistor in place and at 3.4 V supply, I get the following voltage readings.

Source: 3.388V
Resistor: 0.582V
LED: 2.805V

Current through the circuit is now 4 mA.

So I am clearly using the wrong approach to calculating this resistance, because I am not taking the changing current into consideration.
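A quick sanity check on those readings (values from the post; the small mismatch with the ~4 mA reading would come down to meter and resistor tolerances):

```python
# Consistency check on the measured readings
# (110 ohm resistor, nominal 3.4 V supply).
v_source, v_resistor, v_led = 3.388, 0.582, 2.805
r = 110.0

# Kirchhoff's voltage law: the two drops should sum to the source voltage.
print(f"{v_resistor + v_led:.3f} V")   # 3.387 V, matching 3.388 V to meter precision

# The resistor drop implies the loop current (I = V / R):
i = v_resistor / r
print(f"{i * 1000:.1f} mA")            # ~5.3 mA, same order as the ~4 mA reading
```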
 

dl324

Joined Mar 30, 2015
10,072
So, I put my 110 Ohms in place and it has very little effect.
Your eye can't detect small differences in brightness; even if you have two LEDs operating at different currents close enough to make a comparison. If you're looking at the same LED operating at different currents, you can't make an objective comparison.
 

RAMBO999
Your eye can't detect small differences in brightness; even if you have two LEDs operating at different currents close enough to make a comparison. If you're looking at the same LED operating at different currents, you can't make an objective comparison.
Why would I go on brightness? I am measuring the variation in voltage across the LED.
 

RAMBO999
"With the 110 ohm resistor in place and at 3.4V supply I get he following voltage readings."

Sounds like you forgot to change to 9 volts.
I changed the voltage to 9 V. The LED blew instantaneously. I then repeated the exercise, incrementing the voltage 1 volt at a time. The LED blew at 4 volts. That's why I measured the voltages across the two components at 3.4 V (see comment #4), a voltage the LED can handle. They are:

Source: 3.388V
Resistor: 0.582V
LED: 2.805V

Which doesn't make sense, because on its own the LED draws 51 mA of current at 3.4 V, which gives an ESR of 66 ohms at that voltage. So looking at those numbers, according to Kirchhoff's law the voltage drop should be 2.01 V across the 110 ohm resistor and 1.37 V across the LED. Not enough to light the LED up. But it lights up.
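The fixed-resistance model can be tested directly against the measurements. A small sketch of that check (values from the thread):

```python
# Testing the "LED as a fixed ~66 ohm resistor" model against the measurements.
v_source = 3.388
r_series = 110.0
r_led = 3.4 / 0.051          # "ESR" from the bare-LED measurement, ~66.7 ohm

# If the LED really behaved like a fixed resistor, simple voltage
# division would predict:
v_led_pred = v_source * r_led / (r_series + r_led)
v_res_pred = v_source - v_led_pred
print(f"predicted: resistor {v_res_pred:.2f} V, LED {v_led_pred:.2f} V")
# predicted: resistor 2.11 V, LED 1.28 V

# Measured: resistor 0.582 V, LED 2.805 V.  The linear model fails
# because a diode's effective resistance changes steeply with current.
```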
 

RAMBO999
LEDs don't have an abrupt turn on voltage.
Yes, I am aware of that; I referred to a range in my original post, I believe. But I do think this behaviour might have something to do with the fact that it is an LED in the circuit. They are diodes and do have capacitance after all. 30 pF in this case. And a minimum activation voltage of 2.8 V.
 

dl324
30 pF in this case. And a minimum activation voltage of 2.8V.
If you're not switching the LED, junction capacitance is irrelevant. Even if you're switching, it's rarely a factor that needs to be considered.

What datasheet gives a minimum "activation" voltage? What brightness level is considered "activation"?
 

RAMBO999
Why do you keep mentioning ESR? When the diode starts conducting enough to emit detectable light, diode resistance should be small compared to the current limiting resistor and can be ignored.
Because the LED has a capacitance of 33 pF and can be considered to offer some resistance in the circuit, as any capacitor does. In this case, when it is the only thing in the circuit at 3.4 V, the measured current is 51 mA. That works out to a resistance of 66 ohms at that particular voltage.
 

crutschow

Joined Mar 14, 2008
24,403
I changed the voltage to 9V. The LED blew instantaneously.
Obviously you didn't have the resistor in series with the LED. :eek:

LEDs are diodes and are not ohmic devices.
You cannot calculate the "ESR" by dividing the operating voltage by the operating current, as that value is basically inversely proportional to the current.
As has been stated, LEDs are current-operated devices, not voltage-operated.
If you keep acting as if they are not, you will not understand how they work.
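The point about V/I being current-dependent can be illustrated with the Shockley diode equation. This is a minimal sketch with made-up example parameters (the saturation current and ideality factor below are hypothetical, not from any datasheet for this LED):

```python
import math

# Illustrative Shockley diode model: I = Is * (exp(V / (n*Vt)) - 1),
# inverted here to get V as a function of I.  Parameter values are
# hypothetical examples, chosen only to show the shape of the curve.
i_s = 1e-18      # saturation current (A), hypothetical
n = 2.0          # ideality factor, hypothetical
v_t = 0.025      # thermal voltage at room temperature (V)

def led_voltage(i):
    """Forward voltage for current i under the Shockley model."""
    return n * v_t * math.log(i / i_s + 1)

# At the two currents seen in the thread, the forward voltage barely
# moves, yet the "resistance" V/I changes by over 10x:
for i in (0.004, 0.051):
    v = led_voltage(i)
    print(f"I = {i * 1000:4.1f} mA  V = {v:.2f} V  V/I = {v / i:6.1f} ohm")
```

This is why dividing operating voltage by operating current gives a number that depends entirely on where you happen to be operating.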
 

RAMBO999
Would anyone care to comment on the original question: the calculation of R in the circuit, i.e. the resistance required to produce a 5.6 V voltage drop in that circuit? Some advice, perhaps, along the lines of:

Is the method of calculation I have used correct or incorrect? If not, does anyone have a correct method?

Has anyone arrived at a different value from the one I arrived at? I would appreciate seeing your method and could put it to the test.

Thanks
 

Tonyr1084

Joined Sep 24, 2015
4,192
Voltage source: 9 volts DC (battery)
Forward voltage drop: unspecified
Design current: 50 mA
(All LEDs have a specific forward voltage: red is typically, but not always, 2 volts; green and blue are around 3 volts; others can be higher or lower.)

Solution:
9V - unspecified = Unresolved voltage
Unresolved voltage ÷ 50 mA = Unresolved resistance

Without knowing the forward voltage it's not possible to determine the proper resistance.

ASSUME:
9V DC
3.1 Vf
50 mA

Solution:
9V - 3.1Vf = 5.9V working voltage
5.9V (wv) ÷ 50 mA (0.05A) = 118Ω

OBSERVATION:
50 mA is too much for conventional LEDs. By conventional I mean those that typically have two leads and are soldered into a board (or the SMD type). The typical recommended MAX current for those is 30 mA. 20 mA is a far better current; the LEDs don't burn as bright or as hot, but are still VERY bright.

Suggested solution:
9 -3.1 = 5.9
5.9 ÷ 20 mA (0.02A) = 295Ω

Suggest trying a 300 Ω resistor.
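The solution above boils down to one formula, which can be sketched as (values are the assumed 9 V / 3.1 Vf figures from this post):

```python
def led_series_resistor(v_supply, v_forward, i_led):
    """Standard LED series-resistor formula: R = (Vs - Vf) / I."""
    return (v_supply - v_forward) / i_led

# Assumed values from the post above: 9 V supply, 3.1 V forward drop.
print(f"{led_series_resistor(9.0, 3.1, 0.050):.0f} ohms at 50 mA")  # 118 ohms at 50 mA
print(f"{led_series_resistor(9.0, 3.1, 0.020):.0f} ohms at 20 mA")  # 295 ohms at 20 mA
```

Then round up to the nearest standard value you have on hand, e.g. 300 Ω for the 20 mA case.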
 