Voltage drop over resistor with changing source voltage

Thread Starter

lordeos

Joined Jun 23, 2015
33
Hi all,

I'm experimenting with a light/dark circuit with an LDR.

That brought me to something else that I can't seem to figure out.

- How can you calculate the change in voltage drop and current through a resistor when the source voltage changes?

Example :

LED = 2 V, 20 mA
Source voltage = 6 V

So I need to drop 4 V across my resistor and limit the current to 20 mA for the LED.

4 V / 0.02 A = 200 Ω ... so resistor = 200 Ω

How can I calculate the change in voltage drop and current through my resistor (which stays 200 Ω) if my source voltage is suddenly 5 V?


Thanks for the help
 

crutschow

Joined Mar 14, 2008
34,280
............
How can I calculate the change in voltage drop and current through my resistor (which stays 200 Ω) if my source voltage is suddenly 5 V?
If the LED voltage is assumed to stay constant at 2 V (a good enough assumption for these calculations), then the resistor voltage equals the supply voltage minus the LED voltage: 5 V - 2 V = 3 V.
The new current is then 3V / 200Ω = 15mA.
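This Ohm's-law step is easy to sketch in a few lines of Python (assuming, as above, a constant 2 V LED drop):

```python
def led_current(v_supply, v_led=2.0, r=200.0):
    """Current through the series resistor, assuming a constant LED drop."""
    return (v_supply - v_led) / r

print(led_current(6.0))  # 0.02 A = 20 mA
print(led_current(5.0))  # 0.015 A = 15 mA
```

The resistor value stays fixed; only the voltage across it (supply minus LED drop) changes, so the current scales with it.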
 

MikeML

Joined Oct 2, 2009
5,444
If the LED is just being used as a visual indicator, then it will not matter much if the current changes. Pick a resistor such that at the lowest expected input voltage, the LED is bright enough to see, and at the highest expected input voltage, the LED max. allowed current is not exceeded.
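That design rule can be sketched numerically. The 5 mA "bright enough to see" figure below is an assumed example value, not from the thread; check your own LED:

```python
def resistor_bounds(v_min, v_max, v_led, i_min_visible, i_max):
    """Series-resistor range: bright enough at v_min, safe at v_max.

    i_min_visible and i_max are example figures; see the LED datasheet.
    """
    r_max = (v_min - v_led) / i_min_visible  # larger and the LED is too dim at v_min
    r_min = (v_max - v_led) / i_max          # smaller and current exceeds max at v_max
    return r_min, r_max

# Supply 5-6 V, 2 V LED, visible from ~5 mA, 20 mA absolute max:
r_min, r_max = resistor_bounds(5.0, 6.0, 2.0, 0.005, 0.020)
print(r_min, r_max)  # 200.0 600.0 (ohms) -- pick anything in between
```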
 

Lestraveled

Joined May 19, 2014
1,946
LEDs operate over a wide range of current. The series resistor is often called a current-limiting resistor: the input voltage may change, but the LED still gets a usable current. The higher the input voltage, the less the LED current is affected by a given voltage change.
In your example, the supply fell from 6 V to 5 V, i.e. to 83.3 % of its original value (5/6), while the current fell from 20 mA to 15 mA, i.e. to 75 % (15/20). With a higher supply, the fixed 2 V LED drop is a smaller fraction of the total, so the same 1 V change would cause a smaller relative change in current.
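The sensitivity argument can be checked with a quick sketch; note that the series resistor cancels out of the relative change, since current is proportional to (supply voltage - LED drop):

```python
def rel_current_change(v_hi, v_lo, v_led=2.0):
    """Relative LED-current change for a supply step, fixed LED drop.

    I is proportional to (V_supply - V_led), so R cancels out.
    """
    return ((v_hi - v_led) - (v_lo - v_led)) / (v_hi - v_led)

print(rel_current_change(6.0, 5.0))    # 0.25 -> 25 % current change
print(rel_current_change(12.0, 11.0))  # 0.1  -> only 10 % for the same 1 V step
```

So the same 1 V supply step hurts less, proportionally, as the supply voltage goes up.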
 

Thread Starter

lordeos

Joined Jun 23, 2015
33
Hi,

For starters, I just want to create a simple LDR circuit:

1) Fixed resistor on top
2) LDR bottom

Basically I want to create a voltage divider, and between the fixed resistor and the LDR I attach a resistor in series with an LED.

My input voltage will be 6 volts

How can I calculate the fixed resistor so that my output voltage doesn't fluctuate, because of the LDR, between roughly the full source voltage and the voltage I need?
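For a rough feel, here is the unloaded divider equation in Python. The 10 kΩ top resistor is an assumed example value, the LDR range is borrowed from later in the thread, and loading by the LED branch will pull the real numbers down:

```python
def divider_out(v_in, r_top, r_ldr):
    """Unloaded voltage divider: fixed resistor on top, LDR on the bottom."""
    return v_in * r_ldr / (r_top + r_ldr)

# 6 V supply, assumed 10 kohm on top; LDR 50 ohm (bright) to 200 kohm (dark):
print(divider_out(6.0, 10_000, 50))       # ~0.03 V in bright light
print(divider_out(6.0, 10_000, 200_000))  # ~5.71 V in the dark
```

The swing you get depends entirely on how the top resistor compares to the LDR's light and dark resistances.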

thx for the help
 

peter taylor

Joined Apr 1, 2013
106
You need to limit the current through the LED to 20 mA or poof, smoke.
If your LDR's lowest resistance in the light is L ohms, then that is when the LED current will be at its maximum.
You want (source voltage - LED voltage) / (series resistor + minimum LDR resistance) < 20 mA:
(6 V - 2 V) / (200 Ω + L) < 20 mA
Even in full sunlight L will be > 0 Ω, so the current will always be less than 4 V / 200 Ω = 20 mA.
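That inequality is easy to evaluate; a minimal sketch using the numbers from this thread:

```python
def worst_case_led_current(v_supply, v_led, r_series, r_ldr_min):
    """Upper bound on LED current, with the LDR at its lowest resistance."""
    return (v_supply - v_led) / (r_series + r_ldr_min)

# 6 V supply, 2 V LED, 200 ohm series resistor:
print(worst_case_led_current(6.0, 2.0, 200.0, 0.0))   # 0.02 A: the absolute ceiling
print(worst_case_led_current(6.0, 2.0, 200.0, 50.0))  # 0.016 A with a 50 ohm LDR
```

Any real LDR resistance above zero only pushes the current further below the 20 mA limit.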
 

Thread Starter

lordeos

Joined Jun 23, 2015
33
You need to limit the current through the LED to 20 mA or poof, smoke.
If your LDR's lowest resistance in the light is L ohms, then that is when the LED current will be at its maximum.
Thanks for the help, Peter.

- I thought that when my LDR has a low resistance, current would flow through the LDR and not through the LED --> current follows the path of least resistance?

- In your explanation you talk about two resistors: the LDR and the series resistor. But what about the first resistor on top --> do I need to limit the current there, or do I need to remove the top resistor, use only the LDR and the series resistor, and limit the current with the series resistor?
 

Thread Starter

lordeos

Joined Jun 23, 2015
33
Maybe with a picture it's clearer what I want to do (included in this reply).

The output voltage part will be the series resistor and the LED.

thx Mike
 

Attachments

Thread Starter

lordeos

Joined Jun 23, 2015
33
Hello Peter,

Well, I made an LDR circuit and it works as expected ... so many thanks for that :):)

When I was testing the circuit I made some observations and wanted to check with you whether they're correct:

1) My first resistor (R1 in your picture above) is a ~266 Ω resistor dropping 4 V and allowing 20 mA. When checking the voltage in light and dark, the voltage dropped over that resistor remains ~4 V, and the reading on my power supply stays at 20 mA (without a series resistor). Is it then correct to assume I don't need a series resistor with my LED to control the current and voltage, since that job is already done by the first resistor? I do see some change in voltage when switching between light and dark, but the difference is at most 0.7 V, and the change in voltage across my LED is only ~0.2 V.

2) The voltage drop on my LDR is 1.5 V in light and 2.0 V in dark (lowest = 50 Ω, highest = 200,000 Ω) ... I did check the LDR in full dark and it goes up to several megohms. Is the 0.5 V change on the LDR caused by the fact that normal conditions (daylight and ordinary darkness) are not enough to push the resistance of the LDR up into the megohms? If that were the case, I would get almost the full supply voltage across the LDR, correct?

3) When I do put a series resistor before my LED (for example 1 MΩ), I see that in light conditions the circuit draws more milliamps than in dark conditions (in the dark the circuit draws ~3 mA, in the light ~20 mA) ... Does this mean that in light conditions (LDR at its lowest) the 20 mA goes directly through the LDR, and that in the dark the 1 MΩ resistor limits the total circuit current, and the current through the LED, to 3 mA?

4) The equivalent resistance of two resistors in parallel is lower than the smaller of the two. So by adding the 1 MΩ resistor in parallel with the LDR, is it correct that the total resistance of the whole circuit does not change much (because the lowest resistance is the 50 Ω of the LDR)? It only influences the voltage and current in the LED branch. However, I do see that in that case, in light or dark, almost the full supply voltage is dropped over the LDR, probably because the resistance in the LED branch is so high that the current takes the LDR path, correct?
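A quick numeric check of the parallel-resistance reasoning in point 4, using the values from this post:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

# 1 Mohm LED branch in parallel with the LDR:
print(parallel(1_000_000, 50))       # ~50 ohm: the bright LDR dominates
print(parallel(1_000_000, 200_000))  # ~167 kohm in the dark
```

With a 1 MΩ branch alongside a 50 Ω LDR, the combined resistance barely moves, which matches the observation that the total circuit behaviour is set almost entirely by the LDR.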

Thx for the help
Mike

 