# Is Voltage divider enough?

Discussion in 'Digital Circuit Design' started by pjreijiri, Apr 19, 2017.

1. ### pjreijiri Thread Starter Member

Aug 19, 2015
80
2
Hello Everyone,
I want to use this sensor http://www.mouser.com/ds/2/678/V02-2259EN0-1100110.pdf
In the datasheet it says that the supply voltage should be 1.5 V.
In my circuit I have 5 V. Can I use a voltage divider of 3.5 kΩ and 1.5 kΩ to bring it down to 1.5 V, or do I need some sort of voltage regulator?

Thank you

2. ### AnalogKid AAC Fanatic!

Aug 1, 2013
5,391
1,541
1.5 V is the typical forward voltage of the LED, not a power supply requirement. The datasheet says the max current is 100 mA. First, determine what current you want to push through the LED. Second, use Ohm's Law to calculate the series resistor for the LED.
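AnalogKid's two steps can be sketched in a few lines of Python (the 5 V supply and 1.5 V forward drop are from the thread; the 20 mA target current is just an example):

```python
# Series resistor sizing for an LED via Ohm's Law:
# R = (V_supply - V_forward) / I_led
def led_series_resistor(v_supply, v_forward, i_led):
    """Return the series resistance in ohms for a target LED current."""
    return (v_supply - v_forward) / i_led

# Example: 5 V supply, 1.5 V typical forward drop, 20 mA target current
r = led_series_resistor(5.0, 1.5, 0.020)   # (5 - 1.5) / 0.020 = 175 ohms
print(r)
```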

That was all guesswork. Until you post your schematic there can be no real answers.

ak

3. ### pjreijiri Thread Starter Member

Aug 19, 2015
80
2
This is my schematic.
Would the voltage divider work in this case?
The 15 Ω resistor with the 1.5 V gives me 100 mA.

Last edited by a moderator: Apr 19, 2017
4. ### WBahn Moderator

Mar 31, 2012
19,523
5,399
You would need to take into account the equivalent resistance of the voltage divider itself. Let's say that this somehow worked and you got 100 mA flowing through the 15 Ω resistor. Where is that current coming from? Since it has to be coming from the 5 V supply, it has to be flowing through the 3.5 kΩ resistor. But that means the voltage drop across that resistor would be 350 V. Seems a bit unreasonable, no?

You are abusing Ohm's Law by picking some voltage and dividing it by some resistance to get some current. But Ohm's Law requires that the voltage you use be the voltage across THAT resistor, not just the voltage on one side of it.

But you don't need the voltage divider. If you want 100 mA then you need to put a series resistor between the LED and the 5 V supply that drops 3.5 V across it when it has 100 mA flowing through it. The other 1.5 V will be dropped across the LED.
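WBahn's loading argument can be checked numerically with the Thevenin equivalent of the divider (resistor values from the thread; treating the LED as a fixed 1.5 V drop is a simplification):

```python
# Proposed divider: 5 V through 3.5 kOhm on top, 1.5 kOhm on the bottom.
v_supply = 5.0
r_top, r_bottom = 3500.0, 1500.0

# Unloaded, the divider does put out 1.5 V...
v_open = v_supply * r_bottom / (r_top + r_bottom)    # 1.5 V open-circuit

# ...but it looks like a 1.05 kOhm source resistance to any load.
r_thevenin = r_top * r_bottom / (r_top + r_bottom)   # 1050 Ohms

# With the LED (~1.5 V drop) plus the 15 Ohm resistor as the load, there is
# no headroom left to drive any current, let alone 100 mA:
i_load = (v_open - 1.5) / (r_thevenin + 15.0)        # 0 A
print(v_open, r_thevenin, i_load)
```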

5. ### WBahn Moderator

Mar 31, 2012
19,523
5,399
I would not recommend running it at 100 mA. That is the absolute max continuous current that it is spec'ed to tolerate. If you are only pulsing it for short durations with plenty of time between pulses, you would probably be okay. But if you are going to run current through it for long periods it looks like the optical specs were tested at 20 mA, so that is probably a more reasonable operating point.

6. ### dl324 Distinguished Member

Mar 30, 2015
4,593
972
Why would you want to operate the LED at 100mA? That's the absolute max parameter. Reliable designs don't operate components at their absolute max parameters.

If you want to use a resistor divider, the current in the divider needs to be at least 10X the load current. If you do the math, you'll see that for a 100mA load, a divider isn't a good solution.
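dl324's rule of thumb can be run through quickly (a sketch of the arithmetic, not a design recommendation):

```python
# Rule of thumb: the divider's own bleed current should be >= 10x the load current.
i_load = 0.100                        # 100 mA LED current (the absolute max)
i_divider = 10 * i_load               # 1 A through the divider itself

# Holding 1.5 V across the bottom leg at that current forces tiny resistors,
# and the divider alone dissipates watts before the LED sees anything:
r_bottom = 1.5 / i_divider            # 1.5 Ohms
r_top = (5.0 - 1.5) / i_divider       # 3.5 Ohms
p_divider = 5.0 * i_divider           # 5 W burned just in the divider
print(r_bottom, r_top, p_divider)
```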

7. ### pjreijiri Thread Starter Member

Aug 19, 2015
80
2
What do you suggest then?

8. ### WBahn Moderator

Mar 31, 2012
19,523
5,399
It has been suggested twice to just use a series resistor to limit the current in the LED to the desired level.

I specifically told you to size the resistor so that it drops 3.5 V when the desired current is flowing in it.

I specifically suggested that you use 20 mA and not 100 mA.

Are we just wasting our time?

9. ### panic mode Senior Member

Oct 10, 2011
1,383
325
Read the datasheet... know the limits and stay within them. Right after the absolute maximum ratings is the info on typical parameters, and it says Vf = 1.5 V, If = 20 mA.

10. ### dl324 Distinguished Member

Mar 30, 2015
4,593
972
The datasheet suggests a maximum sensing distance of 6cm. What do you require? That distance and the reverse voltage on the detector will be factors in determining an appropriate current in the LED.

Will you be conditioning the signal for the microcontroller? Is it a digital input?

11. ### AnalogKid AAC Fanatic!

Aug 1, 2013
5,391
1,541
You do not need a voltage divider to drive the LED. With the cathode tied to ground, a single resistor to the +5 V is enough. Look at the datasheet for the *typical* operating current, and calculate the resistor value from that.

ak

12. ### bertus

Apr 5, 2008
16,712
2,833
Hello,

For temperature reasons, keep the current well below 50 mA.
To calculate the resistor for the LED at 5 V and 20 mA, figure on a forward voltage across the LED of about 1 V:

The needed resistor will be (5 V supply voltage - 1 V LED voltage drop) / 20 mA = 200 Ω.
With 180 Ω the current will be a bit higher; with 220 Ω, a little lower.
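Bertus's arithmetic, written out (his roughly 1 V forward-drop estimate at 20 mA is from his post; 180 Ω and 220 Ω are the standard E12 values bracketing the exact answer):

```python
v_supply, v_led, i_target = 5.0, 1.0, 0.020

r_exact = (v_supply - v_led) / i_target   # (5 - 1) / 0.020 = 200 Ohms
i_180 = (v_supply - v_led) / 180.0        # ~22 mA, a bit higher
i_220 = (v_supply - v_led) / 220.0        # ~18 mA, a little lower
print(r_exact, i_180, i_220)
```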

Bertus

13. ### ian field AAC Fanatic!

Oct 27, 2012
5,317
948
Using the divider just makes the maths more complicated - all you need is to calculate a suitable value for a simple series current limiting resistor.

Is there any particular reason the data sheet shows PWM drive?

The PWM circuit shown includes a current limiting resistor - that will need higher resistance if you feed the LED steady DC.

14. ### pjreijiri Thread Starter Member

Aug 19, 2015
80
2
Thank you for taking the time to explain it in such an easy manner.
So it should be something like this?


15. ### pjreijiri Thread Starter Member

Aug 19, 2015
80
2
16. ### AnalogKid AAC Fanatic!

Aug 1, 2013
5,391
1,541
Two things. First, that circuit takes all the fun out of building the functions up from scratch. Second, that package is more difficult than most surface-mount packages to solder by hand. Other than that, it obviously is the right part for the job if your application can stand the time delay caused by an integrating detector.

ak

17. ### pjreijiri Thread Starter Member

Aug 19, 2015
80
2
Hahaha, yes, I know, but sometimes you have to keep it simple and make sure you really have something working.
As for the soldering, it won't be done by hand.