40vdc to 5vdc

OBW0549

Joined Mar 2, 2015
3,566
Waitaminnit... you mean all you're aiming to do is drive an optoisolator's IR emitter from a 40V DC source? Why not just use a simple series resistor for current limiting?

40V / 0.02A = 2000 ohms. Power dissipated in the resistor will be 0.8W, so use a 1W resistor.
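A quick sanity check of that arithmetic in Python, in case anyone wants to plug in their own numbers (this is just the calculation above, nothing more):

```python
# Series resistor for driving an LED from a 40 V DC source.
V_SUPPLY = 40.0   # volts
I_LED = 0.020     # amps; a typical LED drive current

r = V_SUPPLY / I_LED      # ignoring the LED's small forward drop
p = I_LED ** 2 * r        # power dissipated in the resistor

print(f"R = {r:.0f} ohms, P = {p:.2f} W")   # R = 2000 ohms, P = 0.80 W
```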

Am I missing something here?
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,257
Waitaminnit... you mean all you're aiming to do is drive an optoisolator's IR emitter from a 40V DC source? Why not just use a simple series resistor for current limiting?

40V / 0.02A = 2000 ohms. Power dissipated in the resistor will be 0.8W, so use a 1W resistor.

Am I missing something here?
Actually.... yes... but it's my fault for not explaining. I also need to drive a controller circuit running at 5V that consumes the 20 to 30mA I mentioned before... But I just found a very small power supply in my stock that converts 115VAC to 5V and will do the job just nicely, since I also have 115VAC available. But the control signal is still delivered at 40VDC (this is an old machine I'm trying to fix), and I have to deal with that.
I pictured I could use the control signal to also feed the controller circuit, since the signal can provide 100mA. But after some consideration I decided to complicate things a bit more to make them simpler... (did I make sense?)
Now, the 40V comes from a power supply, and what I want to do is shut down everything else when this power supply is turned off... but the thing takes about 10 seconds to reach zero volts if it's unloaded... I figured that if I used the voltage divider, it would light up the LED AND help the supply reach zero a bit faster after shutdown.
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,257
... Am I missing something here?
What you're missing is that it has become clear that I don't fully understand the behavior of an LED... shame on me... but then again I trust that no one here's gonna make me take the walk of shame for admitting my ignorance.. :oops: well... maybe @Hypatia's Protege will take this chance... I still owe him one... :D

Anyway, it is clear to me that an LED is a current driven device. So your calculations are easy for me to understand, although you didn't take into account the LED's voltage drop, which when subtracted from a 40V source would be negligible, I guess.
My question is... is there a limit to the voltage source that one could use to drive an LED? Say, is it a valid assumption that I could drive an LED from a 1,000V source, so that all I'd need to use is a 1,000V/0.02A = 50KΩ resistor with a power dissipation capability of 20W ?
What if I were to use a 50,000V source? wouldn't it be too much for the LED, even with the proper resistor installed?
 

ScottWang

Joined Aug 23, 2012
7,409
Don't forget to derate the current to about 80% of its rating: 20mA * 80% = 16mA. That's for the sake of operating life and light fading. It's just like earning US$2000/month: do you want to spend the full US$2000/month, or just US$1600/month?

If you're flashing the LED, that's another issue.
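A minimal sketch of that derating rule of thumb, assuming the same 40 V source discussed above:

```python
# Run the LED at ~80% of its rated current for longer life (rule of thumb).
V_SUPPLY = 40.0       # volts
I_RATED = 0.020       # amps, rated LED current
DERATING = 0.80

i_run = I_RATED * DERATING          # 16 mA
r = V_SUPPLY / i_run                # ~2500 ohms instead of 2000

print(f"I = {i_run*1000:.0f} mA, R = {r:.0f} ohms")
```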
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,257
Don't forget to derate the current to about 80% of its rating: 20mA * 80% = 16mA. That's for the sake of operating life and light fading. It's just like earning US$2000/month: do you want to spend the full US$2000/month, or just US$1600/month?

If you're flashing the LED, that's another issue.
Thanks for the reply... but my question was based on hypothetical numbers... I normally run my LEDs at only 6mA, since I use them as indicators inside control panels and they don't have to burn too brightly.
 
Last edited:

OBW0549

Joined Mar 2, 2015
3,566
What you're missing is that it has become clear that I don't fully understand the behavior of an LED... shame on me... but then again I trust that no one here's gonna make me take the walk of shame for admitting my ignorance.. :oops:
Of course not; all hoots, jeers and catcalls will be administered most discreetly, out of the public eye...

Anyway, it is clear to me that an LED is a current driven device. So your calculations are easy for me to understand, although you didn't take into account the LED's voltage drop, which when subtracted from a 40V source would be negligible, I guess.
Correct. The IR LED in an optoisolator will have about 1.2 volts across it at typical current levels. 40 volts, 38.8 volts, whatsa difference?

My question is... is there a limit to the voltage source that one could use to drive an LED? Say, is it a valid assumption that I could drive an LED from a 1,000V source, so that all I'd need to use is a 1,000V/0.02A = 50KΩ resistor with a power dissipation capability of 20W ?
All that matters is the current: so long as it's 20 mA or so, the LED will be perfectly happy. You can drive it from 10V with a 440 ohm series resistor, from 100V with a 4.94K ohm series resistor, a 1000V source with a 49.94K ohm resistor... take your pick, it's all the same to the LED because each combination results in 20 milliamps.
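If it helps, here's that rule as a tiny Python sketch, using the ~1.2 V forward drop mentioned above (the values reproduce the ones in this post):

```python
# Series resistor that sets ~20 mA regardless of the source voltage.
V_FORWARD = 1.2    # volts, typical IR LED forward drop
I_TARGET = 0.020   # amps

def series_resistor(v_source: float) -> float:
    """Resistor needed so the LED sees I_TARGET from v_source."""
    return (v_source - V_FORWARD) / I_TARGET

for v in (10, 100, 1000):
    print(f"{v:>5} V -> {series_resistor(v):,.0f} ohms")
# 10 V -> 440 ohms; 100 V -> 4,940 ohms; 1000 V -> 49,940 ohms
```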

What if I were to use a 50,000V source? wouldn't it be too much for the LED, even with the proper resistor installed?
Nope. 20 mA is still 20 mA.
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,257
Of course not; all hoots, jeers and catcalls will be administered most discreetly, out of the public eye...


Correct. The IR LED in an optoisolator will have about 1.2 volts across it at typical current levels. 40 volts, 38.8 volts, whatsa difference?


All that matters is the current: so long as it's 20 mA or so, the LED will be perfectly happy. You can drive it from 10V with a 440 ohm series resistor, from 100V with a 4.94K ohm series resistor, a 1000V source with a 49.94K ohm resistor... take your pick, it's all the same to the LED because each combination results in 20 milliamps.


Nope. 20 mA is still 20 mA.
Clear and simple... just the way I like it... thank you, my friend.
 

OBW0549

Joined Mar 2, 2015
3,566
Yer welcome.

Here's another way of looking at this whole thing: to any component in a circuit, such as our LED, all that matters are the voltages at its own terminals, and the currents flowing into or out of them. Everything else going on in the circuit, such as the unspeakable horror being inflicted upon that poor resistor in your last example with 49,998.8 volts across it, is a complete don't-care as far as the LED is concerned. 1.2 volts. 20 milliamps. That's the whole bit.
 
is there a limit to the voltage source that one could use to drive an LED? Say, is it a valid assumption that I could drive an LED from a 1,000V source, so that all I'd need to use is a 1,000V/0.02A = 50KΩ resistor with a power dissipation capability of 20W ?
What if I were to use a 50,000V source? wouldn't it be too much for the LED, even with the proper resistor installed
Inasmuch as electrical 'conduction' across a semiconductor junction is a discontinuous phenomenon (at the quantum level), your question may be more profound than it appears at 'first blush'...:)

Then too, from a purely practical standpoint, large potential differences with respect to (environmental) ground will produce corona and/or arcing with attendant undesirable effects...

well... maybe @Hypatia's Protege will take this chance... I still owe him one...
Um... No -- I'll reserve that option for a special occasion...;););):D

OBTW: Style points please for my eschewal of 'ambiguous pronoun' insistence/enforcement:D
Best regards
HP
 
Last edited:

jpanhalt

Joined Jan 18, 2008
11,087
What if I were to use a 50,000V source? wouldn't it be too much for the LED, even with the proper resistor installed?
I agree with the responses that current is current, and a resistor will work for 50 KV too. Be aware, though, that resistors have voltage ratings too. If my calculations are accurate, you need 2.5 MΩ at 1000W for 20 mA. A 100W resistor is about 10" long (http://www.mouser.com/ds/2/303/res_powermox-180789.pdf ).
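Here's the arithmetic, plus a rough count of how many series resistors the voltage rating alone would demand (the 500V per-resistor rating is just an assumed typical value):

```python
import math

# The hypothetical 50 kV example: total resistance, power, and resistor count.
V_SOURCE = 50_000.0   # volts
I_LED = 0.020         # amps
V_RATING = 500.0      # volts per resistor, an assumed "typical" rating

r_total = V_SOURCE / I_LED                  # 2.5 MOhm
p_total = V_SOURCE * I_LED                  # 1000 W in the string
n_min = math.ceil(V_SOURCE / V_RATING)      # 100 resistors just for voltage rating

print(f"R = {r_total/1e6:.1f} MOhm, P = {p_total:.0f} W, >= {n_min} resistors")
```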

For 1/4W to 1/2W resistors, voltage ratings of 200V to 500V stick in my mind as "typical."

John
 

Roderick Young

Joined Feb 22, 2015
408
...
I want to power up a LED in an optoisolator (H11L2M) that works at a maximum of 30mA. ...

Any thoughts anyone might want to share?
Why not just use a high-value dropping resistor to power the optoisolator? Being generous and assuming the forward voltage of the LED is zero, that would be a resistor of about 1500 ohms to supply 30 mA. You could then put an NPN transistor (or an open-collector output, if you have one that will tolerate 40 volts) between the cathode of the optoisolator LED and ground to control switching it on. A MOSFET would work, too.
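The numbers behind that, as a quick sketch (the zero-forward-drop assumption is the "generous" part):

```python
# Dropping resistor for the optoisolator LED: 40 V in, 30 mA maximum.
V_SUPPLY = 40.0
I_MAX = 0.030         # amps, the H11L2M's stated maximum

r_min = V_SUPPLY / I_MAX        # ~1333 ohms, assuming Vf = 0
r_std = 1500.0                  # next common value up
i_actual = V_SUPPLY / r_std     # ~26.7 mA, safely under the maximum

print(f"R >= {r_min:.0f} ohms; with {r_std:.0f} ohms, I = {i_actual*1000:.1f} mA")
```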
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,257
Why not just use a high-value dropping resistor to power the optoisolator? Being generous and assuming the forward voltage of the LED is zero, that would be a resistor of about 1500 ohms to supply 30 mA. You could then put an NPN transistor (or an open-collector output, if you have one that will tolerate 40 volts) between the cathode of the optoisolator LED and ground to control switching it on. A MOSFET would work, too.
Thank you, Roderick.
In the end, what I did was power the optoisolator's LED from the 40VDC supply using five 470Ω, 1/4W resistors connected in series. With a total of 2.35K, that allows about 17mA through the LED (a minimum of 10mA is required, according to the datasheet). That pretty much solved everything.
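For anyone checking those numbers, a quick sketch using the ~1.2 V LED drop mentioned earlier in the thread:

```python
# Five 470-ohm, 1/4 W resistors in series from the 40 V DC control signal.
V_SUPPLY = 40.0
V_FORWARD = 1.2               # volts, typical IR LED forward drop
R_EACH, N = 470.0, 5

r_total = R_EACH * N                       # 2350 ohms
i = (V_SUPPLY - V_FORWARD) / r_total       # ~16.5 mA, above the 10 mA minimum
p_each = i ** 2 * R_EACH                   # ~128 mW, fine for a 1/4 W part

print(f"I = {i*1000:.1f} mA, {p_each*1000:.0f} mW per resistor")
```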
 

ian field

Joined Oct 27, 2012
6,536
Is there a simple way to go from 40vdc to 5vdc without the need for transformers or inductors?
I was going to use an LM317, but then found out that its datasheet specifies a maximum of 30vdc at its input.
Maximum current drawn will be about 20mA @ 5vdc....
Or maybe I should just use a zener for this?
Have a look at the higher output voltage regulators - you may just find one that can handle the Vin you have.

A pre-regulator may be the answer - as long as you don't exceed the dissipation limit, one of the high-voltage regulators linked below may do the job all by itself:

http://ww1.microchip.com/downloads/en/DeviceDoc/LR12 C080113.pdf

http://www.ti.com.cn/cn/lit/ds/symlink/tl783.pdf
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,257
Have a look at the higher output voltage regulators - you may just find one that can handle the Vin you have.

A pre-regulator may be the answer - as long as you don't exceed the dissipation limit, one of the high-voltage regulators linked below may do the job all by itself:

http://ww1.microchip.com/downloads/en/DeviceDoc/LR12 C080113.pdf

http://www.ti.com.cn/cn/lit/ds/symlink/tl783.pdf
Wow! this is extremely valuable information. Thank you!

I'm impressed with the performance of those two devices. That's exactly what I was initially looking for. I've already solved my original problem, but this info will come in handy in the near future.
In particular, I found figure 19 (on page 12) of the TI datasheet very interesting: a 48-V, 200-mA float charger. Does this mean the circuit can be used to safely charge a battery with no additional components?

Thanks again!
 

ian field

Joined Oct 27, 2012
6,536
Wow! this is extremely valuable information. Thank you!

I'm impressed with the performance of those two devices. That's exactly what I was initially looking for. I've already solved my original problem, but this info will come in handy in the near future.
In particular, I found figure 19 (on page 12) of the TI datasheet very interesting: a 48-V, 200-mA float charger. Does this mean the circuit can be used to safely charge a battery with no additional components?

Thanks again!
I think they may be referring to the regulator "floating" between a high input voltage and low output voltage - particularly in respect of arrangements for the reference pin.

Float charging batteries can be complex, and is different for each of the many chemistries.

For example; lead acid should be bulk charged at no more than 14.4V (12V nominal battery) but float charging should be 13.6V to avoid excessive electrolyte loss due to gassing. Nickel chemistry can be a right PITA, and float charging lithium can be bloody dangerous!
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,257
Assuming I use a 1:1 isolation transformer connected to 120VAC, can I connect its output to a bridge rectifier, add a couple of capacitors of sufficient value, and then use the TL783, properly configured, to get a regulated 5V output?

Also, from what I understand in the datasheet, the circuit must consume a minimum of 15mA for the device to be stable.
But would that be all I needed? Would that mean I don't have to use a step-down transformer?
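As a side note on that 15mA figure, a sketch of the bleeder resistor that would guarantee the minimum load (assuming I'm reading the datasheet right):

```python
# Bleeder resistor to guarantee the regulator's minimum load at 5 V out.
V_OUT = 5.0
I_MIN = 0.015          # amps, minimum load current as read from the datasheet

r_max = V_OUT / I_MIN             # ~333 ohms maximum
r_std = 330.0                     # nearest standard value below
p = V_OUT ** 2 / r_std            # ~76 mW, a 1/4 W part is fine

print(f"R <= {r_max:.0f} ohms; 330 ohms dissipates {p*1000:.0f} mW")
```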
 

Alec_t

Joined Sep 17, 2013
14,335
Would that mean I don't have to use a step-down transformer?
120V AC rectified and smoothed would give >160V DC, but the TL783 is rated for 125V. Methinks a step-down tranny is called for.
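The RMS-to-peak arithmetic, for reference:

```python
import math

# Why rectified-and-smoothed 120 V AC exceeds the TL783's 125 V rating.
v_rms = 120.0
v_peak = v_rms * math.sqrt(2)    # ~169.7 V at the caps, minus diode drops

print(f"Peak = {v_peak:.1f} V")  # well above 125 V
```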
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,257
120V AC rectified and smoothed would give >160V DC, but the TL783 is rated for 125V. Methinks a step-down tranny is called for.
You're right... I hadn't considered that 120VAC is actually an RMS value... with peaks at what you've just mentioned...
Thanks for sharing
 

ian field

Joined Oct 27, 2012
6,536
120V AC rectified and smoothed would give >160V DC, but the TL783 is rated for 125V. Methinks a step-down tranny is called for.
I've seen appnotes where this type of regulator is fed with a Vin greater than its maximum input/output differential - it stands to reason that a short on the output would kill the regulator instantly.

Cascading two regulators might get away with it.

It'll probably get the thread closed - but I have seen an off-the-shelf regulator that can take rectified and smoothed mains and give a low output voltage. There is a version for 230V mains.

Unfortunately I can't remember offhand what the type number is.

Try the Supertex website - not certain, but the name rings a bell.
 