# Voltage drops to zero after I use a resistor

Discussion in 'General Electronics Chat' started by emraan, Nov 20, 2004.

1. ### emraan Thread Starter New Member

Nov 20, 2004
hi,

I'm using a resistor in a circuit to reduce 50 VDC down to 12 VDC. The resistor does reduce the voltage, but when I connect a 12 V bulb, the voltage drops to zero volts...

Please help me with this problem...

2. ### Perion Active Member

Oct 12, 2004
I don't know any of the details (what or where you're measuring, what type of bulb, or how you chose the resistor value), but I'll assume you're using an incandescent (filament) lamp and that all you have is a 50 VDC source in a series circuit with the resistor and the lamp. Measure the bulb's resistance; let's say it measures some value R. Your resistor should then be as close as possible to (38*R)/12. The resistor will drop 38 volts and the lamp 12 (i.e., 12 VDC appears across the bulb). This also assumes your power supply can maintain 50 VDC under load.

Perion
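Perion's series-divider arithmetic can be sketched in a few lines of Python (the 6-ohm lamp resistance below is a made-up example value; the 38/12 ratio comes from wanting 38 V across the resistor and 12 V across the lamp):

```python
# In a series circuit the resistor and lamp carry the same current, so the
# voltages split in proportion to the resistances: V_R / V_lamp = R / R_lamp.
def series_resistor(r_lamp, v_supply=50.0, v_lamp=12.0):
    """Resistor value that leaves v_lamp across a lamp of (hot) resistance r_lamp."""
    v_drop = v_supply - v_lamp          # 38 V must appear across the resistor
    return v_drop * r_lamp / v_lamp     # Perion's (38 * R) / 12

# Example: a lamp whose hot resistance measures 6 ohms
print(series_resistor(6.0))   # -> 19.0 (ohms)
```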

3. ### Xray Well-Known Member

Nov 21, 2004
Perion is correct, for the most part. The difficulty with his method is measuring the resistance of an incandescent lamp. Low-voltage lamps typically have cold resistances much less than one ohm, and the average Radio Shack type of meter cannot measure such low values. The other problem is that even if you could measure the lamp's resistance with a milliohmmeter, you would be measuring its "cold" resistance, and lamp filaments increase dramatically in resistance when they are hot! The best thing to do is to find the manufacturer's specs for your lamp. They should give its normal operating voltage and current rating. From those values, using Ohm's law, you can determine your resistor value and the current requirement of your power supply.

Regards

4. ### Martin P Dembrowski Member

Dec 3, 2004
HI guys...

You can always measure the "hot" resistance of a bulb with an inline ammeter and applying Ohm's Law, right?

If the bulb is meant to be lit for its resistive value, then all you have to do is build a simple series circuit with some sort of limiting resistance, possibly a high-current potentiometer or rheostat. Be careful not to let the current pin the meter, which means using a true ampere meter. If you have some idea of what the maximum current will be, you could also put a sufficiently low-value bypass resistor in parallel with the meter to carry most of it, then use the parallel-resistance law to calculate the total current.

Sometimes a bulb is used as a very low "trimming" resistance in some circuits. These weren't meant to light; from what I remember, they were usually a cheap way to get a good low-value resistor in HF and other radio-frequency circuits.

Martin
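Martin's shunt-around-the-meter trick can be sketched numerically (all the values here are hypothetical: a meter branch of 0.9 ohm reading 0.1 A, with a 0.1-ohm bypass resistor carrying the rest):

```python
# Both branches see the same voltage, so the shunt current follows from
# Ohm's law, and the total is the sum of the two branch currents.
def total_current(i_meter, r_meter, r_shunt):
    """Total circuit current given the meter reading and both branch resistances."""
    v = i_meter * r_meter        # voltage across the meter/shunt pair
    return i_meter + v / r_shunt # meter current plus shunt current

def hot_resistance(v_bulb, i_total):
    """Ohm's law on the lit bulb: its 'hot' resistance."""
    return v_bulb / i_total

i = total_current(0.1, 0.9, 0.1)   # -> about 1.0 A total
print(i)
print(hot_resistance(12.0, i))     # 12 V bulb at ~1 A -> about 12 ohms hot
```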

5. ### Brandon Senior Member

Dec 14, 2004
Don't use a voltage divider. When you add a voltage divider to a DC source and then connect a load to it, you load the divider down. It comes down to the concept of ideal voltage sources and ideal current sources.

You want an ideal voltage source to have zero output resistance, and a voltage divider doesn't give you that: its output resistance is the parallel combination of the divider resistors. If your load has a lower resistance than that (light bulbs do), you will load the circuit down. The same idea applies to current sources, except there you want infinite output resistance, which is where current mirrors and the like come in.

I would forgo the divider and get yourself a 12-volt regulator to take the voltage down. Make sure the regulator can handle the power dissipation, since you're dropping a lot of voltage. That should fix the problem.
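Brandon's point about loading the divider down can be made concrete with a quick sketch (hypothetical values: a 38-ohm / 12-ohm divider that gives 12 V unloaded, then a 1-ohm lamp hung across the bottom leg):

```python
def loaded_divider(v_in, r1, r2, r_load):
    """Output of a two-resistor divider with a load across the bottom resistor."""
    r_bottom = (r2 * r_load) / (r2 + r_load)  # r2 in parallel with the load
    return v_in * r_bottom / (r1 + r_bottom)

print(loaded_divider(50.0, 38.0, 12.0, 1e9))  # essentially unloaded: ~12 V
print(loaded_divider(50.0, 38.0, 12.0, 1.0))  # 1-ohm lamp: ~1.2 V, collapsed
```

The low-resistance lamp swamps the bottom resistor, and the output falls to a fraction of the design value, which is exactly the "zero volts" symptom in the original post.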

6. ### Battousai Senior Member

Nov 14, 2003
If the resistors you use are too large, the bulb will not illuminate well.

Good idea. Or use a switching regulator.

7. ### mozikluv AAC Fanatic!

Jan 22, 2004
hi

using a voltage divider without due consideration of the current requirement leaves the circuit partly defective. since you are dropping about 75% of the original voltage, you have to use high-value resistors, which also drops the current below what your bulb needs. the process is a waste of energy besides: a lot of heat has to be dissipated just to power the load at 12 V, which translates to inefficiency.

likewise, using a linear regulator like the LM7812 is also inefficient, since it too has to dissipate a lot of heat, and the maximum input voltage of such a device is limited to about 35 V, below your 50 V source.

using a switching regulator would do the job, as battousai suggested.

however, the cost of building a switching regulator versus simply buying a 12 V transformer would also have to be considered.
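The efficiency argument above boils down to one ratio. Any series-drop scheme (resistor or linear regulator) burns the voltage difference as heat, so its best-case efficiency is simply Vout/Vin, a quick sketch:

```python
def linear_efficiency(v_in, v_out):
    """Best-case efficiency of any series-pass drop (resistor or linear
    regulator): the load gets v_out of every v_in, the rest is heat."""
    return v_out / v_in

eff = linear_efficiency(50.0, 12.0)
print(f"{eff:.0%}")   # -> 24%, i.e. 76% of the input power becomes heat
```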

8. ### Xray Well-Known Member

Nov 21, 2004

Hey guys..... All he wants to do is light a low-voltage lamp from a higher-voltage power supply. He doesn't need complex solutions like switch-mode power supplies and DC regulators, etc.! (Ever heard of Occam's razor?) He didn't even mention a voltage divider. He has nothing more than a resistor and a lamp, and is wondering why the voltage drops to zero when he applies power. It's a very simple matter of finding the proper resistor value and wattage for his application. Let's not look for a complex solution to a simple problem!!

Merry Christmas everyone!

9. ### dragan733 Senior Member

Dec 12, 2004
It all depends on how much current the 12 V bulb needs. Calculate the resistor as R = (50 - 12)/Ibulb and the resistor's power rating as P = (50 - 12)*Ibulb. So if the bulb current is Ibulb = 1 A, the resistor's dissipation is very large, P = 38 W, which is not acceptable. In that case one needs to use a PWM (switching) regulator.
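dragan733's two formulas, run with his 1 A example:

```python
def dropper(v_supply, v_lamp, i_lamp):
    """Series dropping resistor and its dissipation: R = dV/I, P = dV*I."""
    dv = v_supply - v_lamp      # voltage the resistor must drop (38 V here)
    return dv / i_lamp, dv * i_lamp

r, p = dropper(50.0, 12.0, 1.0)
print(r, p)   # -> 38.0 ohms, 38.0 W: a lot of heat for one bulb
```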

10. ### Keith New Member

Dec 25, 2004
When the circuit is first powered up, the filament is cold and therefore has a very low resistance. Ohm's law shows that almost all of the voltage is then dropped across the resistor, leaving only a fraction of a volt for the bulb, which is not enough to heat it up.
Try placing a capacitor in parallel with the resistor. It will let a quick spike of energy bypass the resistor when the circuit is first powered. Because you are using a DC source, the capacitor quickly charges and has no further effect on circuit operation.
Start with a small value (1 uF?) and increase if necessary. Too small a cap and it will not work; too large a cap could blow the bulb.
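Keith's cold-start point in numbers. Hypothetical values: a lamp whose hot resistance is 12 ohms but whose cold resistance is a tenth of that, behind the 38-ohm resistor sized for the hot case:

```python
def v_lamp(v_supply, r_series, r_lamp):
    """Voltage across the lamp in a simple series circuit (voltage-divider law)."""
    return v_supply * r_lamp / (r_series + r_lamp)

print(v_lamp(50.0, 38.0, 12.0))  # hot filament:  12.0 V, as designed
print(v_lamp(50.0, 38.0, 1.2))   # cold filament: ~1.53 V, too little to heat it
```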