Originally posted by emraan@Nov 20 2004, 09:41 AM
hi,
I'm using a resistor in a circuit to reduce the voltage from 50 VDC to 12 VDC. The resistor does decrease the voltage, but when I connect a 12 volt bulb the voltage drops to zero...
please help me with this problem...
[post=3679]Quoted post[/post]

Don't know any of the details, like what or where you are measuring, what type of bulb, or how you figured the resistor value, but... I'll assume you're using an incandescent (filament) type lamp and all you have is a 50 VDC source in a series circuit with the resistor and lamp. Measure the bulb's resistance. Let's say it measures some value R. Your resistor should be about (38*R)/12, as near as possible. The resistor will drop 38 volts and the lamp 12 (i.e., 12 VDC appears across the bulb). This also assumes your power supply is capable of maintaining 50 VDC under load.
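To put rough numbers on that rule, here is a minimal Python sketch of the series-dropper arithmetic. The 24 ohm bulb resistance is a made-up example value, not a real measurement; substitute your own.

[code]
# Series dropping resistor per the (38*R)/12 rule above.
# R_BULB = 24 ohms is a hypothetical example, not a measured value.

V_SUPPLY = 50.0   # source voltage (VDC)
V_LAMP = 12.0     # voltage we want across the bulb (VDC)
R_BULB = 24.0     # bulb resistance in ohms (hypothetical)

# In a series circuit the voltages divide in proportion to the resistances,
# so the resistor must be (50 - 12)/12 = 38/12 times the bulb's resistance.
r_series = (V_SUPPLY - V_LAMP) * R_BULB / V_LAMP
print(f"Series resistor: {r_series:.0f} ohms")  # -> 76 ohms
[/code]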
Originally posted by Perion@Nov 20 2004, 09:26 AM
Don't know any of the details, like what or where you are measuring, what type of bulb, or how you figured the resistor value, but... I'll assume you're using an incandescent (filament) type lamp and all you have is a 50 VDC source in a series circuit with the resistor and lamp. Measure the bulb's resistance. Let's say it measures some value R. Your resistor should be about (38*R)/12, as near as possible. The resistor will drop 38 volts and the lamp 12 (i.e., 12 VDC appears across the bulb). This also assumes your power supply is capable of maintaining 50 VDC under load.
Perion
[post=3681]Quoted post[/post]

Perion is correct, for the most part. The difficulty with what he said lies in measuring the resistance of an incandescent lamp. Low voltage lamps typically have cold resistance values of much less than one ohm, and the average Radio Shack type of meter cannot measure such low resistances. The other problem is that even if you could measure the lamp's resistance with a milliohmmeter, you would be measuring its "cold" resistance, and lamp filaments increase dramatically in resistance when they are hot! The best thing to do is to find the manufacturer's specs for your lamp. They should give its normal operating voltage and current rating. From those values, using Ohm's Law, you can determine your resistor value and the current requirement of your power supply.
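As a sketch of that calculation, suppose the datasheet says the lamp is rated 12 V at 250 mA (made-up figures; use your lamp's actual ratings):

[code]
# Series resistor sized from the lamp's rated operating point (Ohm's Law).
# The 12 V / 0.25 A rating is a hypothetical example.

V_SUPPLY = 50.0   # source voltage (VDC)
V_LAMP = 12.0     # lamp's rated voltage (V)
I_LAMP = 0.25     # lamp's rated current (A), hypothetical

r_series = (V_SUPPLY - V_LAMP) / I_LAMP   # resistor that drops the excess 38 V
p_series = (V_SUPPLY - V_LAMP) * I_LAMP   # heat the resistor must dissipate

print(f"Series resistor: {r_series:.0f} ohms")   # -> 152 ohms
print(f"Resistor power:  {p_series:.1f} W")      # -> 9.5 W, use a >= 10 W part
print(f"Supply current:  {I_LAMP * 1000:.0f} mA")
[/code]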
Originally posted by emraan@Nov 20 2004, 09:41 AM
hi,
I'm using a resistor in a circuit to reduce the voltage from 50 VDC to 12 VDC. The resistor does decrease the voltage, but when I connect a 12 volt bulb the voltage drops to zero...
please help me with this problem...
[post=3679]Quoted post[/post]

Don't use a voltage divider. When you add a voltage divider to a DC source and then connect a load to it, the load pulls the divider's output down. It comes back to the concept of ideal voltage sources and ideal current sources: a resistive divider is a weak voltage source, and a load whose resistance is comparable to (or smaller than) the divider's own resistors drags the output far below the unloaded value, which is exactly why your 12 V collapsed to nearly zero.
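Here is a quick sketch of that collapse, with made-up divider values (3.8 kΩ over 1.2 kΩ gives exactly 12 V unloaded from 50 V) and a lamp that looks like roughly 48 ohms when hot:

[code]
# Why a resistive divider collapses under load (all values hypothetical).

def divider_out(v_in, r_top, r_bottom, r_load=None):
    """Output of a two-resistor divider, optionally loaded by r_load."""
    if r_load is not None:
        # The load appears in parallel with the bottom resistor.
        r_bottom = r_bottom * r_load / (r_bottom + r_load)
    return v_in * r_bottom / (r_top + r_bottom)

print(divider_out(50, 3800, 1200))       # unloaded: 12.0 V
print(divider_out(50, 3800, 1200, 48))   # with a ~48 ohm lamp: 0.6 V
[/code]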
I would forgo your divider and go get yourself a 12 volt regulator and just take the voltage down. Make sure the voltage regulator can handle the power dissipation, since you're dropping a lot of voltage. That should fix the problem.

Good idea. Or use a switching regulator.
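For a sense of scale, here is the heat a linear regulator would have to shed with these numbers (the 250 mA load current is the same hypothetical figure as above):

[code]
# Dissipation in a linear regulator dropping 50 V to 12 V.
# I_LOAD = 0.25 A is a hypothetical lamp current.

V_IN, V_OUT = 50.0, 12.0
I_LOAD = 0.25

p_reg = (V_IN - V_OUT) * I_LOAD   # a linear reg burns the whole drop as heat
print(f"Dissipation: {p_reg:.1f} W")       # -> 9.5 W (serious heatsink needed)
print(f"Efficiency:  {V_OUT / V_IN:.0%}")  # -> 24%
[/code]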
Originally posted by mozikluv@Dec 18 2004, 01:18 AM
hi
Using a voltage divider without due consideration of the current requirement renders the circuit largely useless. Since you are dropping about 75% of the original voltage, you have to use high-value resistors, which also limit the current below what your bulb needs. The approach wastes energy too, since a lot of heat has to be dissipated just to deliver power at 12 V, which translates to poor efficiency.
Likewise, using a linear regulator like the LM7812 is also inefficient; a lot of heat has to be dissipated, and the maximum input voltage of a standard 7812 is 35 V, well below your 50 V source.
Using a switching regulator would do the job, as battousai suggested.
However, the cost of building a switching regulator, as against buying a 12 volt transformer, would also have to be considered.
[post=4179]Quoted post[/post]
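To put numbers on that comparison, here is the same hypothetical 12 V / 250 mA load fed two ways; the 85% figure is a typical small buck-converter efficiency, assumed rather than measured:

[code]
# Rough comparison: linear drop vs. switching (buck) regulator.
# Load: hypothetical 12 V, 0.25 A lamp. Buck efficiency of 85% is assumed.

V_IN, V_OUT, I_LOAD = 50.0, 12.0, 0.25
P_LOAD = V_OUT * I_LOAD          # 3.0 W actually delivered to the lamp

p_linear_in = V_IN * I_LOAD      # linear: input current equals load current
p_buck_in = P_LOAD / 0.85        # buck: input power = output power / efficiency

print(f"Linear: {p_linear_in:.1f} W in, {p_linear_in - P_LOAD:.1f} W wasted as heat")
print(f"Buck:   {p_buck_in:.2f} W in, {p_buck_in - P_LOAD:.2f} W wasted as heat")
# -> Linear: 12.5 W in, 9.5 W wasted;  Buck: 3.53 W in, 0.53 W wasted
[/code]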