# Resistor

Discussion in 'General Electronics Chat' started by mpangrekar, Sep 11, 2009.

1. ### mpangrekar Thread Starter Member

May 16, 2009
14
0
I have a 24 VDC input supply and I want to drive an LED bulb that runs from a 12V AC/DC supply. I am trying to find the value of resistor to use. I worked out the resistance of the LED bulb by measuring voltage & current (11.34V / 0.51A = 22.23 ohms).

GOAL: FIND THE CORRECT RESISTOR TO GET 12V AT THE BULB PINS

Now, I tried a few Vishay aluminum power resistors and got the following values (see attached file). The MR16 LED bulb is rated 5W, 12V AC/DC (datasheet attached); it can accept 12V AC or DC and has a rectifier to convert to 12VDC if AC is supplied.

I also tried a 12V, 5W halogen bulb and measured 13.13V at the bulb pins (why the difference in this value?). For whatever reason, I cannot get 12V. Am I doing anything wrong? Do I have to use exactly 22.23 ohms?
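For reference, the basic series-resistor arithmetic can be sketched as follows. This treats the bulb as a fixed resistive load drawing 0.51 A at 12 V, an assumption the later replies call into question:

```python
# Series-resistor sizing sketch. Assumes the bulb behaves like a fixed
# resistive load drawing 0.51 A at ~12 V (later replies dispute this).
V_SUPPLY = 24.0   # V, DC input supply
V_BULB = 12.0     # V, desired voltage at the bulb pins
I_BULB = 0.51     # A, measured bulb current at ~12 V

# Voltage the series resistor must drop, and the value that drops it at 0.51 A
v_drop = V_SUPPLY - V_BULB          # 12 V
r_series = v_drop / I_BULB          # ~23.5 ohm

# Power dissipated in the resistor at that operating point
p_resistor = v_drop * I_BULB        # ~6.1 W

print(f"R = {r_series:.1f} ohm, P = {p_resistor:.1f} W")
```

Note that this comes out around 23.5 ohms, not the 22.23-ohm load value: using the load resistance as the dropper only yields exactly half the supply voltage if the two are equal, and any change in bulb current shifts the voltage at the pins.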

Second question: the resistor I am trying has an aluminum heatsink on it, and even the 25W part gets hot. If I use a 50W one, will it run cooler because of the bigger heatsink? Thanks. Any help is appreciated.

Mayur

2. ### Audioguru New Member

Dec 20, 2007
9,411
896
If you look at the datasheet for the "25W" aluminum resistor, you will see that it can dissipate only about 4W on its own and needs to be bolted to a pretty big heatsink to handle the full 25W.

But with only 12V across 22 ohms it dissipates only about 6.5W (12² / 22).

Resistors are made to get very hot. Hot enough to burn you, melt plastic and scorch a pcb.

3. ### bobbyrae Active Member

May 14, 2009
42
1
I was looking at burned-out lights in my audio receiver and ran across a similar situation. They have a 22V supply, but the bulbs are 12V! It seemed to me that this would burn out the bulbs in no time, but the strategy was to put two bulbs in series so that the voltage drop across each would be 11V. It seems to work.

So if you use two LEDs in series, you should be OK and you won't be needing a resistor.

4. ### eblc1388 Senior Member

Nov 28, 2008
1,542
102

Your LED bulb is specified for AC use, and there is no mention in the datasheet of its suitability for running from 12V DC.

The three transformers recommended by the manufacturer all output AC voltages, too.

A resistor only drops the calculated voltage when the calculated current is flowing. Anything can happen if you place 24V DC across the bulb.

5. ### mpangrekar Thread Starter Member

May 16, 2009
14
0
Thank you for your response! I have spoken to the manufacturer of the LED bulb; they say it can work with 12V AC & DC.

6. ### rjenkins AAC Fanatic!

Nov 6, 2005
1,015
69
The LED lamp assembly very likely has a switch-mode voltage regulator in it to minimise heating. The LED itself probably drops around 3.5 - 4V so a resistive dropper would dissipate an extra 10 Watts (dropping 8V of the 12V supply).

If it does include this switch mode regulator, it means it has a negative resistance curve; the supply current will drop as the voltage increases to keep a constant power output from the LED.

You cannot feed this with a simple resistor circuit; it will never be stable.
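The instability rjenkins describes can be seen by intersecting the resistor's load line with a constant-power curve. A hypothetical sketch, using an assumed 6 W lamp input power and a 22-ohm dropper (neither figure is from the datasheet):

```python
import math

# Operating points of a series resistor feeding a constant-power load.
# Hypothetical numbers: 24 V supply, 22 ohm dropper, and a lamp modelled
# as drawing a constant 6 W regardless of voltage (simple switcher model).
VS, R, P = 24.0, 22.0, 6.0

# Load line:       I = (VS - V) / R
# Constant power:  I = P / V
# Setting them equal gives:  V**2 - VS*V + P*R = 0
disc = VS**2 - 4 * P * R
v_hi = (VS + math.sqrt(disc)) / 2
v_lo = (VS - math.sqrt(disc)) / 2
print(f"Two operating points: {v_hi:.1f} V and {v_lo:.1f} V")
```

With these numbers the circuit has two mathematical operating points (roughly 15.5 V and 8.5 V), neither of them the intended 12 V, and at the low-voltage point a small dip in voltage demands even more current, so the circuit will not settle where you designed it to.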

You will have to create some form of voltage regulator (eg. a zener diode plus an emitter follower transistor) to give a stable 12V to the lamp.

Edit:
Looking back at the lamp current - 0.51A - it must use an internal switcher, since the LED itself must draw more than 1A to be rated 5W (assuming the internal LED voltage is around 3.5V).
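That inference can be checked with the thread's own numbers (the 3.5 V forward voltage is rjenkins' stated assumption, not a datasheet value):

```python
# Sanity check: a 5 W LED at ~3.5 V forward voltage needs ~1.4 A,
# yet the lamp only draws 0.51 A from its 12 V input -- so the lamp
# must be converting internally (stepping voltage down, current up).
P_LED = 5.0       # W, lamp rating
V_LED = 3.5       # V, assumed LED forward voltage (rjenkins' assumption)
I_SUPPLY = 0.51   # A, measured at the 12 V input

i_led = P_LED / V_LED            # ~1.43 A at the LED itself
print(f"LED current ~{i_led:.2f} A vs {I_SUPPLY:.2f} A at the input")
assert i_led > I_SUPPLY          # consistent with an internal converter
```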

You could use just a high-power 12V zener to drop the excess voltage, eg:
http://uk.rs-online.com/web/search/searchBrowseAction.html?method=getProduct&R=0856314

Or, the high-tech approach: use something like a National Semiconductor 'Simple Switcher' IC to give a 12V supply with very little power loss and heat production. (It would draw a lower input current at 24V than the lamp current at 12V.)
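To illustrate that parenthetical claim, a rough estimate; the 90% efficiency figure is an assumption for illustration, not from any converter datasheet:

```python
# Step-down converter input-current estimate, assuming 90% efficiency.
V_IN, V_OUT, I_OUT = 24.0, 12.0, 0.51
EFF = 0.90  # assumed converter efficiency (illustrative)

p_out = V_OUT * I_OUT            # ~6.1 W delivered to the lamp
i_in = p_out / (EFF * V_IN)      # ~0.28 A drawn from the 24 V supply

print(f"Input current ~{i_in:.2f} A, vs {I_OUT:.2f} A lamp current")
```

Roughly 0.28 A from the 24 V rail versus 0.51 A into the lamp, and almost none of the difference burned off as heat, unlike the ~6 W a resistive dropper would dissipate.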

Last edited: Sep 15, 2009