Actual Power vs. Apparent Power in LEDs

Discussion in 'General Electronics Chat' started by steveparrott, Dec 29, 2010.

  1. steveparrott

    Thread Starter Active Member

    Feb 14, 2006
    Please help me understand how to calculate actual power consumption of an LED device.

    An LED lighting fixture mfg. is taking some flak from their customers because they publish a wattage rating for the LED light of 8.5W and a volt-amp rating of 12VA (incorporating the power factor). They instruct the installer to base transformer size (it's a 12V AC system) on the 12VA. In other words, a ten-light system would require a 120W transformer.

    The mfg. goes on to say that the actual consumption is 8.5W for each fixture - this is what the homeowner pays in electrical cost.

    Is this true? Please explain.
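If it helps to see the arithmetic behind the question, here is a minimal Python sketch using the figures from the post (the variable names are mine):

```python
# Figures from the manufacturer's spec sheet (per fixture).
real_power_w = 8.5    # published wattage: what the energy meter records
apparent_va = 12.0    # published volt-amp rating: what the transformer must supply
fixtures = 10

power_factor = real_power_w / apparent_va   # ~0.71
transformer_va = fixtures * apparent_va     # size the transformer on VA
billed_load_w = fixtures * real_power_w     # energy billing is on real watts

print(f"power factor:    {power_factor:.2f}")    # 0.71
print(f"transformer:     {transformer_va:.0f} VA")  # 120 VA
print(f"billed load:     {billed_load_w:.0f} W")    # 85 W
```

This is just the definition of power factor (real power divided by apparent power); whether the utility actually bills on the 8.5 W is the question the rest of the thread addresses.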
     
  2. Kermit2

    AAC Fanatic!

    Feb 5, 2010
    8.5 is the RMS value of a 12 V peak AC voltage, so at 1 amp RMS that works out to about 8.5 W.

    If the LED draws 1 amp RMS (for example), it will pull 1.4 amps at the peak of the AC wave and less before and after the peak.

    0.707 is the multiplier to convert peak voltage to RMS voltage.

    We need more info on EXACTLY what the 12 V is referring to: peak AC voltage or RMS voltage.
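The peak-to-RMS conversion above, as a sketch (this assumes a sinusoidal waveform; the 0.707 multiplier is 1/√2):

```python
import math

v = 12.0
v_rms_if_peak = v / math.sqrt(2)    # ~8.49 V RMS, if "12 V" means peak
v_peak_if_rms = v * math.sqrt(2)    # ~16.97 V peak, if "12 V" means RMS

print(round(v_rms_if_peak, 2))   # 8.49
print(round(v_peak_if_rms, 2))   # 16.97
```

Which direction applies is exactly the ambiguity the post is asking about.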
     
  3. steveparrott

    Thread Starter Active Member

    Feb 14, 2006
    Sorry, I don't know that answer.

    As an example of actual test values of another LED:

    Input voltage: 12.0 volts AC
    Input current: 0.262 amps
    Input power: 2.86 watts
    Power factor: 0.910

    Again, the question is: is the actual energy consumption (in this example) 2.86W, or does the power factor change that?
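Those test values hang together under the standard definitions (apparent power is volts times amps; power factor is watts divided by volt-amps). A quick check:

```python
v_rms = 12.0    # input voltage, V AC
i_rms = 0.262   # input current, A
p_real = 2.86   # input power, W

apparent_va = v_rms * i_rms        # 3.144 VA
power_factor = p_real / apparent_va  # ~0.910, matching the published figure

print(round(apparent_va, 3))    # 3.144
print(round(power_factor, 3))   # 0.91
```

So the 2.86 W and 0.910 PF figures are consistent with each other; the remaining question is which number the meter bills on.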
     
  4. Kermit2

    AAC Fanatic!

    Feb 5, 2010
    AC power is VARIABLE, meaning a set amount of power doesn't flow continually in the circuit. The amount of current goes up and down, following the voltage impressed across the LEDs.

    RMS is a way of giving AC voltages an AVERAGE amount of voltage for use in calculations. It gives you the DC equivalent power the AC voltage will provide. The two numbers are used in totally different ways in calculations.

    As I said before, 12 VAC at one amp would give 8.5 watts of power

    UNLESS

    We meant 12 VAC RMS at one amp, which would give us 12 watts of power.

    (simple version)
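One way to see the distinction numerically (a sketch assuming a sinusoidal, resistive load): average the instantaneous power v(t)·i(t) over one cycle.

```python
import math

# A 12 V *peak* sine wave driving a resistive load at 1 A RMS.
v_peak = 12.0
i_peak = 1.0 * math.sqrt(2)   # 1 A RMS is ~1.414 A peak

n = 100_000  # samples across one full cycle
p_avg = sum(
    v_peak * math.sin(2 * math.pi * k / n)
    * i_peak * math.sin(2 * math.pi * k / n)
    for k in range(n)
) / n

print(round(p_avg, 2))   # 8.49 -- the "8.5 watts" case above
```

If the 12 V were RMS instead, the same average would come out to 12 W, which is the "UNLESS" case.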
     
  5. steveparrott

    Thread Starter Active Member

    Feb 14, 2006
    I appreciate the explanation but I still don't know what to tell the homeowner. Do you think the mfg is correct when he says 8.5W is the actual power consumption - not 12W (which is what the transformer is sized to)?
     
  6. Ron H

    AAC Fanatic!

    Apr 14, 2005
    From what I can tell, most power companies only add a charge if average overall power factor (at the meter) is less than 90%. Otherwise, the customer is billed for real power only, not volt*amps.
     
  7. marshallf3

    Well-Known Member

    Jul 26, 2010
    Size it for 12 W per fixture and be sure to run an ample gauge of wire to them. You can actually improve on the consumption by putting 1 uF non-polarized caps at each bulb connection, as that will greatly reduce the harmonics in the system, which would otherwise end up fighting each other along the way.

    Manufacturers will tend to make things look better than they are in real life, kind of like a 600 W UPS. That "600 W" is really VA, and PC power supplies (until recently) have horrible power factors, so that 600 W is more like 450 W max in real life. APC has a discussion page on this somewhere but it isn't easy to find.

    [EDIT:] Found it but it's still a bit misleading in their favor. The newer computer power supply manufacturers that are claiming active power factor correction are only guaranteeing that they're 80% or better. http://www.ptsdcs.com/whitepapers/12.pdf
    Mind you, I'm talking about the actual power supplies in the computers (loads) themselves, not the UPS power factors which are discussed here.

    When in doubt overdesign a bit but don't go overboard unless you sense a need to. In your case you may end up wanting to add more fixtures at a later time and the best values I've seen on 12V lighting transformers were from Home Depot or Lowe's where you can get a huge outdoor one at a minimal price compared to what most others are selling. No reason you can't use it indoors.

    Kermit came up with an interesting point: 12/8.5 is almost exactly 1.414, so I'd say the peak power draw would be 12 W, but averaged over time it would be 8.5 W. This does not, of course, factor in the losses of the power supply or the wiring.
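The 12/8.5 observation can be checked against √2 directly (illustrative):

```python
import math

print(round(12 / 8.5, 3))      # 1.412
print(round(math.sqrt(2), 3))  # 1.414 -- close enough to suggest peak vs. RMS
```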
     
    Last edited: Dec 29, 2010
  8. steveparrott

    Thread Starter Active Member

    Feb 14, 2006
    So, are you saying that in my example the homeowner is charged for the 8.5W and not the 12VA? The mfg. is correct?
     
  9. Ron H

    AAC Fanatic!

    Apr 14, 2005
    That's my understanding, as long as the average PF for his entire house for the billing cycle is over 90%. Businesses may have slightly different rules, but I believe they are similar. I just found this by Googling. You can do the same.
     
  10. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    You can get Kill-A-Watt meters for around $20 now; they plug into an outlet, then you plug an accessory into the Kill-A-Watt. It is essentially a portable kWh meter. It shows RMS voltage and current, and keeps a running total of energy (kWh) or an instantaneous reading of power (watts).

    These are also handy to have around for other purposes, but in this case, they do show what the power meter will show.
     
  11. marshallf3

    Well-Known Member

    Jul 26, 2010
    Doesn't work that way on power factor billing and virtually no homes are even monitored for it.

    Generally, if you go under a certain percentage, you're only penalized for the amount by which you've gone under.

    Let's say I accidentally go under 85% at the business (which I'll never let happen):

    "When the actual power factor as determined by continuous measurement of lagging reactive kilovolt-ampere-hours is less than 85%, the billing demand shall be determined by multiplying the maximum demand, measured by the demand meter during the billing period, by 85 and dividing the product thus obtained by the actual average power factor, expressed in percent, during periods of normal operation of the consumer's equipment."

    It's just too expensive for them to put that type of meter into every home, and it really only comes into play when you've got a large building with a lot of reactive loads in it.
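As a sketch, the quoted tariff clause works out like this (the demand and power-factor numbers below are made up for illustration):

```python
def billing_demand(max_demand_kw, actual_pf_pct, floor_pct=85.0):
    """Gross up the metered demand when average PF falls below the floor."""
    if actual_pf_pct < floor_pct:
        return max_demand_kw * floor_pct / actual_pf_pct
    return max_demand_kw

print(billing_demand(100.0, 80.0))  # 106.25 -> billed as if demand were higher
print(billing_demand(100.0, 92.0))  # 100.0  -> no penalty above the floor
```

Note it's the demand charge that gets grossed up, not the energy (kWh) charge, which is why a home with a few low-PF LED fixtures never sees this.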
     
  12. marshallf3

    Well-Known Member

    Jul 26, 2010
    Yea, all you'll probably be paying for is the 8.5 W average power per fixture, but as I mentioned, plan on designing for 12 W each, as 60 times per second it's going to hit that or higher. Don't try to "squeak by" with a smaller supply; it'll just burn out earlier, and then you'll be out far more than any savings you got by doing so.

    Same thing when buying a new battery for your car, pay the $10 extra for the better one, it'll last an extra year or so.
     