Problem specifying ampere needs because of different readings

Thread Starter

Luffy86

Joined Jan 10, 2013
2
Hello everybody,

I have an LED unit and I want to specify the current requirements for 10 of them. The unit is supplied with 48 V DC through a rectifier, and the current it draws (measured with my multimeter) is 0.84 A DC. Multiplying the two, I work out that the unit consumes 40.32 W. However, I also used a wall-wart-style energy meter on the AC side, and it reads 0.38 A AC and a power consumption of 47 W.
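To put the arithmetic above in one place, here is a minimal Python sketch of the two readings (the variable names are illustrative, not from any real device):

    dc_volts = 48.0    # supply voltage on the DC side
    dc_amps = 0.84     # multimeter reading on the DC side
    dc_power = dc_volts * dc_amps   # 40.32 W consumed by the LED unit

    ac_amps = 0.38     # wall-wart energy meter, AC side
    ac_power = 47.0    # wall-wart energy meter, AC side

    print(f"DC side: {dc_power:.2f} W   AC side: {ac_power:.0f} W")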

This is where I got confused, and I started looking at various power supplies I have here at home. My laptop's power supply is labelled INPUT: 100-240V, 50-60Hz, 1.5A; OUTPUT: 19V DC, 4.74A.

So these are my questions:
a) Which values should I give the electrician so he can work out the current requirements for 10 of these units: the DC or the AC readings?
b) Can you explain in layman's terms what is going on with the different readings, or point me to the relevant literature?

Thank you in advance for your time
Kind Regards
 

mcgyvr

Joined Oct 15, 2009
5,394
A) Obviously the AC readings.
B) Efficiency.
In the process of converting from AC to DC, some energy is lost.
So your conversion process dissipates about 7 W; it's roughly 85% efficient. A 100% efficient converter would show 40 W on the AC side and 40 W on the DC side.
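As a quick sketch, that efficiency figure worked out in Python from the numbers in the first post:

    dc_power = 48.0 * 0.84    # 40.32 W delivered on the DC side
    ac_power = 47.0           # 47 W drawn from the wall, per the energy meter

    efficiency = dc_power / ac_power   # ~0.858, i.e. roughly 85-86%
    loss_watts = ac_power - dc_power   # ~6.7 W dissipated in the conversion

    print(f"efficiency: {efficiency:.1%}, loss: {loss_watts:.2f} W")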
 

Thread Starter

Luffy86

Joined Jan 10, 2013
2
Hello mcgyvr, and thank you for your reply. However, can you explain why it is obviously the AC readings? If I multiply the AC current reading by 10 I get 3.8 A, whereas if I do the same with the DC reading I get 8.4 A. Isn't it the latter that specifies the current needs of those 10 units I have?
 

mcgyvr

Joined Oct 15, 2009
5,394
Luffy86 said: "Isn't it the latter that specifies the current needs of those 10 units I have?"
NO..
Why would the electrician, who is trying to determine how to properly size the AC supply, be concerned about the DC side?

Your fixtures will require 3.8 A from the AC supply, NOT 8.4 A. He (the electrician) JUST needs to know the AC voltage and AC current your devices require.

Frankly, you should not be "specifying" anything, IMO, as it's very clear you are out of your league. This is the absolute basics of electricity (no offense).
Please don't tell me you designed this LED fixture...
 

mrmount

Joined Dec 5, 2007
59
I am not sure exactly what the electrician is doing, but if he is doing the wiring to the input of the LED units (i.e. the AC input), he will need the AC amperage.
 

wayneh

Joined Sep 9, 2010
17,496
Luffy86 said: "If I multiply the AC current reading by 10 I get 3.8 A, whereas if I do the same with the DC reading I get 8.4 A."
Perhaps you are being confused by the relationship of power to current. Power is conserved: the power drawn on the AC side equals the DC output power plus the conversion loss. And because power is voltage times current, for a given power the current is inversely proportional to the voltage.

Your converter steps the voltage down, so the current available on the DC side is greater than the current drawn on the AC side, while the DC power is a bit less than the incoming power due to losses. The number you need to give the electrician is the AC current your "black box" requires; he doesn't need to know anything about its innards.
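That reconciliation can be checked numerically. A minimal sketch, assuming a nominal 120 V mains supply (the actual mains voltage is not stated in the thread):

    # Reconciling the two current readings via P = V * I, so I = P / V.
    mains_volts = 120.0   # assumed nominal mains voltage; an illustration only
    ac_power = 47.0       # measured on the AC side
    dc_volts = 48.0
    dc_power = 40.32      # derived on the DC side

    ac_amps = ac_power / mains_volts   # ~0.39 A, close to the 0.38 A meter reading
    dc_amps = dc_power / dc_volts      # 0.84 A, the multimeter reading

    print(f"AC side: {ac_amps:.2f} A   DC side: {dc_amps:.2f} A")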
 

strantor

Joined Oct 3, 2010
6,782
I could theoretically make a device that, when plugged into a 120 VAC wall outlet, draws only 1 A. Inside the device is a step-down transformer with a 120:1 turns ratio: 120 V in, 1 V out. I can draw 120 A at 1 V from the secondary while drawing 1 A at 120 V on the primary. I rectify this 1 V and call it a 120 A DC power supply. So should I size my breaker for 120 A? NO, because the device only draws 1 A from the wall socket. It does not matter in the least what currents flow after that point.
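The same thought experiment as a sketch, assuming an ideal (lossless) transformer:

    # Ideal transformer: voltage divides by the turns ratio, current
    # multiplies by it, so power is the same on both sides.
    turns_ratio = 120           # 120:1 step-down
    primary_volts, primary_amps = 120.0, 1.0

    secondary_volts = primary_volts / turns_ratio   # 1 V
    secondary_amps = primary_amps * turns_ratio     # 120 A

    assert primary_volts * primary_amps == secondary_volts * secondary_amps  # 120 W both sides
    print(f"Breaker sees only the primary: {primary_amps:.0f} A at {primary_volts:.0f} V")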
 

gerty

Joined Aug 30, 2007
1,305
strantor said: "I could theoretically make a device that, when plugged into a 120 VAC wall outlet, draws only 1 A... It does not matter in the least what currents flow after that point."
You have described a Weller (or other brand) soldering gun. We replaced the tip on our 240 W gun with a loop of #10 solid wire, clamped an Amprobe around that loop, and measured over 100 A through it. That really gets the students thinking. :D

Edit: the soldering gun does not have a diode.
 

mcgyvr

Joined Oct 15, 2009
5,394
And of course the reverse is also true: you can put 120 V AC at 2 A into the AC side and get only a fraction of that current on the DC side, but at a higher DC voltage.

Watts are watts: the output watts should always equal the input watts minus the conversion loss. The amperages don't match because the voltages differ (V x A = W).
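A sketch of that reverse case, with a hypothetical 400 V DC output and an illustrative 85% efficiency (neither figure is from the thread):

    # Stepping *up* to a higher DC voltage means *less* DC current.
    ac_volts, ac_amps = 120.0, 2.0
    efficiency = 0.85                    # illustrative figure

    ac_power = ac_volts * ac_amps        # 240 W drawn from the wall
    dc_volts = 400.0                     # hypothetical boosted output voltage
    dc_power = ac_power * efficiency     # 204 W after conversion losses
    dc_amps = dc_power / dc_volts        # 0.51 A, a fraction of the 2 A input

    print(f"in: {ac_amps:.0f} A @ {ac_volts:.0f} V   out: {dc_amps:.2f} A @ {dc_volts:.0f} V")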
 