Is 220 V AC actually more efficient than 110 V AC?

Thread Starter

electronis whiz

Joined Jul 29, 2010
512
I have to do a paper for an English class that evaluates something, and I'm considering the efficiency of 110 V vs. 220 V. But I have seen some info saying they're equal and other info saying 220 V is more efficient. Which is correct?
One of my friends who teaches electricity at a college brought this up when we were talking about energy efficiency. He said that sometime soon power companies are going to start metering by amperage as well as (possibly) watts, because the new CFL and LED bulbs draw fewer watts but roughly equal amps compared to incandescents, and this is making utilities lose money. This doesn't exactly make sense to me, because by Ohm's law, if the amps drawn by a device are the same, the watts should be the same too.
If they do meter by amps someday, would 220 be more efficient?
I am kind of just beginning some research to see if I can make a paper on this work.
I would appreciate an answer to both of these if possible. If I end up doing this I may need some more info, because my area of electronics is usually computer-related or low-voltage DC (12 V or less), so I'm not that good with AC.
 

strantor

Joined Oct 3, 2010
6,781
I have to do a paper for an English class that evaluates something, and I'm considering the efficiency of 110 V vs. 220 V. But I have seen some info saying they're equal and other info saying 220 V is more efficient. Which is correct?
Let's say you are going to install a 3 kW motor in your house and you can choose between a 110 V or a 220 V motor. If you go with the 110 V motor, it will draw about 27 A, therefore requiring a 30 A breaker and a 10 AWG wire run. If you go with the 220 V motor, it will draw about 14 A, so you can install a 15 A breaker and run 14 AWG wire. That's the only "efficiency" gain I can see: "efficiency" on your pocketbook.
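Just to show where those numbers come from, here's a quick back-of-the-envelope sketch in Python (the 3 kW, 110 V, and 220 V figures are from the example above; the breaker and wire sizes are the usual round-up choices, not code advice):

Code:
# Rough current draw for a 3 kW load at 110 V vs 220 V (power factor ignored)
power_w = 3000

for voltage in (110, 220):
    current = power_w / voltage
    print(f"{voltage} V: {current:.1f} A")

# 110 V: 27.3 A -> roughly a 30 A breaker and 10 AWG wire
# 220 V: 13.6 A -> roughly a 15 A breaker and 14 AWG wire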

One of my friends who teaches electricity at a college brought this up when we were talking about energy efficiency. He said that sometime soon power companies are going to start metering by amperage as well as (possibly) watts.
So what do you think they bill us for right now? Volts?

Because the new CFL and LED bulbs draw fewer watts but roughly equal amps compared to incandescents, and this is making utilities lose money.
If a CFL dissipated fewer watts but drew equal amps, that would mean the voltage is lower. That is not the case; they both operate on 120 VAC. The CFL draws fewer amps, and therefore fewer watts.
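To put some illustrative numbers on that (a 60 W incandescent and a 14 W CFL are just example wattages, and a power factor of 1 is assumed for simplicity):

Code:
# Current drawn at 120 VAC by two bulbs of different wattage (power factor ~1 assumed)
voltage = 120

for name, watts in (("60 W incandescent", 60), ("14 W CFL", 14)):
    amps = watts / voltage
    print(f"{name}: {amps * 1000:.0f} mA")

# 60 W incandescent: 500 mA
# 14 W CFL: 117 mA -> fewer watts means fewer amps at the same voltage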
 

deven88

Joined Nov 16, 2010
1
I think the 220 V supply is more energy efficient. Say you have a motor with a winding resistance of R ohms (ignoring the inductance for simplicity). For the same wattage, the 220 V supply needs to deliver less current, so the copper losses in the R-ohm winding will be smaller than with a 110 V supply, per I^2*R. The wires between the utility transformer secondary and the load also have some resistance, which causes additional losses when drawing the higher current in the 110 V case.
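Here is that argument worked through with made-up numbers (a 1 kW load and a 0.5 ohm winding resistance, chosen only to show the scaling):

Code:
# I^2*R copper loss for the same delivered power and the same winding resistance R
R = 0.5    # ohms, hypothetical winding resistance
P = 1000   # watts delivered to the load

for voltage in (110, 220):
    current = P / voltage
    loss = current ** 2 * R
    print(f"{voltage} V: I = {current:.1f} A, copper loss = {loss:.0f} W")

# 110 V: I = 9.1 A, copper loss = 41 W
# 220 V: I = 4.5 A, copper loss = 10 W  (half the current -> one quarter of the I^2*R loss)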

Sorry if I have made any grammatical or spelling mistakes.
 

crutschow

Joined Mar 14, 2008
34,201
If you properly adjust the wire size, then there is no difference in efficiency between 110 V and 220 V. For a given power level, the 110 V line carries twice the current of the 220 V line, so the 110 V wiring must have 1/4 the resistance to keep the I\(^{2}\)R power losses the same. The difference is that larger wire costs more, so wiring a home (or motor) for 110 V is more expensive than wiring it for 220 V (for the same power level and wire losses).
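To make that concrete, here is a sketch with made-up numbers: a 1 kW load, 0.2 ohm of wire resistance for the 220 V run, and 0.05 ohm (one quarter) for the heavier 110 V run:

Code:
# Same 1 kW load; the 110 V run uses wire with 1/4 the resistance of the 220 V run
P = 1000
wire_resistance = {110: 0.05, 220: 0.20}   # volts -> ohms, hypothetical values

for voltage, r_wire in wire_resistance.items():
    current = P / voltage
    loss = current ** 2 * r_wire
    print(f"{voltage} V: I = {current:.1f} A, wire loss = {loss:.1f} W")

# Both cases come out to about 4 W: size the wire accordingly and the losses are equal.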

The reason the power company is concerned about CFLs is that they can have a low power factor and draw reactive amps, which do no work but still cause resistive losses in the power lines. Metering the line for volt-amps instead of just watts would allow the power company to take that into consideration.
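For example (a 14 W CFL with a power factor of 0.6 is assumed here purely for illustration):

Code:
# Real power vs apparent power for a low-power-factor lamp on 120 VAC
real_power = 14        # watts, hypothetical CFL
power_factor = 0.6     # assumed low power factor

apparent_power = real_power / power_factor
line_current = apparent_power / 120
print(f"Real: {real_power} W, apparent: {apparent_power:.1f} VA, line current: {line_current * 1000:.0f} mA")

# Real: 14 W, apparent: 23.3 VA, line current: 194 mA
# The extra (reactive) current does no work in the lamp but still heats the utility's wires.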
 

Thread Starter

electronis whiz

Joined Jul 29, 2010
512
Thanks everyone, this helped me quite a bit. crutschow, it seems like something like that is what I was told, but I couldn't understand how it was explained the first time.
 

WBahn

Joined Mar 31, 2012
29,928
As crutschow says, utility companies are increasingly talking about charging for apparent power either in addition to or instead of real power. I'm guessing that your friend mentioned charging for volt-amps in addition to watts. Both are, strictly speaking, the same units of power, but volt-amps (abbreviated VA) is computed by multiplying the RMS voltage by the RMS current while ignoring the phase. Thus it is "apparent power" and is always equal to or greater than the actual power that is used by the load.
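A small numeric sketch of that definition (the 120 V RMS, 5 A RMS, and 30-degree phase shift are all made-up values):

Code:
import math

# Apparent power ignores the phase between voltage and current; real power does not
v_rms = 120       # volts RMS, assumed
i_rms = 5         # amps RMS, assumed
phase_deg = 30    # current lagging the voltage by 30 degrees, assumed

apparent = v_rms * i_rms                                    # VA
real = v_rms * i_rms * math.cos(math.radians(phase_deg))    # W
print(f"Apparent: {apparent} VA, real: {real:.0f} W")

# Apparent: 600 VA, real: 520 W -> apparent power is never less than real power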

People frequently get all worked up when they hear that they might get charged for power that they aren't actually using. But this is a red herring. In point of fact, they are already getting charged. After all, the rate that the utilities set for every kWh of real energy that a residence uses includes all of the costs of getting that energy there, including the reactive power that is drawn by the home and then shuttled back and forth between the home and the utility, being progressively lost in the lines in between. Up to this point, the utility has merely assumed that all homes are the same in this regard and therefore spread the costs out evenly over each kWh consumed. They do NOT do this with large commercial customers; instead they charge them a premium based on their power factor. This incentivizes the big users of large inductive loads, in particular, to invest in improving their power factor by installing local power factor correction equipment (usually large capacitor banks) in order to reduce their utility bills.

All that is being talked about now is doing the same for residential and small business consumers, as well. The impetus is that, increasingly, the loads used in these situations have significant reactive currents that increase costs and also increase required generating capacity. They could just pass the costs along to everyone, but instead they want to incentivize even individual homeowners to perform power factor correction locally (which, for a household-size load, isn't outrageously expensive). This makes even more sense when you consider that homes vary greatly in how reactive their loads are, and thus simply charging everyone the same rate isn't very fair. So what you would likely see would be a lower rate for real power and a premium added for reactive power, but since you have ways to control how reactive your load is, you can minimize or even eliminate the premium and enjoy paying only the lower real power rate.

The problem in implementing this approach is that present residential meters were very carefully designed to measure only real power. Thus, every homeowner's meter would have to be replaced (and the only practical way to make that happen is for the utility to eat the cost of the upgrade, or to provide an effective incentive for individuals to do it and accept that some will and some won't). A compromise that has been considered is to make the power factor measurement at a higher level and then charge all of the customers served at that level as though they all had that same power factor. This would allow the utility to balance effectiveness against cost. The smaller the group served by a single measurement point, the greater the impact each customer has on their own bill as they take steps to improve their own power factor, and the more likely they are to encourage their neighbors to do likewise; but this increases the number of meters needed compared to taking the measurement higher up. Higher-up measurements, on the other hand, mean that a given individual is more likely to conclude that there is no benefit in spending money to improve their home, because the expected impact on their bill would be too small to justify it.

In the end, I expect new installations will use upgraded meters. For existing meters, I suspect the power companies will start charging something along the lines of the reactive-power rate for customers with real-power-only meters and tell them that, if they install an upgraded meter (likely with some kind of government subsidy, for better or worse), they can get the lower real-power rate for the bulk of their usage.

Sooner or later, it's going to happen.
 

GetDeviceInfo

Joined Jun 7, 2009
2,192
I have to do a paper for an English class that evaluates something, and I'm considering the efficiency of 110 V vs. 220 V. But I have seen some info saying they're equal and other info saying 220 V is more efficient. Which is correct?
Given that you speak only of voltage, there is no difference. Efficiency is largely based on the power-in/power-out relationship, and "power" requires more than voltage information. If you place it into the context of some type of usage, then one could make an argument for one or the other.

Costs are often included in the efficiency equation, even though they're not strictly electrical. High-density power applications can see large cost reductions, whereas upgrading existing services strictly for a voltage conversion may make little sense.
 

Thread Starter

electronis whiz

Joined Jul 29, 2010
512
Thanks WBahn, that cleared up a bit of it. I also have another friend who installed a capacitor box at his church; he said it supplied the motor start-up surges and things like that so they didn't have to come from the utility. At the time I thought this was a waste, because the capacitors would then recharge with what had been used, making it equal with or without them, but I guess this is not so. I'm still not exactly understanding what apparent power is, though.
 

paulktreg

Joined Jun 2, 2008
833
Switched-mode power supplies are certainly more efficient at 230 VAC than at 110 VAC, and just imagine how many of those are in use world-wide!
 

WBahn

Joined Mar 31, 2012
29,928
If you have a resistor as a load and apply a sinusoidal voltage to it, then the current will be sinusoidal and in phase with the voltage. As a result, if you were to calculate the instantaneous power flow at any moment in time, it will always be positive, and energy is always flowing from the source to the load. You can take the voltage waveform and come up with its RMS value, which is nothing more than the value of a DC voltage source that would dump the same average power into that same resistive load. Similarly, you can take the current waveform and come up with its RMS value. If you multiply the two together, you get what appears to be a measure of power. In the case of a purely resistive load, it actually is the real power being delivered to (and dissipated by) the load, and hence, in this special case, real and apparent power are the same.

But if that load is replaced with either a capacitor or an inductor, then what happens is that during part of the cycle the source is supplying power to the load, but the load is storing that energy in either an electric or a magnetic field. During other parts of the cycle, the load is actually supplying power back to the source, drawing it from the field it is stored in. As a result, the voltage and current are not in phase (in fact, for a purely capacitive or purely inductive load, they are 90 degrees out of phase), and if you calculate the instantaneous power, during part of the cycle it is positive and during part of the cycle it is negative. Under these conditions, the real power delivered to the load is zero (it gives back everything it receives), but you still have a voltage waveform, for which you can calculate an RMS value, and you still have a current waveform, for which you can calculate an RMS value, and you can still multiply them together to get something that appears to be a measure of power delivered to the load. In fact, none of that power is permanently delivered to the load; it is shuttled back and forth. This is called "reactive power," and apparent power is the combination of the real power and the reactive power (they combine in quadrature, as described below).

So, in the resistive case, all of the apparent power (units of VA - volt-amps) is real (units of W - watts) and in the capacitor/inductor case, all of the apparent power is reactive (units of VAR - volt-amps (reactive)).

Real loads consist of both a real part (resistance) and a reactive part (capacitance or inductance), and hence the apparent power consists of both real power and reactive power. The result is that the voltage and current sinusoids (assuming they are still sinusoids, which is part of the problem) are partially in phase and partially out of phase. Because real power is always 90 degrees out of phase with reactive power, the two add like the sides of a right triangle: the square of the apparent power is the sum of the squares of the real and reactive powers.

The power factor is defined as the ratio of real power to apparent power.
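Putting the last two paragraphs into numbers (800 W of real power and 600 VAR of reactive power are made-up values):

Code:
import math

# Power triangle: apparent^2 = real^2 + reactive^2; power factor = real / apparent
real = 800        # W, assumed
reactive = 600    # VAR, assumed

apparent = math.hypot(real, reactive)
power_factor = real / apparent
print(f"Apparent: {apparent:.0f} VA, power factor: {power_factor:.2f}")

# Apparent: 1000 VA, power factor: 0.80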

With all of this in mind, consider the following:

I have a large industrial motor that has a significant inductive component. If that thing is running, I am having to charge up and then discharge the inductor's magnetic field twice each cycle. This means that twice each cycle the utility has to supply not only the current I am going to convert into real work, but also the current needed to charge the field. They also have to absorb that current back twice each cycle as my motor's inductance discharges. So the energy associated with that has to travel through the utility's transmission lines from the generating facility to my motor 120 times a second (here in the U.S. with our 60 Hz mains), and in doing so some of it is lost as heat in the lines and transformers along the way; each cycle, the generating facility has to supply new energy to make up for those losses. An alternative is to put a bank of capacitors near the motor, so that the motor can dump that energy into the nearby capacitor bank and then turn around and draw it back out. There will still be losses in doing so that the generator has to make up each cycle, but far, far less than before. For the most part, the utility now only needs to supply the reactive energy once when my motor starts up and then absorb it once when my motor shuts down, instead of thousands of times for each minute that the motor is running.
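A rough sketch of why that matters to the utility, with all numbers hypothetical (a motor delivering 10 kW at 480 V with a power factor of 0.7, fed through 0.5 ohm of line resistance, before and after local correction to a power factor of about 1):

Code:
# Line loss for the same real power, before and after local power factor correction
real_power = 10_000   # W of real work done by the motor, assumed
voltage = 480         # V at the motor, assumed
r_line = 0.5          # ohms of line and transformer resistance, assumed

for pf in (0.7, 1.0):
    line_current = real_power / (voltage * pf)
    line_loss = line_current ** 2 * r_line
    print(f"PF {pf}: line current {line_current:.1f} A, line loss {line_loss:.0f} W")

# PF 0.7: line current 29.8 A, line loss 443 W
# PF 1.0: line current 20.8 A, line loss 217 W
# With local correction, the reactive current circulates between the motor and the
# nearby capacitor bank instead of traveling over the utility's lines.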
 

Thread Starter

electronis whiz

Joined Jul 29, 2010
512
Switched-mode power supplies are certainly more efficient at 230 VAC than at 110 VAC, and just imagine how many of those are in use world-wide!
Why? On 110 it is just turned into 220 internally by the voltage-pump (doubler) capacitors. Wouldn't that make them equal? Or does this voltage pump have a lot of loss? If not, why are they different?
 

paulktreg

Joined Jun 2, 2008
833
Why? On 110 it is just turned into 220 internally by the voltage-pump (doubler) capacitors. Wouldn't that make them equal? Or does this voltage pump have a lot of loss? If not, why are they different?
I've often wondered this myself, but it's definitely true for high-end ATX power supplies, with a difference of up to 5%. Most switched-mode power supplies rectify and smooth the AC to produce a DC voltage that is then switched through a transformer, rectified and smoothed again to give the required DC rail(s). If you start with 310 VDC (from 220 VAC) rather than 155 VDC (from 110 VAC), I guess less current has to be switched for the required output, and hence losses are reduced?
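For what it's worth, those two bus voltages are just the rectified peaks of the mains; here is a rough sketch (the 500 W output figure is made up, and efficiency and power factor are ignored):

Code:
import math

# Rectified and smoothed DC bus is roughly Vrms * sqrt(2); for the same output power,
# the higher bus carries proportionally less current on the primary side
output_power = 500    # W, hypothetical supply

for v_ac in (110, 220):
    v_bus = v_ac * math.sqrt(2)
    i_primary = output_power / v_bus   # very rough average primary-side current
    print(f"{v_ac} VAC -> DC bus ~{v_bus:.0f} V, primary current ~{i_primary:.1f} A")

# 110 VAC -> DC bus ~156 V, primary current ~3.2 A
# 220 VAC -> DC bus ~311 V, primary current ~1.6 A
# Less current through the switch and transformer primary means lower conduction losses.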

If someone knows the definitive answer I'd be interested to know.
 