# Cost of Electricity

Discussion in 'Off-Topic' started by steev, Jun 12, 2011.

1. ### steev Thread Starter New Member

Jun 12, 2011
6
0
Hi everyone,
I'm a Brit living in Texas...
This is my first post on here; I hope to become a regular member!
I have a project that I want to start soon, but it depends on the answer to this question...

OK, let's say you plug in your phone to recharge and it takes 2 hours. Let's say the adaptor is marked 7 volts (meaning it steps the 110 volts from the wall down to the 7 volts the phone needs to recharge).
Does that mean that for those 2 hours the electricity company charges me for 2 hours at 110 volts, or 2 hours at 7 volts?

The reason I ask is that every December we put up all the pretty Christmas lights, and every year the December electric bill is enormous.
The lights usually run at low voltages and often come with an adaptor. If the adaptor is for 12 volts, or 14 volts for example, does that mean I'm paying for 110 volts all month (even though I'm "using only 12 volts") each night the lights are on?
Or do I pay for only the 12 volts, and it simply adds up to a big bill?
Hope you get the point of my question!
Have a super day...
Regards,
Steve

2. ### magnet18 Senior Member

Dec 22, 2010
1,232
125
You don't pay for voltage; you pay for power, which is voltage multiplied by current. So you pay for the voltage the lights draw times the current they draw, plus the converter's losses. Or, more accurately, you pay for the voltage at the wall multiplied by the current the converter draws from it.

3. ### steveb Senior Member

Jul 3, 2008
2,433
469
Yes, the above is basically correct, but time has to be factored in (obviously).

Basically you pay for energy, which is power multiplied by time. The power is voltage times current.

You don't have to worry that you are overpaying just because the voltage is lower, because most converters are reasonably efficient. The AC side, which is what the electric company charges you for, carries high voltage at low current. The DC side, which actually powers the lights, carries low voltage at proportionally higher current. The power is the same on both sides, aside from the conversion loss, which adds a little to the cost.

However, running many lights for a long time will obviously increase your electric bill. Consider a 100 W light bulb. If you run it for 10 hours, that's 1 kWh, which costs something like 10 to 20 cents. Running 100 small 1 W bulbs for 10 hours would cost the same. If there is an AC-to-DC conversion in the process, then maybe you pay 20% more because of losses, but it's not going to be 2, or 10, or 100 times more.
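As a quick sanity check, the arithmetic above can be sketched in a few lines of Python (the $0.15/kWh rate is an assumption for illustration, not a figure from this thread):

```python
# Energy (kWh) = power (W) x time (h) / 1000; cost = energy x rate.

def energy_kwh(power_watts, hours):
    """Energy in kilowatt-hours."""
    return power_watts * hours / 1000.0

RATE = 0.15  # dollars per kWh -- assumed, check your own bill

one_big_bulb = energy_kwh(100, 10)     # one 100 W bulb for 10 hours
many_small = energy_kwh(100 * 1, 10)   # one hundred 1 W bulbs for 10 hours

print(one_big_bulb, many_small)        # both come to 1.0 kWh
print(round(energy_kwh(100, 10) * 1.2 * RATE, 3))  # with ~20% conversion loss
```

Either way it's the same 1 kWh of energy, and the conversion loss only nudges the cost up, it doesn't multiply it.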

Last edited: Jun 12, 2011
4. ### PatM Active Member

Dec 31, 2010
82
87
You pay for the power used, not the voltage.
As for a small transformer putting out 7 volts at a small current, there is nothing to worry about.
I have an interface for my Amateur radio that is turned on continuously.
It consumes less than 0.2 watts.
It could run for about 60 years for the price of a power switch, which is why no switch is supplied with the unit.

5. ### steev Thread Starter New Member

Jun 12, 2011
6
0
Hi Again all,
Thanks for the replies, much appreciated...
Yes, I realise I pay for power, not voltage. An analogy is a 5-lane highway: the resistor blocks 4 lanes, so the 5th lane is the only one being used... and in those 4 lanes the electricity is backed up all the way to the meter. Does the meter then turn at "1-lane" speed because only 1 lane is "moving", or does it still turn at "5-lane" speed regardless of adaptors, low-voltage lights, etc.?
Any clearer? Or perhaps it just doesn't work that way!
Steve

6. ### steveb Senior Member

Jul 3, 2008
2,433
469
No, it doesn't work that way.

You pay for power multiplied by time, if you want to keep it simple. Lights need power, and the time you keep them on determines the cost for the light you want to produce.

It's difficult to compare low-voltage DC with high-voltage AC lighting, because it all depends on the light source and the conversion efficiency. For example, if you use a DC adaptor, you lose a little efficiency in the conversion from AC to DC. However, if you use LEDs instead of incandescent bulbs, the efficiency of the light source is much, much better, so you may come out ahead overall.
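To make that concrete, here's a rough Python sketch of the trade-off; every number in it (bulb wattages, adaptor efficiency, rate, hours) is a made-up illustration, not data from this thread:

```python
# Compare a string of incandescent mini-bulbs with an LED string fed
# through an adaptor. All figures below are hypothetical.

RATE = 0.15      # $/kWh, assumed
HOURS = 6 * 31   # six hours a night through December

incandescent_watts = 50 * 0.5       # fifty 0.5 W mini-bulbs
led_watts = (50 * 0.07) / 0.85      # fifty 0.07 W LEDs, 85%-efficient adaptor

def december_cost(watts):
    """Cost in dollars of running this load for the whole month."""
    return watts * HOURS / 1000.0 * RATE

print(round(december_cost(incandescent_watts), 2))
print(round(december_cost(led_watts), 2))
# The LEDs win easily, even after the adaptor's conversion loss.
```

The point isn't the exact dollar figures, it's that the light-source efficiency dominates the small adaptor loss.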

But the bottom line is that you pay for all the energy used, whether it comes out as light or heat. You don't have to worry that you are wasting huge amounts of money just because the lights are low-voltage DC. However, your bill is high, which means you are running a lot of lights for long periods.

7. ### magnet18 Senior Member

Dec 22, 2010
1,232
125
A better analogy is a gear ratio.
Half the speed, twice the torque, same energy.
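In electrical terms: half the voltage, twice the current, same power. A tiny sketch with made-up numbers (the 24 W load is purely illustrative):

```python
# An ideal transformer trades voltage for current the way a gear trades
# speed for torque: the product (power) stays the same.

v_in, v_out = 110.0, 12.0   # mains in, adaptor out
p_load = 24.0               # hypothetical 24 W string of lights

i_out = p_load / v_out      # current on the 12 V side: 2.0 A
i_in = p_load / v_in        # current drawn from the wall: ~0.22 A

# Same power on both sides (losses ignored):
print(round(v_in * i_in, 9), v_out * i_out)   # both 24.0
```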

8. ### DumboFixer Active Member

Feb 10, 2009
219
34
The cost of the electricity used will be calculated at the point where it enters the property, before the main fuse/switch/circuit-breaker board, and will be billed in kilowatt-hours (kWh).

So in the US you'll get around 9.1 amps for 1 hour at 110 V for 1 kWh, and in the UK around 4.5 amps for 1 hour at 220 V for 1 kWh. Although the voltage and current differ, the energy used is the same.

The meter neither knows nor cares what transformers etc. you have plugged in; it is measuring the amount of energy you are using.
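Putting numbers on that: a kilowatt-hour is 1000 W delivered for one hour, so the current needed at each mains voltage just follows from I = P / V. A quick sketch:

```python
# Current that delivers 1 kWh in one hour at two mains voltages.
# I = P / V, with P = 1000 W.

for volts in (110.0, 220.0):
    amps = 1000.0 / volts
    print(f"{volts:.0f} V: {amps:.2f} A for one hour = 1 kWh")
# 110 V needs about 9.09 A; 220 V about 4.55 A -- same energy, same charge.
```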

9. ### steev Thread Starter New Member

Jun 12, 2011
6
0
Got it!
Took a bit of time, but we got there!
Thanks to everyone for all your replies, it's much appreciated.
Steve