Household Power at the Outlet

Thread Starter

Gastube

Joined Jul 13, 2012
1
Hi all, hope someone can help me understand what's going on here.
We recently moved out to the country from a small town in Ontario.
In town I knew the voltage at the outlet was around 105 volts.
At our new place I noticed the voltage is around 125 volts.

This question is about trying to save a buck, because both of us will be retiring soon on a very small pension, so I'm looking at ways to cut costs.
Correct me if I'm wrong, but because of the higher voltage in the country, devices, what I like to call dumb devices, like lights, the stove, motors, heaters, the non-electronic stuff with no built-in voltage regulation, would use more power to run at the higher voltage. Yes or no? I never had any issues in town with the operation of any of my appliances. Would running at a lower voltage not save me some $$$? Ohm's law says I = E/R, so if E is higher, I is more.
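
Here's the back-of-the-envelope version of what I mean, treating one of those dumb loads as a plain fixed resistance (the resistance value is just an example, not a measured appliance):

```python
# Treat a "dumb" resistive load as a plain fixed resistance.
R = 15.0  # ohms -- example value only, not a measured appliance

for volts in (105.0, 125.0):
    amps = volts / R       # Ohm's law: I = E / R
    watts = volts * amps   # P = E * I = E^2 / R
    print(f"{volts:.0f} V -> {amps:.1f} A, {watts:.0f} W")

# Same resistance, higher voltage: more current AND more power,
# so the 125 V place works these loads harder than the 105 V place did.
```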

Also, does anyone know if they sell a voltage regulator for the home?

Thanks

Gas
 

Wendy

Joined Mar 24, 2008
23,421
Yes and no. 105 VAC is way low and can damage some appliances over the long term; that condition is called a brownout.

A nominal 115 VAC service usually averages around 117 VAC. 125 VAC is a bit high, but not excessively so.

In general, devices take the current they need. You will not save much money by lowering the voltage, and one appliance repair would blow any savings way out of the water.
 

GetDeviceInfo

Joined Jun 7, 2009
2,196
You won't notice a difference. Your motor loads will probably run at slightly higher efficiency. Reduce your lighting wattage with fluorescents, and supplement your heating with wood unless you're on gas. Throw away the TV.
 

gerty

Joined Aug 30, 2007
1,305
I think you have that twisted a little. The higher voltage uses less current.

For example, a 100 watt bulb on 125 volts: I = P/E = 100/125 = 0.8 amps.
A 100 watt bulb on 105 volts: I = P/E = 100/105 = 0.95 amps.
 

wayneh

Joined Sep 9, 2010
17,498
Forget it and move on. :( I can't imagine why you had low voltage in town, but that was either a bad measurement, old wiring, or an anomaly of some kind. There's no way you want to mess with your household voltage.

It IS true that purely resistive devices like light bulbs will take a bit less power at lower voltage. But they may actually become less efficient (less light per watt) under those conditions. And MANY things in your home could be damaged by low voltage. It's just a dead end, IMHO, even if you could control your voltage. And you can't, not easily.
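
To put rough numbers on the "less light per watt" point: the usual incandescent rules of thumb say power scales roughly as V^1.6 and light output roughly as V^3.4 of rated voltage. Treat those exponents as approximations, and the 120 V rating below as an assumed nameplate:

```python
# Rule-of-thumb incandescent scaling (approximate exponents, not exact):
#   power ~ (V / V_rated) ** 1.6,  light ~ (V / V_rated) ** 3.4
V_RATED = 120.0  # volts, assumed nameplate rating

for v in (105.0, 120.0, 125.0):
    ratio = v / V_RATED
    power = ratio ** 1.6     # relative power draw
    light = ratio ** 3.4     # relative light output
    efficacy = light / power
    print(f"{v:.0f} V: power x{power:.2f}, light x{light:.2f}, "
          f"light-per-watt x{efficacy:.2f}")

# At 105 V the bulb draws less power, but its light-per-watt also drops:
# that's the "less efficient" effect.
```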

Holy smokes, there wasn't a single reply when I started this!
 

wayneh

Joined Sep 9, 2010
17,498
The higher voltage uses less current.
No, because it'll no longer be a "100W" bulb. Lower voltage across a given resistance means less current and less power. The resistance won't be exactly the same, because the bulb will be cooler, but that won't change the overall effect.
 

mcgyvr

Joined Oct 15, 2009
5,394
Ha, did they boot you from eng-tips? They are really strict/snobby over there about non-engineers posting questions.

Did you get a chance to read the responses before you were booted? It was all answered there.

In general, yes, your items will draw more current and could end up costing you more money. However, items like the stove or heater (if electric) could operate more efficiently and cycle less to maintain a given temperature, which could actually save you money.

No, there is no whole-house regulator.

Find other ways to try to save money.
 

WBahn

Joined Mar 31, 2012
30,059
For the most part it will be a wash, with the difference being too small to waste much effort on -- you'll end up spending dollars to save cents.

Whether a given item will consume more power, the same, or less is dependent upon the item.

For instance, the heater you use will probably consume more power while it is running, but you won't run it as much. The stove will likely be the same way: you will either learn to set the temperature slightly lower or, more likely, take things out and turn the oven off a bit sooner than you would at a lower voltage. With the surface burners you will probably keep doing what most people do now, adjusting them up or down depending on whether you seem to be getting the right amount of heat into the skillet, so you will find yourself using a slightly lower setting than you would have before.

Your lights will, in general, consume more energy (and put out a bit more light). You can compensate for that by using the next lower bulb wattage, but that might be too much of a step down and you'll end up staying with the higher wattage. Keep in mind that a 60 W bulb that is on 8 hours a day (and few are) at 10 cents/kWh costs you about $1.50 a month to operate. Going from 105 V to 125 V could increase that by 50 to 75 cents a month, but you might find that 40 W bulbs are close enough and shave a dime or so a month. Of course, these dimes and quarters do add up.
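
If you want to check that arithmetic, here it is as a few lines of Python; the V-squared scaling assumes a fixed-resistance bulb, which slightly overstates how much a real filament changes:

```python
# Sanity check of the bulb cost estimate above.
RATE = 0.10          # $/kWh, figure from the post
HOURS_PER_DAY = 8    # figure from the post
DAYS = 30

def monthly_cost(watts):
    """Cost of running a constant load for a month."""
    return watts * HOURS_PER_DAY * DAYS / 1000.0 * RATE

low = monthly_cost(60)                           # take 60 W as the draw at 105 V
high = monthly_cost(60 * (125.0 / 105.0) ** 2)   # fixed R: P scales with V^2
print(f"at 105 V: ${low:.2f}/month")             # about $1.44
print(f"at 125 V: ${high:.2f}/month")            # about $2.04
print(f"difference: ${high - low:.2f}/month")    # roughly 60 cents
```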

Other things, such as most electronics, have switching regulators that will only pull the power they need and you won't see much of any difference there -- they will simply pull less current to compensate for the higher voltage.
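
For a constant-power load like that, the input current simply scales as 1/V. A rough sketch, with the wattage and efficiency chosen purely as examples:

```python
# Constant-power load (e.g. a switch-mode supply): input current I = P / V.
P_OUT = 60.0       # watts delivered to the device -- example value
EFFICIENCY = 0.85  # assumed supply efficiency

for v in (105.0, 125.0):
    p_in = P_OUT / EFFICIENCY
    i_in = p_in / v
    print(f"{v:.0f} V in -> {i_in:.2f} A, {p_in:.0f} W from the wall")

# The wattage from the wall is essentially the same either way; only the
# current changes, so the electric bill doesn't care about the line voltage.
```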
 

williamj

Joined Sep 3, 2009
180
If I'm not mistaken (and I may well be, not an expert here), most electrical devices work within a given range of voltages. A standard home device is commonly referred to as a 110, 115, or 120 volt device, and it will operate in a voltage range of, say, 125 (high) to 110 (low). Personally I've seen such devices work as high as 130 V and as low as 104 V (can't comment on the efficiency at those voltages).

As I understand it, the electric company will supply household electricity at a slightly higher voltage (125V-130V or higher) at substations and rely on multiple users to drop the voltage to a more acceptable level, somewhere around 115V.
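
Sketching that idea with plain Ohm's law (the feeder resistance and load currents below are made-up illustrative numbers, not real utility figures):

```python
# Illustrative only: voltage sag along a feeder as the load current grows.
V_SOURCE = 126.0   # volts at the transformer/substation side -- example value
R_LINE = 0.08      # ohms of wiring between source and outlet -- example value

for load_amps in (10, 50, 100, 150):
    v_outlet = V_SOURCE - load_amps * R_LINE   # V_drop = I * R
    print(f"{load_amps:3d} A of load -> {v_outlet:.1f} V at the outlet")

# More simultaneous users (more current) means more drop along the wires,
# which is why the utility starts a bit high at the source end.
```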

just add'n my two cents and hopin' I'm right,
williamj
 

wayneh

Joined Sep 9, 2010
17,498
Your lights will, in general, consume more energy (and put out a bit more light).
Are you really prepared to claim that a light bulb gets brighter with lower voltage? Please reconsider.

If it were true, I could put two in series and get more than double the light output, since each would now have half voltage and, by your claim, burn brighter.
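
For anyone who wants to check that series-bulb claim, here's the fixed-resistance version (a simplification, since a cooler filament actually has somewhat lower resistance):

```python
# Two identical bulbs in series vs one bulb alone, fixed-resistance model.
V = 120.0
R = V ** 2 / 100.0                    # resistance of a "100 W at 120 V" bulb

p_single = V ** 2 / R                 # one bulb across the full voltage
p_series_each = (V / 2) ** 2 / R      # each series bulb sees half the voltage
p_series_total = 2 * p_series_each

print(f"one bulb:        {p_single:.0f} W")        # 100 W
print(f"each in series:  {p_series_each:.0f} W")   # 25 W
print(f"both in series:  {p_series_total:.0f} W")  # 50 W total

# Half the total power (and dimmer still in practice, since efficacy drops
# with temperature), not "more than double the light."
```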

The OP was right in his thinking that many things in his house would use a bit less power at a lower voltage. But it's a big waste of time to try to exploit that fact. For one thing you could just use lower wattage bulbs and cook at lower temps in the first place. Use the furnace fan less often. Done. To lower a home's voltage, at the least you'd need a big transformer to step down the voltage, and that alone would lose far more than any gain. And supplying a low voltage could ruin some of his electronics as well.
 

alim

Joined Dec 27, 2005
113
I think you have that twisted a little. The higher voltage uses less current.

For example, a 100 watt bulb on 125 volts: I = P/E = 100/125 = 0.8 amps.
A 100 watt bulb on 105 volts: I = P/E = 100/105 = 0.95 amps.
I will politely say this is incorrect. You have premised your proposal on the incorrect assumption that whichever voltage you use, you will get 100 watts. The wattage (brightness) you get depends on the voltage used. I will use 115 volts as the reference (the midpoint). At 115 volts you will draw 0.869 amps. Remember, the bulb is a fixed resistance (ignoring cold resistance); at the reference of 115 volts / 100 watts, it is 132.25 ohms. At 125 volts the bulb will draw 0.945 amps = 118 watts (brighter). At 105 volts it will draw 0.793 amps = 83 watts (less bright). What you said is true of systems wired for a higher voltage versus a lower voltage, say 240 volts vs. 120 volts, but those would have about a 4:1 ratio in resistance.
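
Here are the same figures as a short script, so anyone can rerun them (constant resistance taken from the 100 watt / 115 volt reference above):

```python
# Fixed-resistance model from the post: 100 W at a 115 V reference.
V_REF, P_REF = 115.0, 100.0
R = V_REF ** 2 / P_REF            # 132.25 ohms

for v in (105.0, 115.0, 125.0):
    i = v / R
    p = v * i
    print(f"{v:.0f} V: {i:.3f} A, {p:.0f} W")

# 105 V -> ~0.79 A, ~83 W; 115 V -> ~0.87 A, 100 W; 125 V -> ~0.95 A, ~118 W,
# matching the figures above.
```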
 

SgtWookie

Joined Jul 17, 2007
22,230
<snip> Remember the bulb is a fixed resistance (ignore cold resistance) </snip>
This is NOT correct.

After a brief settling period, the resistance of the bulb's filament is basically a log function of the voltage across it. When you increase the voltage, the resistance increases. It does not remain the same.
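
To put a rough number on it: under the commonly quoted approximation that filament power scales as about V^1.6, the hot resistance R = V^2/P works out to scale roughly as V^0.4. Treat the exponent as a rule of thumb, not an exact law:

```python
# Rule-of-thumb filament model: P ~ P_rated * (V / V_rated) ** 1.6 (approximate).
V_RATED, P_RATED = 120.0, 100.0   # assumed nameplate rating

for v in (105.0, 120.0, 125.0):
    p = P_RATED * (v / V_RATED) ** 1.6
    r = v ** 2 / p                      # hot resistance at this voltage
    print(f"{v:.0f} V: ~{p:.0f} W, filament ~{r:.0f} ohms")

# The resistance climbs with voltage (hotter filament), so neither the
# "fixed 100 W" model nor the "fixed resistance" model is exactly right.
```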
 

takao21203

Joined Apr 28, 2012
3,702
Real tests with a real light bulb and real meters show that a 60 watt bulb draws 0.45 amps at 105 VAC and 0.50 amps at 125 VAC.
So the difference is not significant.
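
Turning those two measurements into power and monthly cost (8 hours/day and 10 cents/kWh borrowed from earlier in the thread, just for comparison):

```python
# Implied power and running cost from the measured currents above.
RATE = 0.10          # $/kWh, figure used earlier in the thread
HOURS_PER_DAY = 8
DAYS = 30

for volts, amps in ((105.0, 0.45), (125.0, 0.50)):
    watts = volts * amps
    cost = watts * HOURS_PER_DAY * DAYS / 1000.0 * RATE
    print(f"{volts:.0f} V x {amps:.2f} A = {watts:.1f} W -> ${cost:.2f}/month")

# 105 V: ~47 W, about $1.13/month; 125 V: ~63 W, about $1.50/month.
# A measurable difference per bulb, but small money either way.
```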

Seriously, the OP should consider a DIMMER.

But the biggest waste will be electric heating; it's quite expensive.

1. Get dimmers + CFLs.
2. Get a microwave oven, possibly used or salvaged.
3. Limit electric shower time + don't use it for bathing.
4. Buy precooked foods + quick-cook rice/noodles.
5. Get mechanical timers for electric heaters + switch them off in rooms you don't use.
6. Wear appropriate clothing indoors to save heating costs.
7. If you don't need it for reading, don't fully illuminate rooms. Get a small 10 W LED light for the reading desk etc.

Unplugging the TV etc. from standby is not really worth it; don't get hysterical about a few watts.

Using this information you can easily save hundreds of dollars over the year.

There is also a website from the US government about energy hogs; you should be able to find it on the web.
 

#12

Joined Nov 30, 2010
18,224
I wish to take issue with takao about the biggest waste being electric heating. As a state licensed designer of heating systems, I must say that the biggest waste is poor insulation and leaky windows. That said, whatever heat leaks out of the building will be replaced by the electric heaters. If the voltage is higher and the heaters produce more heat per minute, the thermostat will just shut off sooner than it would when the power line voltage is low. The power line voltage has nothing to do with the wastefulness of electric heaters, assuming they have enough voltage to operate at all.
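
You can sketch the thermostat argument with a few numbers; the heat loss and element resistance below are only examples:

```python
# Energy to hold temperature = heat the building loses, whatever the heater's power.
HEAT_LOSS_W = 800.0   # average heat loss of the room -- example value
R_HEATER = 9.6        # ohms of heating element (about 1500 W at 120 V) -- example value

for volts in (105.0, 125.0):
    element_w = volts ** 2 / R_HEATER        # power while the element is on
    duty = HEAT_LOSS_W / element_w           # fraction of time the thermostat holds it on
    kwh_per_day = HEAT_LOSS_W * 24 / 1000.0  # energy used, independent of voltage
    print(f"{volts:.0f} V: {element_w:.0f} W element, on {duty:.0%} of the time, "
          f"{kwh_per_day:.1f} kWh/day")

# The element is stronger at 125 V but runs a smaller fraction of the time;
# the kWh on the bill are set by the heat loss, not by the line voltage.
```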
 

takao21203

Joined Apr 28, 2012
3,702
I wish to take issue with takao about the biggest waste being electric heating. As a state licensed designer of heating systems, I must say that the biggest waste is poor insulation and leaky windows. That said, whatever heat leaks out of the building will be replaced by the electric heaters. If the voltage is higher and the heaters produce more heat per minute, the thermostat will just shut off sooner than it would when the power line voltage is low. The power line voltage has nothing to do with the wastefulness of electric heaters, assuming they have enough voltage to operate at all.
If they are well designed, they are not wasteful.

However, there can be heaters dimensioned for 20 kW that heat up during the night and are then supposed to give off their heat during the day. That does not always work out well. With mechanical controls, the response to outdoor temperature is also poor.

Result: you end up with large electricity bills and a very hot room for a few hours; then it's too cold, so you switch the heater on during the day as well.

Modern heaters should have electronic controls.

An old thermostat will typically have a rather sluggish response, so a voltage variation of ±10% is irrelevant.

It also depends on whether you have cheap nuclear power available or whether electricity is generated from expensive oil. If you only have oil available locally, it can be more efficient to burn it directly.
 