Fundamental Meter Question

Thread Starter

ironmike828

Joined Jan 29, 2010
13
I have a fundamental question about how kWh meters work. If the utility voltage runs, say, 10% above or below its nominal value (240 volts, for instance), how does that affect the accuracy of the kWh count? Do some styles of meter handle this better than others?
 

mcgyvr

Joined Oct 15, 2009
5,394
They typically measure the voltage and the current, then multiply to get watts (and divide by 1000 for kW). There's no problem to account for when the meter measures the actual voltage and current rather than assuming a nominal value.
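For anyone who wants to see the idea in code, here's a minimal Python sketch of an electronic meter accumulating energy by integrating instantaneous v(t)·i(t) samples. The sample rate and waveform numbers are made up for illustration; real meters sample through ADCs and do this in firmware or dedicated metering silicon.

```python
import math

SAMPLE_RATE = 2000           # samples per second (assumed for illustration)
DT = 1.0 / SAMPLE_RATE

def accumulate_energy_kwh(voltage_samples, current_samples):
    """Integrate instantaneous power v(t) * i(t) over time.

    Because the meter multiplies measured instantaneous values, a
    supply running 10% high simply yields proportionally larger
    products; there is no nominal-voltage assumption to correct for.
    """
    joules = sum(v * i * DT for v, i in zip(voltage_samples, current_samples))
    return joules / 3.6e6    # 1 kWh = 3.6e6 joules

# One second of a 240 V RMS, 10 A RMS in-phase (resistive) load at 60 Hz:
times = [n * DT for n in range(SAMPLE_RATE)]
volts = [240 * math.sqrt(2) * math.sin(2 * math.pi * 60 * t) for t in times]
amps  = [10  * math.sqrt(2) * math.sin(2 * math.pi * 60 * t) for t in times]

print(accumulate_energy_kwh(volts, amps))  # ~2400 J / 3.6e6 ≈ 0.000667 kWh
```

Older electromechanical (induction disc) meters do the equivalent physically: the disc torque tracks the product of voltage and current, so they too register actual rather than nominal consumption.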
 

t_n_k

Joined Mar 6, 2009
5,455
most should have a power factor correction too.
The term "power factor correction" normally has another meaning ....

Any true indicating kWh meter would inherently account for the consumer's total load power factor.

If the supply authority / power company increases the supply voltage by 10%, then this will tend to increase your power consumption and therefore your energy costs. That's obviously something that benefits the electricity supplier's financial situation rather than yours.
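To make the power factor point concrete, here's a small Python check (illustrative numbers only): shift the current waveform 60° behind the voltage and the same multiply-and-integrate arithmetic registers only the real power V·I·cos φ, with no separate correction step needed.

```python
import math

V_RMS, I_RMS, FREQ = 240.0, 10.0, 60.0
PHI = math.radians(60)        # current lags voltage by 60 degrees (assumed)
N = 6000                      # samples across 0.05 s = 3 full cycles
DT = 0.05 / N

energy_j = 0.0
for n in range(N):
    t = n * DT
    v = V_RMS * math.sqrt(2) * math.sin(2 * math.pi * FREQ * t)
    i = I_RMS * math.sqrt(2) * math.sin(2 * math.pi * FREQ * t - PHI)
    energy_j += v * i * DT    # same arithmetic the meter performs

print(energy_j / 0.05)                  # ~1200 W average real power
print(V_RMS * I_RMS * math.cos(PHI))    # 1200.0, i.e. V * I * cos(phi)
```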
 

BillB3857

Joined Feb 28, 2009
2,570
The term "power factor correction" normally has another meaning ....

Any true indicating kWh meter would inherently account for the consumer's total load power factor.

If the supply authority / power company increases the supply voltage by 10%, then this will tend to increase your power consumption and therefore your energy costs. That's obviously something that benefits the electricity supplier's financial situation rather than yours.
So that's why, in the US, we went from 110 to 115, to 117, to 120 volts to our homes over several years! I thought it was to reduce the I²R loss percentage on the transmission lines. Silly me!
 

beenthere

Joined Apr 20, 2004
15,819
That may not be correct. If power is the product of voltage and amperage, then raising the line voltage by 10% should cut the current by roughly the same proportion. Those I²R losses are a dead loss to the utility, as they happen on the wrong side of the meter.
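A back-of-envelope check of that argument with assumed numbers (the feeder resistance is made up, and this only holds for loads that draw constant power regardless of voltage):

```python
# For a constant-power load, raising the supply voltage 10% cuts the
# current it draws, and the feeder loss falls with the square of current.

P_LOAD = 2400.0        # W, constant-power load (assumed)
R_LINE = 0.5           # ohms of feeder resistance (assumed)

for v in (240.0, 264.0):               # nominal vs +10%
    i = P_LOAD / v                     # current the load draws
    loss = i ** 2 * R_LINE             # I²R loss in the feeder
    print(f"{v:5.0f} V: I = {i:5.2f} A, line loss = {loss:4.1f} W")

# 240 V: I = 10.00 A, line loss = 50.0 W
# 264 V: I =  9.09 A, line loss = 41.3 W   (~17% less loss)
```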

If you're dealing with Ameren UE, they just crank up the rates whenever it suits them.
 

t_n_k

Joined Mar 6, 2009
5,455
That may not be correct. If power is the product of voltage and amperage, then raising the line voltage by 10% should cut the current by roughly the same proportion. Those I²R losses are a dead loss to the utility, as they happen on the wrong side of the meter.
I can appreciate your point that increasing the voltage may in some cases reduce the effective supply-side current - it depends on the nature of the load.

For resistive loads like incandescent lamps and simple space heaters, I would expect the power consumption to rise: for a fixed resistance, power goes as V²/R, so a 10% voltage increase gives roughly 21% more power. The rise wouldn't be quite that large in practice, since the resistance of a filament or heating element climbs with temperature.
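Rough numbers for the fixed-resistance case (illustrative only; the resistance value is assumed):

```python
# P = V^2 / R for an ideal fixed resistor, so +10% voltage -> +21% power.
# A real filament's resistance rises as it heats, damping the increase.

R_FIXED = 24.0               # ohms, ideal fixed resistor (assumed)

for v in (240.0, 264.0):     # nominal vs +10%
    p = v ** 2 / R_FIXED
    print(f"{v:5.0f} V: {p:6.0f} W")

# 240 V: 2400 W
# 264 V: 2904 W   -> 2904 / 2400 = 1.21, a 21% rise for +10% voltage
```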

Yes - power supply companies charge whatever they think the market will tolerate. In my state we have so-called public-private partnerships for electrical supply, in which the government still has some say in setting electricity tariffs. They are contemplating a 17% increase, which has voters somewhat concerned.

There is a statewide program of installing "intelligent" domestic electricity meters, which will enable some quite clever variable-tariff implementations. People on low or fixed incomes, such as pensioners, aren't too pleased about this. The government has backed off due to public resentment - for the moment.
 

beenthere

Joined Apr 20, 2004
15,819
There is a statewide program of installing "intelligent" domestic electricity meters, which will enable some quite clever variable-tariff implementations.
I don't know where you are located, but California has had some really awful problems with intelligent meters - or rather PG&E customers have, with bills increasing by hundreds of dollars in a small percentage of cases. Even a small percentage of PG&E's customer base amounts to many thousands of outraged customers.
 