What is the relation between frequency and voltage?

Thread Starter

Ali Alkhudri

Joined Nov 3, 2015
19
Hi
How can it be that the frequency is 50 Hz and the voltage is 230 V at the wall outlet?
Where is that 50 Hz created?
What is the advantage of 50 Hz?
What is the disadvantage of 50 Hz?
There is a formula that says V = (f·h)/e ... So does that mean that if the frequency increases, the voltage increases too?
thanks
 

crutschow

Joined Mar 14, 2008
34,283
The frequency is determined by the AC synchronous generator that provides the power.
There is no "advantage" to 50 Hz. It was a value selected from trade-offs between the size of the transformer and generator magnetics and line losses. 50 Hz is used in Europe and Asia; 60 Hz is used in the US and other parts of the Americas.
It has nothing to do with the voltage.
 

ISB123

Joined May 21, 2014
1,236
The generator creates the frequency, but it depends on the generator's RPM (and its number of poles).
230 V is used because it's more efficient to send power over long distances: higher voltage means less current for the same power, so lower I²R losses in the wires.
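To put numbers on both points, here's a minimal Python sketch; the pole counts, load power, and line resistance are made-up illustrative values, not anything from this thread:

```python
# Synchronous generator: electrical frequency is set by rotor speed and pole count:
# f = poles * RPM / 120
def mains_frequency(poles: int, rpm: float) -> float:
    return poles * rpm / 120.0

print(mains_frequency(2, 3000))   # 50.0 -> a 2-pole machine at 3000 RPM gives 50 Hz
print(mains_frequency(2, 3600))   # 60.0 -> the same machine at 3600 RPM gives 60 Hz

# Transmission: for the same power, higher voltage means lower current,
# and resistive loss in the line is I^2 * R.
def line_loss_watts(power_w: float, volts: float, line_ohms: float) -> float:
    current_a = power_w / volts        # I = P / V
    return current_a ** 2 * line_ohms  # P_loss = I^2 * R

print(line_loss_watts(2300.0, 230.0, 1.0))  # 100.0 W lost at 230 V
print(line_loss_watts(2300.0, 120.0, 1.0))  # ~367 W lost at 120 V over the same wire
```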
 

Thread Starter

Ali Alkhudri

Joined Nov 3, 2015
19
The frequency is determined by the AC synchronous generator that provides the power.
There is no "advantage" to 50 Hz. It was a value selected from trade-offs between the size of the transformer and generator magnetics and line losses. 50 Hz is used in Europe and Asia; 60 Hz is used in the US and other parts of the Americas.
It has nothing to do with the voltage.
Thank you. Why does the US have 60 Hz and 120 V but Europe 50 Hz and 230 V? Why is there such a big difference in voltage between them?
How was that difference determined?
There are some formulas on Google about the relationship between frequency and voltage, for example v = Vp·sin(2πft).
 

Papabravo

Joined Feb 24, 2006
21,159
One of the earliest disputes in power generation and distribution was electrical safety. In the late 1800s, New York City had DC electrical power of approximately 125 volts. It was used for elevator motors, among other things. Con Edison kept the DC electric service in operation until comparatively recently. The switch to AC power was resisted at the time for a variety of reasons having to do with the apparent dangers of AC power and the competition between Edison and Westinghouse.

Europe was able to avoid all that and was able to make other choices.

http://www.coned.com/newsroom/news/pr20071115.asp
http://www.smithsonianmag.com/history/edison-vs-westinghouse-a-shocking-rivalry-102146036/?no-ist
https://en.wikipedia.org/wiki/War_of_Currents
http://webphysics.iupui.edu/jittworkshop/251Sp98GFMar23.html
 

crutschow

Joined Mar 14, 2008
34,283
thanks .. Does the AC synchronous generator make 230 V or 120 V?
The actual generator likely has an output of several thousand volts. This is increased to over 100,000 volts for transmission to the cities and then reduced by transformers in stages until it gets to 120 V or 240 V in residences.
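As a rough sketch of that stepping chain in Python (the voltages and turns ratios here are invented round numbers for illustration; real grids vary):

```python
# Ideal transformer: Vs / Vp = Ns / Np (losses ignored).
def secondary_voltage(v_primary: float, turns_ratio: float) -> float:
    """turns_ratio is Ns / Np."""
    return v_primary * turns_ratio

# Illustrative chain: ~20 kV at the generator, stepped up for transmission,
# then stepped down in stages toward a 240 V residential service.
v = 20_000.0
for turns_ratio, stage in [(11.5,     "step up for transmission"),
                           (1 / 16.0, "step down at a substation"),
                           (1 / 60.0, "step down at a local transformer")]:
    v = secondary_voltage(v, turns_ratio)
    print(f"{stage}: {v:,.0f} V")
# step up for transmission: 230,000 V
# step down at a substation: 14,375 V
# step down at a local transformer: 240 V
```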
Read this for more info.
 

alfacliff

Joined Dec 13, 2013
2,458
The 60 Hz and 50 Hz standards were adopted partly because it was easy to make gear ratios for synchronous clocks at those frequencies. Lower frequencies cause visible flicker in lights, and higher ones were harder to design for at the time. Most aircraft use 400 Hz so motors and transformers can be lighter.
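One way to see the 400 Hz point: the transformer EMF equation is V_rms = 4.44·f·N·A·B_max, so for the same voltage, turns, and flux density, the required core cross-section shrinks in proportion to frequency. A quick Python check (the 115 V, 100 turns, and 1.2 T are generic textbook-style numbers, not values from this thread):

```python
# Transformer EMF equation: Vrms = 4.44 * f * N * A * Bmax.
# Solving for A gives the core cross-section needed at a given frequency.
def core_area_m2(v_rms: float, freq_hz: float, turns: int, b_max_tesla: float) -> float:
    return v_rms / (4.44 * freq_hz * turns * b_max_tesla)

for f in (50.0, 400.0):
    area = core_area_m2(v_rms=115.0, freq_hz=f, turns=100, b_max_tesla=1.2)
    print(f"{f:>5.0f} Hz -> core area ~ {area * 1e4:.1f} cm^2")
# 50 Hz needs ~43.2 cm^2; 400 Hz needs ~5.4 cm^2, i.e. 1/8 the iron.
```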
 

ian field

Joined Oct 27, 2012
6,536
The generator creates the frequency, but it depends on the generator's RPM (and its number of poles).
230 V is used because it's more efficient to send power over long distances: higher voltage means less current for the same power, so lower I²R losses in the wires.
Electricity generation started off with completely independent private companies generating whatever they thought would work best.

At first, only cities had electricity, then towns caught up - in the early days the UK had nearly as many voltage standards as electrified cities - some were AC, others DC.

In those days there were growing numbers of small electrical businesses that could wind a mains transformer for most things at the local supply voltage - in the DC areas everyone had to use strings of bulbs as dropper resistors.

It gradually evolved into some semblance of standardisation, AFAIK: the change to all AC came first, then gradually 250V emerged as the standard voltage. Years ago that was changed to 240V, and more recently they fiddled the % tolerance so it appears to comply with the EU standard.

Somewhere around WW2, the UK government realised that electricity was a strategic material and set about building the National Grid, which the Tories sold to the French a few parliaments ago!

AFAIK the decision whether to use AC or DC in the US was largely based on the outcome of experiments to find which was most effective for Old Sparky. I don't know how some countries decided on 110 V; it's not just electricity distribution that's affected, as house wiring has to be a larger diameter because any appliance of a given power draws double the current.

Back in the days of AC/DC TVs and radios, the heaters were supplied in series chains across the mains. The available voltage being so low, the rectifier heater was designed to drop most of it. In the UK, valves for AC/DC sets had 300 mA heaters. In the US, TVs with lots of valves hit a heater power limitation imposed by the voltage headroom, so they had to produce a family of valves with 600 mA heaters.
 

wayneh

Joined Sep 9, 2010
17,496
There are some formulas on Google about the relationship between frequency and voltage, for example v = Vp·sin(2πft).
That is not relevant here. The voltage and the frequency of your power mains are independent of each other, at least for practical purposes. They are both arbitrary choices, and both are controlled separately. OK, that's a simplification, but it's mostly true for us users.
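A tiny Python sketch of that independence: in v(t) = Vp·sin(2πft), the amplitude Vp and the frequency f are separate parameters, so changing f leaves the peak voltage alone:

```python
import math

def v(t: float, v_peak: float, freq_hz: float) -> float:
    """Instantaneous mains voltage: v(t) = Vp * sin(2*pi*f*t)."""
    return v_peak * math.sin(2 * math.pi * freq_hz * t)

v_peak = 230 * math.sqrt(2)  # 230 V RMS is about 325 V peak

# Sample one second of the waveform at 50 Hz and at 60 Hz:
for f in (50.0, 60.0):
    peak = max(v(n / 10_000, v_peak, f) for n in range(10_000))
    print(f"{f:.0f} Hz: peak ~ {peak:.0f} V")  # same ~325 V either way
```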
 

PeterCoxSmith

Joined Feb 23, 2015
148
Thank you. Why does the US have 60 Hz and 120 V but Europe 50 Hz and 230 V? Why is there such a big difference in voltage between them?
How was that difference determined?
There are some formulas on Google about the relationship between frequency and voltage, for example v = Vp·sin(2πft).
v is not proportional to f; it varies with time. However, dv/dt is proportional to f, so when you do calculations for capacitors, f comes into play via i(t) = C·(dv/dt).
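For example, with v(t) = Vp·sin(2πft) you get dv/dt = 2πf·Vp·cos(2πft), so the peak capacitor current Ip = 2πf·C·Vp scales directly with f. A quick numeric check in Python (the 1 µF capacitor is an arbitrary example value):

```python
import math

def peak_cap_current_a(c_farads: float, v_peak: float, freq_hz: float) -> float:
    """Peak of i(t) = C * dv/dt for v(t) = Vp * sin(2*pi*f*t), i.e. 2*pi*f*C*Vp."""
    return 2 * math.pi * freq_hz * c_farads * v_peak

c = 1e-6                 # 1 uF, an arbitrary example capacitor
vp = 230 * math.sqrt(2)  # ~325 V peak for 230 V RMS mains
print(peak_cap_current_a(c, vp, 50.0))   # ~0.102 A at 50 Hz
print(peak_cap_current_a(c, vp, 100.0))  # ~0.204 A: doubling f doubles the current
```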
 

Thread Starter

Ali Alkhudri

Joined Nov 3, 2015
19
The 60 Hz and 50 Hz standards were adopted partly because it was easy to make gear ratios for synchronous clocks at those frequencies. Lower frequencies cause visible flicker in lights, and higher ones were harder to design for at the time. Most aircraft use 400 Hz so motors and transformers can be lighter.
"Most aircraft use 400 Hz so motors and transformers can be lighter" ... you mean motors are less heavy at 400 Hz?
thanks
 