# What is the relation between frequency and voltage

Discussion in 'General Electronics Chat' started by Ali Alkhudri, Nov 5, 2015.

1. ### Ali Alkhudri Thread Starter New Member

Nov 3, 2015
19
1
Hi
How can it be that the frequency is 50 Hz and voltage is 230 V in the wall
where creates that 50 Hz ?
What is the advantage of 50 Hz ?
What is the disadvantage of 50 Hz ?
There is a formula says : V=(f*h)/e ... So does that mean if frequency increaces so voltage increases too ?
thanks

2. ### crutschow Expert

Mar 14, 2008
12,984
3,223
The frequency is determined by the AC synchronous generator that provides the power.
There is no inherent "advantage" to 50 Hz. It was a value selected from trade-offs between the size of the transformer and generator magnetics and line losses. 50 Hz is used in Europe and Asia; 60 Hz is used in the US and other parts of the Americas.
It has nothing to do with the voltage. The formula you quote, V = (f*h)/e, is the photon-energy relation from quantum physics (eV = hf) and doesn't apply to mains power.

3. ### ISB123 Well-Known Member

May 21, 2014
1,239
527
The generator creates the frequency, which depends on its RPM (and its number of poles).
230 V is used because it's more efficient to send power over long distances at higher voltage.
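The RPM point above comes from a standard relation for synchronous machines: each pole pair produces one AC cycle per shaft revolution, so f = poles × RPM / 120. A minimal sketch (the machine sizes are just illustrative examples):

```python
def sync_frequency_hz(rpm, poles):
    """Electrical output frequency of a synchronous generator.

    Each pole pair produces one full AC cycle per revolution,
    so f = (poles / 2) * (rpm / 60) = poles * rpm / 120.
    """
    return poles * rpm / 120

# A 2-pole generator must spin at 3000 RPM to make 50 Hz ...
print(sync_frequency_hz(3000, 2))   # 50.0
# ... while a 4-pole machine needs only 1500 RPM,
print(sync_frequency_hz(1500, 4))   # 50.0
# and 3600 RPM with 2 poles gives the US's 60 Hz.
print(sync_frequency_hz(3600, 2))   # 60.0
```

This is why grid operators hold generator shaft speed so tightly: the frequency is locked to it.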

4. ### Ali Alkhudri Thread Starter New Member

thank you .. Why does the US have 60 Hz and 120 V but Europe 50 Hz and 230 V? Why is there such a big difference in voltage between them?
How was that difference determined?
There are some formulas on Google about the relationship between frequency and voltage, for example v = Vp*sin(2*pi*f*t)
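In that formula, f only sets how fast the wave oscillates; the amplitude Vp is independent of it. A small sketch, assuming the quoted 230 V is the RMS value (so the peak is √2 times larger):

```python
import math

def v_inst(v_rms, f_hz, t_s):
    """Instantaneous mains voltage v(t) = Vp * sin(2*pi*f*t),
    where the peak Vp is sqrt(2) times the quoted RMS value."""
    v_peak = v_rms * math.sqrt(2)
    return v_peak * math.sin(2 * math.pi * f_hz * t_s)

# 230 V RMS corresponds to a peak of about 325 V:
print(round(230 * math.sqrt(2), 1))        # 325.3
# A quarter cycle into a 50 Hz wave (t = 5 ms), the voltage is at that peak:
print(round(v_inst(230, 50, 0.005), 1))    # 325.3
```

Changing f to 60 Hz would only make the wave reach that same 325 V peak sooner; it would not raise the voltage.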

5. ### Ali Alkhudri Thread Starter New Member

thanks .. Does the AC synchronous generator make 230 V or 120 V?

6. ### Papabravo Expert

Feb 24, 2006
10,135
1,786
One of the earliest disputes in power generation and distribution was electrical safety. In the late 1800s, New York City had DC electrical power of approximately 125 volts. It was used for elevator motors, among other things. Con Edison kept the DC electric service in operation until comparatively recently. The switch to AC power was resisted at the time for a variety of reasons having to do with the apparent dangers of AC power and the competition between Edison and Westinghouse.

Europe was able to avoid all that and was able to make other choices.

http://www.coned.com/newsroom/news/pr20071115.asp
http://www.smithsonianmag.com/history/edison-vs-westinghouse-a-shocking-rivalry-102146036/?no-ist
https://en.wikipedia.org/wiki/War_of_Currents
http://webphysics.iupui.edu/jittworkshop/251Sp98GFMar23.html

7. ### crutschow Expert

The actual generator likely has an output of several thousand volts. This is increased to over 100,000 volts for transmission to the cities and then reduced by transformers in stages until it gets to 120 V or 240 V in residences.
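The reason for stepping the voltage up is resistive line loss: for a fixed delivered power P, the line current is I = P/V, so the I²R loss in the wires falls with the square of the transmission voltage. A sketch with made-up line numbers:

```python
def line_loss_watts(power_w, volts, line_resistance_ohm):
    """I^2 * R loss in a transmission line carrying a fixed power.

    For the same delivered power, current I = P / V, so the
    resistive loss I^2 * R falls with the square of the voltage.
    """
    current = power_w / volts
    return current ** 2 * line_resistance_ohm

# 1 MW sent over a line with 5 ohms of resistance:
print(line_loss_watts(1e6, 10_000, 5))    # 50000.0 W lost at 10 kV
print(line_loss_watts(1e6, 100_000, 5))   # 500.0 W lost at 100 kV
```

Raising the voltage tenfold cuts the loss by a factor of 100, which is why the grid transmits at hundreds of kilovolts and only steps down near the consumer.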

8. ### Ali Alkhudri Thread Starter New Member

thank you sir

9. ### alfacliff Well-Known Member

Dec 13, 2013
2,449
428
The 60 Hz and 50 Hz standards were adopted partly because it was easier to make gear ratios for clocks with those frequencies. Lower frequencies cause visible flicker in lights, and higher ones were harder to design for at the time. Most aircraft use 400 Hz so motors and transformers are lighter.
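The weight saving at 400 Hz follows from the transformer EMF equation, V = 4.44 · f · N · B · A: for the same voltage, turns, and flux density, the required core cross-section A shrinks in proportion to frequency. A sketch with illustrative (not real-design) numbers:

```python
def core_area_m2(v_rms, f_hz, turns, b_peak_tesla):
    """Required core cross-section from the transformer EMF equation
    V = 4.44 * f * N * B * A, solved for A."""
    return v_rms / (4.44 * f_hz * turns * b_peak_tesla)

a_50 = core_area_m2(230, 50, 500, 1.2)
a_400 = core_area_m2(230, 400, 500, 1.2)
# The 50 Hz core needs 8x the cross-section of the 400 Hz one:
print(round(a_50 / a_400, 1))   # 8.0
```

Less core area means less iron and less copper around it, which is exactly the weight argument for 400 Hz in aircraft.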

10. ### ian field Distinguished Member

Oct 27, 2012
4,413
782
Electricity generation started off with completely independent private companies generating whatever they thought would work best.

At first, only cities had electricity, then towns caught up - in the early days the UK had nearly as many voltage standards as electrified cities - some were AC, others DC.

In those days there were growing numbers of small electrical businesses that could wind a mains transformer for most things at the local supply voltage - in the DC areas everyone had to use strings of bulbs as dropper resistors.

It gradually evolved into some semblance of standardisation, AFAIK: the change to all AC came first, then gradually 250V emerged as the standard voltage. Years ago that was changed to 240V, and more recently the % tolerance was adjusted so it appears to comply with the EU standard.

Somewhere around WW2, the UK government realised that electricity was a strategic material and set about building the National Grid - which the Tories sold to the French a few parliaments ago!

AFAIK: the decision whether to use AC or DC in the US was largely based on the outcome of experiments to find which was most effective for Old Sparky. I don't know how some countries decided on 110V; it's not just electricity distribution - house wiring has to be a larger diameter because an appliance of a given power draws double the current.

Back in the days of AC/DC TVs and radios, the heaters were supplied in series chains across the mains. The available voltage being so low, the rectifier heater was designed to drop most of it. In the UK, valves for AC/DC sets had 300mA heaters; in the US, TVs with lots of valves hit a heater power limitation imposed by the voltage headroom, so a family of valves with 600mA heaters had to be produced.

11. ### wayneh Expert

Sep 9, 2010
12,090
3,027
That is not relevant here. The voltage and the frequency of your power mains are independent of each other, at least for practical purposes. They are both arbitrary choices, and both are controlled separately. OK, that's a simplification, but mostly true for us users.

12. ### PeterCoxSmith Member

Feb 23, 2015
148
38
v is not proportional to f but varies with time; however, dv/dt is proportional to f, so when you do calculations for capacitors f comes into play, as i(t) = C*(dv/dt)
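To make that concrete: for v(t) = Vp·sin(2πft), the derivative is dv/dt = Vp·2πf·cos(2πft), so the peak capacitor current C·Vp·2πf scales linearly with f even though the voltage amplitude does not. A small sketch:

```python
import math

def cap_current_peak(c_farads, v_peak, f_hz):
    """Peak current through a capacitor driven by v(t) = Vp*sin(2*pi*f*t).

    i(t) = C * dv/dt = C * Vp * 2*pi*f * cos(2*pi*f*t),
    so the peak current C * Vp * 2*pi*f grows linearly with f.
    """
    return c_farads * v_peak * 2 * math.pi * f_hz

i_50 = cap_current_peak(1e-6, 325.0, 50)   # 1 uF at ~325 V peak, 50 Hz
i_60 = cap_current_peak(1e-6, 325.0, 60)   # same capacitor at 60 Hz
# 60 Hz drives 20% more current through the same capacitor:
print(round(i_60 / i_50, 2))   # 1.2
```

This is why capacitive dropper circuits designed for one mains frequency pass a different current on the other.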

13. ### Ali Alkhudri Thread Starter New Member

"most aircraft use 400 hz so motors and transformers are lighter ";; you mean motors are less heavy with 400 hz ?
thanks

14. ### ian field Distinguished Member

Get into the kHz range and you can use much smaller ferrite cores.