A question regarding a.c.

Thread Starter

yousuf

Joined Mar 21, 2009
11
Hi everyone,
I have a question: what will happen if the frequency of the mains voltage (220 V AC) accidentally increases from 50 Hz to 60 Hz (or more)? What will be the effect on the output voltage (and frequency) of a step-down transformer connected to the mains supply of a power supply circuit?

Please answer my question.
 

crutschow

Joined Mar 14, 2008
34,201
The output voltage of a power transformer will not change significantly with frequency. The output frequency obviously will, since the input and output frequencies are always the same.
 

#12

Joined Nov 30, 2010
18,224
Mains frequency is so stable that (in most countries) it can be used as a time reference for clocks over periods of months to years.
 

ErnieM

Joined Apr 24, 2011
8,377
If a British power supply were accidentally plugged into a US outlet, it would see just a shift in frequency.

Practically, very little change would be noticed. The disaster scenario for a transformer is when it "saturates," meaning the magnetic field of the core has reached its maximum. Running it at a higher frequency, thus reversing the field sooner, is actually somewhat better for avoiding this situation.
 

DerStrom8

Joined Feb 20, 2011
2,390
Many power supplies say on the case that they are rated for 50/60 Hz. A shift of 10 Hz probably won't harm the device, especially if it's rated for both frequencies. If not, it isn't the worst thing you could do to the supply, but it isn't the best either. If it's just a basic transformer power supply, the output frequency will match the input frequency, and the voltage shouldn't change much.
 

P-MONKE

Joined Mar 14, 2012
83
Many power supplies say on the case that they are rated for 50/60 Hz. A shift of 10 Hz probably won't harm the device, especially if it's rated for both frequencies. If not, it isn't the worst thing you could do to the supply, but it isn't the best either. If it's just a basic transformer power supply, the output frequency will match the input frequency, and the voltage shouldn't change much.
Just as an aside to this, I am currently restoring a 1974 Midway 'Chopper' arcade game. This was manufactured in America and I am in the UK.
A step-down transformer allows the electronics to work fine and dandy.* However, all of the sound effects (bar one) are created using an eight-track tape player which, you've guessed it, runs on an AC motor directly off the PSU. The 17% speed reduction is most definitely noticeable on the vocal samples! :eek:

I think maybe a custom pulley and new drive belt may be on my 'to do' list.


*Once I've fixed all of the faults!
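The speed change P-MONKE describes follows directly from an AC synchronous (or induction) motor's speed being proportional to the mains frequency. A quick sanity check in Python (purely illustrative, no component values assumed):

```python
import math

# A 60 Hz tape transport run from 50 Hz mains turns at 50/60 of its design speed.
ratio = 50 / 60
print(f"playback speed: {ratio:.1%} of original")   # 83.3%
print(f"slowdown: {1 - ratio:.1%}")                 # 16.7%

# The corresponding musical pitch drop, in equal-tempered semitones:
semitones = 12 * math.log2(ratio)
print(f"pitch shift: {semitones:.2f} semitones")    # about -3.16
```

A drop of roughly three semitones is easily audible on vocal material, which matches the observation above. It also suggests the custom-pulley fix: a motor pulley about 60/50 = 1.2 times the original diameter would restore the design tape speed.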
 

#12

Joined Nov 30, 2010
18,224
One last shot...when I was designing linear power supplies
(They're called antiques now.)
The only difference between a 60 Hz model and a 50 Hz model was that the mains transformer had 6/5 times as much iron in the core for 50 Hz countries.
The slower power wave required more iron to keep the core from saturating magnetically.
That's all.
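That 6/5 iron figure drops straight out of the transformer EMF equation, V_rms ≈ 4.44 · f · N · A · B_peak: for a fixed voltage, turn count, and allowable peak flux density, the required core cross-section A scales as 1/f. A short sketch (the winding numbers here are invented for illustration):

```python
def required_core_area(v_rms, freq_hz, turns, b_max):
    # Transformer EMF equation: V_rms = 4.44 * f * N * A * B_peak.
    # Solve for the core cross-section A that keeps B_peak <= b_max.
    return v_rms / (4.44 * freq_hz * turns * b_max)

# Hypothetical 230 V primary with 500 turns and a 1.2 T flux limit:
a60 = required_core_area(230, 60, 500, 1.2)
a50 = required_core_area(230, 50, 500, 1.2)
print(f"extra core area needed at 50 Hz: {a50 / a60 - 1:.0%}")  # 20%
```

The 20% ratio is independent of the assumed voltage, turns, and flux limit; it is simply 60/50, which is why the 50 Hz build needed 6/5 the iron.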
 

P-MONKE

Joined Mar 14, 2012
83
One last shot...when I was designing linear power supplies
(They're called antiques now.)
The only difference between a 60 Hz model and a 50 Hz model was that the mains transformer had 6/5 times as much iron in the core for 50 Hz countries.
The slower power wave required more iron to keep the core from saturating magnetically.
That's all.
Would this possibly cause long-term reliability issues with a 60Hz transformer being run at 50Hz? Or are they over engineered enough for it not to really matter?
Sorry if I'm thread-jacking a little, but it seems relevant here.
 

#12

Joined Nov 30, 2010
18,224
Yes. A mains transformer designed for 60 Hz can have saturation problems at 50 Hz.
It usually WILL have problems, because the almighty dollar dictates that engineers design the absolute minimum quality into their parts. That means the 20% extra iron is not likely to be inside your 60 Hz transformer.
 

Thread Starter

yousuf

Joined Mar 21, 2009
11
Actually, the purpose of my question is to find out how frequency will affect the current and voltage. I know that a step-down transformer has some mechanism for maintaining voltage (according to the rated value of the transformer), but how will a varying frequency affect those two parameters (current and voltage)?
 

Wendy

Joined Mar 24, 2008
23,408
Ideally, not at all.

Transformers are tweaked for best efficiency at a specific frequency, so running one at a different frequency could affect it a bit.

Voltage and frequency are separate parameters. They can interact, but only on a case by case basis.

Voltage, current, and resistance are all part of Ohm's Law. You don't see frequency in there anywhere.
 

DerStrom8

Joined Feb 20, 2011
2,390
Voltage, current, and resistance are all part of Ohm's Law. You don't see frequency in there anywhere.
Unless, of course, you bring impedance into the circuit, but even then, it's only the voltages and currents through the load that change. Is that what you were asking, yousuf?
 

Thread Starter

yousuf

Joined Mar 21, 2009
11
Unless, of course, you bring in impedance to the circuit, but even then, it's only the voltages and currents through the load that change. Is that what you were asking, yousuf?
No, actually I'm interested in knowing the effect of frequency on current and voltage. And after Bill's post I have some clarification now.
 

DerStrom8

Joined Feb 20, 2011
2,390
No, actually I'm interested in knowing the effect of frequency on current and voltage. And after Bill's post I have some clarification now.
Ok. I just mentioned that because, for example, if you connect an inductor and a resistor to the output of the transformer, the frequency will change the voltage across and current through the inductor. Glad you're getting some clarification, though :)
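To make that concrete: for a series R-L load, the current does depend on frequency through the inductive reactance X_L = 2πfL. A small illustrative sketch (the source voltage and component values are invented):

```python
import math

def rl_load_current(v_rms, freq_hz, r_ohms, l_henries):
    # Series R-L impedance magnitude: |Z| = sqrt(R^2 + (2*pi*f*L)^2),
    # so the RMS current is I = V / |Z|.
    x_l = 2 * math.pi * freq_hz * l_henries
    return v_rms / math.sqrt(r_ohms**2 + x_l**2)

# The same 12 V RMS source into a 10 ohm + 50 mH load at two mains frequencies:
i50 = rl_load_current(12, 50, 10, 0.05)
i60 = rl_load_current(12, 60, 10, 0.05)
print(f"50 Hz: {i50:.3f} A")
print(f"60 Hz: {i60:.3f} A")
```

The current is a little lower at 60 Hz because the inductor's reactance rises with frequency; a purely resistive load, by contrast, would draw the same current at either frequency.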
 