# Increase frequency=small increase in voltage but why?

Discussion in 'General Electronics Chat' started by Bucks04, Nov 9, 2010.

1. ### Bucks04 Thread Starter New Member

I was wondering if someone could explain the theory/reason behind why a change in frequency causes a small change in voltage.

My understanding is that if the frequency went up, you would just be cutting the flux faster, and that in order to change the voltage you have to change the strength of the flux with the field current. I'm not seeing how the speed can affect voltage.

I couldn't find anything in the forum that explained this.

Thanks

2. ### marshallf3 Well-Known Member

It doesn't; only the components in a circuit can cause voltage changes.

3. ### Jaguarjoe Active Member

If you're talking about VFDs, voltage and frequency follow a 460/60 (or 7.67:1) ratio in order to maintain rated torque. If the motor ran at 30 Hz it would run at 230 volts, etc. At very low speed the voltage is a bit higher to make up for stator resistance. At speeds above 60 Hz, voltage is fixed because the DC bus can only go so high. This screws up the 7.67 ratio and affects torque.
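The constant volts-per-hertz rule above can be sketched in a few lines. This is only an illustration, not any particular drive's firmware: the 460 V / 60 Hz ratings come from the post, but the low-speed boost value and its 10 Hz threshold are made-up numbers standing in for the stator-resistance compensation mentioned above.

```python
def vfd_voltage(freq_hz, rated_v=460.0, rated_hz=60.0, boost_v=15.0):
    """Approximate output voltage of a constant volts-per-hertz drive.

    Below rated frequency, voltage tracks the 460/60 (about 7.67 V/Hz)
    ratio, with a small low-speed boost (illustrative value) to offset
    stator resistance. Above rated frequency the DC bus caps the voltage,
    so output is clamped at rated voltage and torque capability falls off.
    """
    v_per_hz = rated_v / rated_hz          # about 7.67 V/Hz
    if freq_hz >= rated_hz:
        return rated_v                     # field-weakening region: voltage capped
    v = v_per_hz * freq_hz
    if freq_hz < 10.0:                     # arbitrary low-speed threshold for this sketch
        v += boost_v * (1 - freq_hz / 10.0)
    return v

print(vfd_voltage(30))   # half speed: 230.0 V, the example from the post
print(vfd_voltage(90))   # above 60 Hz: clamped at 460.0 V
```

Clamping above 60 Hz is what "screws up the 7.67 ratio": frequency keeps rising while voltage cannot, so flux and torque drop.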

4. ### Ron H AAC Fanatic!

It sounds to me like he is talking about an electromechanical generator.

5. ### Bucks04 Thread Starter New Member

Yes, sorry, this is in regards to a generator. At work we have a PCC (can't remember what it stood for) that is the brains for the genset, and it has a volts/Hz feature. It had something to do with picking up a load and adjusting back to parameters without overshooting.

They were saying that when you change your frequency to bring it back to 60 Hz, you also change your voltage by just a little bit, but I wasn't sure how that worked.
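The coupling the OP describes follows from the standard RMS EMF equation for an AC machine, E = 4.44 f N Φ: induced voltage is proportional to frequency as well as flux, so pulling the genset back toward 60 Hz nudges the voltage too unless the regulator trims the field. A minimal sketch (the turns and flux values are made-up illustrative numbers, not from the thread):

```python
def generator_emf(freq_hz, turns, flux_wb):
    """RMS EMF of an AC generator winding: E = 4.44 * f * N * phi.

    With field flux held constant, voltage scales directly with
    frequency, which is why a small frequency correction shows up
    as a small voltage change until the AVR adjusts the field.
    """
    return 4.44 * freq_hz * turns * flux_wb

# Illustrative numbers: 100 turns, 10 mWb of flux.
low = generator_emf(59.5, 100, 0.010)   # slightly under-frequency
nom = generator_emf(60.0, 100, 0.010)   # back at 60 Hz
print(low, nom)  # the 0.5 Hz correction raises the EMF proportionally
```

So the speed does affect voltage: "cutting flux faster" means more flux linkage change per second, which is exactly what induces EMF.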

6. ### Bucks04 Thread Starter New Member

At work no one could explain to me why this happened.
