Increase frequency = small increase in voltage, but why?

Thread Starter

Bucks04

Joined Nov 9, 2010
3
I was wondering if someone could explain the theory/reason behind why a change in frequency means a small change in voltage.

My understanding is that if the frequency went up, that would just mean you are cutting flux faster, and in order to change voltage you have to change the strength of the flux with current. I'm not seeing how the speed can affect voltage.
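The textbook relation I keep coming back to is the RMS EMF equation, E ≈ 4.44·f·N·Φ, which does have frequency sitting right in it, and that's what I'm trying to square with my flux-cutting picture. Here's a quick sketch with made-up turns and flux values, just to show the trend I'm asking about:

```python
def emf_rms(freq_hz, turns=100, flux_wb=0.01):
    """RMS EMF of a winding with sinusoidal flux: E = 4.44 * f * N * phi.

    The turns and peak flux here are made-up example values; the point
    is that E scales with frequency even when the flux itself is unchanged.
    """
    return 4.44 * freq_hz * turns * flux_wb

for f in (50, 60, 61):
    print(f"{f} Hz -> {emf_rms(f):6.1f} V")
```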

I couldn't find anything in the forum that explained this.


Thanks
 

Jaguarjoe

Joined Apr 7, 2010
767
If you're talking about VFDs, voltage and frequency follow a 460/60 (about 7.67 V/Hz) ratio in order to maintain rated torque. If the motor ran at 30 Hz it would run at roughly 230 volts, etc. At very low speed the voltage is a bit higher to make up for stator resistance. At speeds above 60 Hz the voltage is fixed because the DC bus can only go so high, which breaks the 7.67 ratio and reduces the available torque.
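To illustrate, here's a minimal sketch of such a V/Hz profile in Python; the 15 V low-speed boost is a made-up example value, not from any particular drive:

```python
def vfd_voltage(freq_hz, v_rated=460.0, f_base=60.0, v_boost=15.0):
    """Command voltage for a simple V/Hz (scalar) profile.

    0..f_base Hz: voltage ramps linearly from a small boost (to cover
    stator IR drop at very low speed) up to rated voltage, roughly the
    460/60 = 7.67 V/Hz ratio described above.
    Above f_base: voltage is held at rated because the DC bus can't go
    any higher, so the effective V/Hz ratio (and torque) drops off.
    """
    if freq_hz >= f_base:
        return v_rated
    return v_boost + (v_rated - v_boost) * freq_hz / f_base

for f in (2, 30, 60, 90):
    print(f"{f:>2} Hz -> {vfd_voltage(f):6.1f} V")
```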
 

Thread Starter

Bucks04

Joined Nov 9, 2010
3
Yes, sorry, this is in regard to a generator. At work we have a PCC (can't remember what it stands for) that is the brains for the genset, and it had a volts/Hz feature. It had something to do with a load and adjusting back to the set parameters without overshooting.

They were saying that when you change your frequency to bring it back to 60 Hz, you also change your voltage by just a little bit, but I wasn't sure how that worked.
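If it behaves like a volts/Hz rolloff on a genset voltage regulator, a minimal sketch might look like the following; the knee frequency and slope are made-up example numbers, not settings from any particular controller:

```python
def avr_setpoint(freq_hz, v_nom=480.0, f_nom=60.0, knee_hz=59.0, slope=2.0):
    """Hypothetical volts/Hz rolloff for a genset voltage regulator.

    While frequency is at or above the knee, hold nominal voltage.
    If a block load drags the frequency below the knee, pull the
    voltage setpoint down in proportion to the dip (slope = per-unit
    volts per per-unit hertz) so the engine sheds some load and can
    bring the frequency back to 60 Hz without overshooting.
    """
    if freq_hz >= knee_hz:
        return v_nom
    dip_pu = (knee_hz - freq_hz) / f_nom           # how far below the knee, per unit
    return v_nom * max(0.0, 1.0 - slope * dip_pu)  # rolled-off setpoint

for f in (60.0, 59.5, 58.0, 56.0):
    print(f"{f:4.1f} Hz -> setpoint {avr_setpoint(f):5.1f} V")
```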
 