Hi there, really stuck on this: why are sine waves used in mains transformers? I know that they only operate on AC, but that's pretty much it. Also, at what flux density should a transformer normally be excited? If anyone could help me with this it would be greatly appreciated.
Because DC doesn't work in a transformer, and square waves are not a natural product of electric generators. The transformer steel must NOT become magnetically saturated; it must operate below complete saturation. If the core saturates, the magnetic flux no longer changes much with the changing input signal, so the secondary doesn't produce the proper output, since it relies on a changing magnetic flux to induce its voltage.
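To make the saturation point concrete, here is a toy numerical sketch (my own illustration, using an idealised tanh-shaped B-H curve, not real core data): once the drive pushes the core into saturation, the peak flux density barely grows even when the magnetising field is increased many times over.

```python
import numpy as np

# Toy model (an assumption for illustration, not a real core characteristic):
# flux density follows B = Bsat * tanh(H / H0), so beyond H0 the core
# saturates and extra magnetising field buys almost no extra flux.
Bsat, H0 = 1.8, 1.0                           # saturation flux density (T), field scale
t = np.linspace(0, 1 / 60, 1000)              # one 60 Hz cycle
H_small = 0.5 * np.sin(2 * np.pi * 60 * t)    # drive well below saturation
H_big = 10.0 * np.sin(2 * np.pi * 60 * t)     # 20x the drive, deep into saturation

B_small = Bsat * np.tanh(H_small / H0)
B_big = Bsat * np.tanh(H_big / H0)

# 20x the magnetising field yields only ~2.2x the peak flux:
print(round(B_big.max() / B_small.max(), 2))
```

This is why the answer above insists on staying below saturation: past that point the flux (and hence the induced secondary voltage, which follows dB/dt) stops tracking the input.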
That's great, thanks. Is there any more to it regarding why sine waves are used in transformers? I can't find much about it, and I need to know quite a bit for a case study. Also, is the Epstein frame an international standard for assessing the magnetic materials used in the transformer industry? A bit confused about that. I know that it measures the losses of electrical steels.
You might find something of interest in these pages: http://www.cliftonlaboratories.com/non-linear_transformer_behavior.htm
Simply put, sine waves have no harmonics, and real-life transformers have losses that increase with frequency. Hence a 60 Hz sine wave has lower losses than any other waveform of the same fundamental frequency, whose harmonics would cause increased losses in the transformer. Maybe back in Edison's time they weren't quite as sophisticated, so I suspect the root reason sine waves were used to begin with is that a sine wave is the waveform naturally produced by a rotating electric generator. What flux density? As Dragon24 suggests, the highest flux density that avoids approaching saturation; for the grain-oriented silicon steel used in power transformers, that typically works out to roughly 1.5 to 1.7 T, against a saturation of around 2 T.
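As a quick check of the "sine waves have no harmonics" point, here is a short sketch (my own illustration; the sample rate and one-second window are arbitrary choices) comparing the spectrum of a 60 Hz sine with an ideal 60 Hz square wave. The sine has only its fundamental; the square wave carries odd harmonics at 1/3, 1/5, ... of the fundamental amplitude, and it is exactly that extra high-frequency content that runs into the frequency-dependent losses.

```python
import numpy as np

fs = 6000                        # sample rate (Hz); 1 s window gives 1 Hz FFT bins
t = np.arange(0, 1, 1 / fs)
f0 = 60                          # fundamental frequency (Hz)

sine = np.sin(2 * np.pi * f0 * t)
square = np.sign(sine)           # ideal square wave at the same frequency

def harmonic_amplitudes(x, n_harmonics=5):
    """Amplitude of harmonics 1..n_harmonics of f0, via the FFT."""
    spectrum = np.abs(np.fft.rfft(x)) / (len(x) / 2)
    return [spectrum[k * f0] for k in range(1, n_harmonics + 1)]

print("sine  :", np.round(harmonic_amplitudes(sine), 3))
print("square:", np.round(harmonic_amplitudes(square), 3))
```

The square wave's third harmonic comes out at one third of its fundamental (the classic 4/(n*pi) Fourier series of a square wave), while the sine shows nothing above its fundamental.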
But what are the key points for sine waves in transformers, the main reason? I know that transformers are used with AC, as they don't work with DC. Sine waves are also mathematically easier to work with. Any other reasons why sine waves are used in transformers? Very hard to find out why, lol