Preferred waveform into a transformer

Thread Starter

hrs

Joined Jun 13, 2014
400
Hi,

I have read that transformer core saturation must be avoided because a saturated core no longer induces a voltage, causing the impedance to drop to the DC winding resistance, which in turn allows large currents to flow. As I understand it, this is mainly because the induced voltage is proportional to dΦ/dt, where Φ is the magnetic flux.
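
To make that explicit (a minimal sketch, treating the primary as an ideal winding of N turns in series with its DC resistance R):

v(t) = N·dΦ/dt + i(t)·R

While the core is unsaturated, almost all of the applied voltage appears across the N·dΦ/dt term. Once the core saturates, Φ barely changes no matter how much i increases, so dΦ/dt ≈ 0 and the current is limited only by the small winding resistance: i ≈ v/R.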

Furthermore, dΦ/dt is proportional to the applied voltage. Can we conclude, then, that a transformer is better driven by a sine wave than by a square wave? On the flat parts of a square wave there would be no change in flux, so the impedance would be very low(?). Probably not, because a lot of switch-mode power supplies happily chop a square wave into a transformer. What am I missing here?
 

Jony130

Joined Feb 17, 2009
5,488
In switch-mode power supplies we design the circuit in such a way that the core never saturates.
As for "on the flat parts of a square wave there will be no change in flux": this is not true. The flux will change (rising linearly), because the primary current also rises linearly. Remember the equation for flux, Φ = (I * L)/N, where I is the current, L the inductance, and N the number of turns.
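
As a quick worked example (just a sketch, assuming an ideal primary with magnetizing inductance L and a constant voltage V applied during the flat part of the square wave):

V = L·di/dt  →  i(t) = (V/L)·t        (current ramps linearly)
Φ(t) = L·i(t)/N = (V/N)·t             (flux ramps linearly too)

So dΦ/dt = V/N is constant and non-zero during the flat top, the winding keeps "pushing back", and the impedance does not collapse. Saturation is avoided by limiting the volt-seconds, i.e. switching the voltage off (or reversing it) before Φ reaches the core's saturation limit.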
http://forum.allaboutcircuits.com/t...lator-theory-of-operation.113526/#post-883379
http://forum.allaboutcircuits.com/threads/understanding-basic-transformer-theory.104524/#post-793379
 

Thread Starter

hrs

Joined Jun 13, 2014
400
Ah yes, that graph with i and v explains it. The voltage is proportional to the derivative of the current. Thanks!
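
For completeness, the same relation with a sine drive (just an illustration, ideal magnetizing inductance L assumed):

v(t) = V·sin(ωt)
i(t) = -(V/(ωL))·cos(ωt)        (the integral of v/L; lags v by 90°)
Φ(t) = L·i(t)/N = -(V/(N·ω))·cos(ωt)

Either way, sine or square, the flux is the integral of the applied voltage divided by N, and the transformer is fine as long as the peak of that flux stays below saturation.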
 