Hi,
I have read that transformer core saturation must be avoided because a saturated core no longer induces a voltage, causing the impedance to drop to the DC winding resistance, which in turn allows large currents to flow. As I understand it, this is mainly because the induced voltage is proportional to dΦ/dt, where Φ is the magnetic flux.
Furthermore, dΦ/dt is proportional to the applied voltage. Can we conclude then that a transformer is better driven by a sine wave than by a square wave? On the flat parts of a square wave there will be no change in flux, so the impedance will be very low(?). Probably not, because it seems that a lot of switch-mode power supplies happily chop a square wave into a transformer. What am I missing here?
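To make concrete the relation I'm working from (v = N·dΦ/dt, so the flux is the integral of the applied voltage), here is a minimal numerical sketch in Python. It assumes an ideal lossless winding, and the turns count, frequency, and amplitude are arbitrary illustrative values, not anything from a real design:

```python
import numpy as np

# The relation I'm working from: v(t) = N * dPhi/dt,
# so the core flux is Phi(t) = (1/N) * integral of v(t) dt.
# Ideal lossless winding assumed; N, f, V_amp are arbitrary illustrative values.
N = 100          # number of turns
f = 50.0         # drive frequency, Hz
V_amp = 10.0     # drive amplitude, V

t = np.linspace(0, 2 / f, 2000, endpoint=False)   # two periods
v = V_amp * np.sign(np.sin(2 * np.pi * f * t))    # ideal square-wave drive

dt = t[1] - t[0]
phi = np.cumsum(v) * dt / N   # numerical integral of v/N: the flux waveform

print("peak flux (Wb):", phi.max())
```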