I'm designing a buck converter with a 1500V input and a 320V, 5A output, and I'm having trouble with the inductor running far too hot. The circuit is a completely standard buck topology with a Schottky diode, built around the TL494. For now I'm only testing at 1.25A output, so it's running in discontinuous mode, at least according to simulation. My first attempt used a 470uH iron powder toroidal inductor with a 7A RMS current rating (ATCA-08-471M) switching at 100kHz, and within 10 seconds of operation the enamel on the inductor wire was smoking. I switched to a 250uH inductor with a 10A rating and doubled the frequency to 200kHz, with about the same result. I also tried doubling the frequency with the original inductor, and then running both inductors in series, and nothing gets the power dissipation even into the right ballpark. The maximum run time is around 30 seconds before the inductor temperature exceeds 150C.
Due to the high voltage my diagnostic options are somewhat limited, but I can verify that the gate drive waveform to the IGBT looks good, and the IGBT and diode are running cool (enough). Ripple current in this application is not critical, and cost is essentially no object. I just need to get the inductor power dissipation down to something reasonable. What's going on? Is it core losses? Based on simulation, my experiments, and the back-of-envelope check below, I don't believe I'm saturating the inductor. 200kHz is really pushing it as it is; I would like to be at 100kHz for the final design.
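For reference, here's the back-of-envelope check I mentioned: a minimal Python sketch of the DCM operating point for the original 470uH / 100kHz case, assuming an ideal lossless buck with the nominal values above (no diode drop, no winding resistance). It puts the peak inductor current around 3.7A and the RMS around 1.7A, both comfortably inside the 7A rating, which is consistent with not saturating. The AC flux swing, though, goes from zero to that peak every cycle, which is part of why I suspect core loss.

```python
# Back-of-envelope DCM operating point, 470uH / 100kHz case.
# Assumes an ideal lossless buck -- a sanity check, not a precise model.

V_IN = 1500.0   # input voltage (V)
V_OUT = 320.0   # output voltage (V)
L = 470e-6      # inductance (H)
F_SW = 100e3    # switching frequency (Hz)
I_OUT = 1.25    # average load current under test (A)

T = 1.0 / F_SW

# CCM/DCM boundary: average current equals half the peak-to-peak
# ripple at duty cycle D = Vout/Vin. Below this current we are in DCM.
D = V_OUT / V_IN
i_boundary = (V_IN - V_OUT) * D * T / (2 * L)

# In DCM the inductor current is a triangle from 0 to Ipk each cycle;
# averaging that triangle over the period and solving for Ipk gives
#   Iout = Ipk^2 * L * Fsw * Vin / (2 * (Vin - Vout) * Vout)
i_pk = (2 * I_OUT * (V_IN - V_OUT) * V_OUT / (L * F_SW * V_IN)) ** 0.5

t_on = i_pk * L / (V_IN - V_OUT)   # current ramp-up time (switch on)
t_fall = i_pk * L / V_OUT          # current ramp-down time (diode on)
i_rms = i_pk * ((t_on + t_fall) / (3 * T)) ** 0.5  # RMS of the triangle

print(f"CCM/DCM boundary current: {i_boundary:.2f} A (DCM below this)")
print(f"Peak inductor current:    {i_pk:.2f} A")
print(f"Conduction time:          {(t_on + t_fall) * 1e6:.1f} us of {T * 1e6:.0f} us")
print(f"RMS inductor current:     {i_rms:.2f} A")
```

This prints a boundary current of about 2.68A (so 1.25A is indeed DCM), a peak of about 3.66A, and an RMS of about 1.75A.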