I am thinking about purchasing a device that requires an input of 7500 amps at 240 volts, or about 1.8 MW of power, but only for a short pulse of about 3 microseconds each second. I think the duty cycle should be 0.000003 s / 1 s = 0.000003, or 0.0003%. Assuming an ideal switching time of zero, that would mean the device draws 1.8 MW only 0.0003% of the time, and the rest of the time it draws no power at all (aside from core losses in the transformer). So the average power should be 1,800,000 W * 0.000003 = 5.4 W. That's kind of hard to believe. Does that mean I don't need a thick copper cable, or lots of thick copper cables in parallel, even though I would be drawing (or trying to draw) 7500 amps in short bursts?
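The duty-cycle arithmetic above can be sketched directly (all values are taken from the question; nothing new is assumed):

```python
# Duty cycle and average power for a 3 us pulse repeating once per second.
peak_power = 7500 * 240        # 7500 A at 240 V -> 1.8 MW peak
pulse_width = 3e-6             # seconds per pulse
period = 1.0                   # one pulse each second
duty = pulse_width / period    # 0.000003, i.e. 0.0003%
avg_power = peak_power * duty  # 5.4 W average
```
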
0000 AWG wire has a resistance of about half a milliohm per 10 feet. So a single 10-foot 0000 AWG primary transformer winding connected directly to the breaker box would drop about 7500 * 0.0005 = 3.75 volts, resulting in 3.75^2 / 0.0005 = 28.1 kW of power dissipation, but only 0.0003% of the time. Factoring in the duty cycle of 0.000003, that works out to only about 84 mW of average power dissipation in the primary winding with such thick wire.
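The winding-dissipation numbers follow the same pattern; here is the I²R version of the calculation, which gives the same 28.1 kW / 84 mW figures (the ~0.5 mΩ resistance is the approximation quoted in the text):

```python
# Pulse and average dissipation in 10 ft of 4/0 (0000 AWG) primary wire.
R = 0.0005              # ohms, ~10 ft of 4/0 AWG (approximate figure)
I_peak = 7500.0         # amps during the pulse
duty = 3e-6             # 3 us pulse, once per second

V_drop = I_peak * R     # ~3.75 V dropped across the winding
P_pulse = I_peak**2 * R # ~28.1 kW dissipated during the pulse
P_avg = P_pulse * duty  # ~84 mW averaged over a full second
```
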
I don't think 84 mW is going to melt any copper, but what about the 28.1 kW pulses? It seems to become a question of how long it takes copper (or magnet wire varnish) to actually melt. I don't know for sure, but I'm guessing more than 3 microseconds.
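A rough adiabatic estimate (assuming all pulse energy stays in the copper, with standard handbook values for copper's density, specific heat, and the 4/0 cross-section, none of which come from the question) suggests each pulse heats the wire by a truly negligible amount:

```python
# Adiabatic temperature rise of 10 ft of 4/0 copper from one 3 us pulse.
# Material and geometry values below are standard handbook numbers
# (assumed for this estimate, not stated in the original question).
P_pulse = 28125.0    # W dissipated during the pulse (3.75 V * 7500 A)
t_pulse = 3e-6       # s
E_pulse = P_pulse * t_pulse      # ~84 mJ deposited per pulse

area = 107.2e-6      # m^2, 4/0 AWG cross-section (~107 mm^2)
length = 3.048       # m, 10 feet
density = 8960.0     # kg/m^3, copper
c_p = 385.0          # J/(kg*K), specific heat of copper

mass = area * length * density   # ~2.9 kg of copper in the winding
dT = E_pulse / (mass * c_p)      # temperature rise per pulse, in kelvin
```

With these numbers the per-pulse rise works out to well under a millikelvin, so a single 3 µs pulse is nowhere near melting anything; the wire's thermal mass absorbs it easily.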
Although converting peak power to average power through duty-cycle calculations does seem to let Ohm's law work its magic, it seems pretty non-intuitive to me. How can I possibly draw so much current through such a relatively small (less than 1/2" thick) wire at such a relatively low voltage? It just doesn't seem right. Can you completely ignore such high peak currents and design solely on average values? In this case the peak current would be 7500 amps, but the average current would be only 7500 * 0.000003 = 22.5 mA. So would I just design based on a 22.5 mA current even though there is never actually a 22.5 mA current flowing? It's either 7500 amps, almost zero, or some transitional state between the two. Admittedly, most of the time there is almost no current in the wire at all. If anyone could help clarify this for me I would be very grateful.
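One way to frame the average-versus-peak question numerically: for resistive heating, the figure that matters is the RMS current rather than the simple average, and for an ideal rectangular pulse train the RMS current is the peak scaled by the square root of the duty cycle. This is a sketch of that comparison (the RMS formula is the standard one for rectangular pulses, not something stated in the question):

```python
# Average vs RMS current for a rectangular 7500 A, 3 us, 1 Hz pulse train.
import math

I_peak = 7500.0
duty = 3e-6

I_avg = I_peak * duty              # 22.5 mA, as computed in the text
I_rms = I_peak * math.sqrt(duty)   # ~13 A; heating scales with this

R = 0.0005                         # ohms, the 10 ft of 4/0 from earlier
P_avg = I_rms**2 * R               # ~84 mW, matching the duty-cycle estimate
```

Note that I_rms² · R reproduces the same 84 mW average dissipation as the duty-cycle calculation, which is a useful consistency check on the arithmetic.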