0000 AWG wire has a resistance of about half a milliohm per 10 feet. So a single 10-foot-long 0000 AWG primary transformer winding connected directly to the breaker box would, at a peak current of 7500 amps, drop about 7500 * .0005 = 3.75 volts, giving 3.75^2 / .0005 = 28.1 kW of peak power dissipation, but only for .0003% of the time. If you apply the duty cycle of 0.000003, you end up with only about 84 mW of average power dissipation in the primary winding with such thick wire.
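Just to show my work, here is the little calculation I did to get those numbers (the 0.5 mΩ figure is the standard handbook value for 10 feet of 0000 AWG):

```python
# Sanity-checking the numbers above: voltage drop, peak power,
# and duty-cycle-averaged power in 10 ft of 0000 AWG copper.
R = 0.5e-3        # ohms, ~10 ft of 0000 AWG wire
I_peak = 7500.0   # amps, peak primary current
duty = 3e-6       # duty cycle (on 0.0003% of the time)

V_drop = I_peak * R      # voltage dropped across the winding
P_peak = V_drop**2 / R   # peak dissipation (same as I_peak**2 * R)
P_avg = P_peak * duty    # average dissipation over a full cycle

print(f"drop: {V_drop:.2f} V")       # 3.75 V
print(f"peak: {P_peak/1000:.1f} kW") # 28.1 kW
print(f"avg:  {P_avg*1000:.1f} mW")  # 84.4 mW
```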

I don't think 84 mW is going to melt any copper, but what about the 28 kW pulses? It seems to come down to how long it actually takes copper (or magnet wire varnish) to melt. I don't know for sure, but I'm guessing more than 3 microseconds.
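To convince myself, I tried a crude worst-case estimate: assume each 3 µs pulse (my guess at the pulse width) dumps all its energy into the copper with no time for heat to conduct away, and see how much the wire's temperature rises per pulse. The wire geometry and copper constants below are handbook values, not anything from a specific design:

```python
# Adiabatic heating estimate: one 3 us pulse into 10 ft of 0000 AWG,
# treating the copper as a lump with no heat loss during the pulse.
R = 0.5e-3       # ohms, 10 ft of 0000 AWG
I_peak = 7500.0  # amps
t_pulse = 3e-6   # seconds, assumed pulse width

E_pulse = I_peak**2 * R * t_pulse  # joules deposited per pulse (~84 mJ)

area = 107.2e-6  # m^2, 0000 AWG cross-section (~107 mm^2)
length = 3.048   # m, 10 feet
density = 8960.0 # kg/m^3, copper
c_p = 385.0      # J/(kg*K), copper specific heat

mass = area * length * density  # ~2.9 kg of copper
dT = E_pulse / (mass * c_p)     # temperature rise per pulse, kelvin

print(f"energy per pulse: {E_pulse*1000:.1f} mJ")
print(f"rise per pulse:   {dT*1e6:.0f} microkelvin")
```

If I did that right, each pulse heats the wire by well under a millikelvin, which is nowhere near melting anything.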

Although converting peak power to average power through duty cycle calculations does seem to allow ohm's law to work its magic, it seems pretty non-intuitive to me. How can I possibly draw so much current from such a relatively small (less than 1/2" thick) wire at such a relatively low voltage? It just doesn't seem right. Can you just completely ignore such high peak currents and design solely based on average values? In this case the peak current would be 7500 amps, but the average current would only be 22.5 mA. So would I just design based on a 22.5 mA current even though there is never really any 22.5 mA current? It's either 7500 amps or almost zero or in some transitional state between the two. Admittedly most of the time there is almost no current at all in the wire. If anyone could help clarify this stuff for me I would be very grateful.