When doing circuit analysis, the energy transferred to electromagnetic waves seems to be neglected (at least in the low-frequency analysis I have been introduced to). If the frequency of the circuit is increased, does the transfer of energy to electromagnetic radiation also increase?

For a simple example: if I had a 60 Hz source connected to a resistor, the analysis is very simple and I would assume the radiation losses are negligible. What is the case when operating at an extremely high frequency? I understand that as the frequency increases, the energy stored in any inductor's magnetic field would also increase and contribute to less power being delivered to the load. But specifically, at these higher frequencies, is there also a greater amount of energy transferred to electromagnetic waves that would have to be accounted for in the circuit?
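To give a sense of the scale I'm asking about, here is a rough back-of-envelope estimate I tried. It treats the circuit loop as an electrically small loop antenna with radiation resistance R_rad ≈ 20π²(C/λ)⁴ Ω, which is only valid when the circumference C is much smaller than the wavelength; the 10 cm loop size is just an assumed example value, not from any real circuit:

```python
import math

C_LIGHT = 3.0e8  # approximate speed of light, m/s

def radiation_resistance_small_loop(circumference_m: float, freq_hz: float) -> float:
    """Radiation resistance of an electrically small loop antenna:
    R_rad = 20 * pi^2 * (C / lambda)^4 ohms (valid only for C << lambda)."""
    wavelength = C_LIGHT / freq_hz
    return 20 * math.pi**2 * (circumference_m / wavelength) ** 4

C = 0.10  # assumed 10 cm loop circumference (hypothetical example value)
for f in (60.0, 1e6, 100e6):
    r = radiation_resistance_small_loop(C, f)
    print(f"{f:>12.0f} Hz -> R_rad = {r:.3e} ohm")
```

If I did this right, at 60 Hz the radiation resistance comes out around 10⁻²⁹ Ω, so the radiated power I²·R_rad is utterly negligible, while at 100 MHz it is already a fraction of a milliohm. Since R_rad scales as f⁴ in this model, that seems to support my intuition that radiation losses only start to matter at high frequency, but I'd like to confirm whether this is the right way to think about it.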