I found this quote online: "Ordinary electrical cables suffice to carry low frequency AC, such as mains power, which reverses direction 50 to 60 times per second. However, they cannot be used to carry currents in the radio frequency range or higher, which reverse direction millions to billions of times per second, because the energy tends to radiate off the cable as radio waves, causing power losses." I thought power losses came from heat dissipated by resistance. How do the radio waves created by this current become a power loss, and how is that loss calculated?
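To show where my confusion lies: I understand resistive loss as P = I²R. From what I've read, radiated loss is sometimes folded into the same formula via a "radiation resistance" R_rad, so a conductor dissipates I²(R_ohmic + R_rad). Here is a minimal sketch of that idea, assuming a 1 m wire can be modeled as an electrically short dipole (L ≪ λ) with the standard textbook approximation R_rad ≈ 20π²(L/λ)² — the specific lengths and frequencies are just illustrative numbers I picked:

```python
import math

def radiation_resistance_short_dipole(length_m, freq_hz):
    """Radiation resistance (ohms) of an electrically short dipole,
    using the textbook approximation R_rad ~ 20 * pi^2 * (L/lambda)^2,
    valid only when length_m is much smaller than the wavelength."""
    c = 3.0e8                      # speed of light, m/s
    wavelength = c / freq_hz       # lambda = c / f
    return 20.0 * math.pi**2 * (length_m / wavelength) ** 2

# Compare a 1 m conductor at mains frequency vs. a radio frequency.
# Radiated power for a 1 A current is P = I^2 * R_rad.
for f in (50.0, 10e6):
    r_rad = radiation_resistance_short_dipole(1.0, f)
    print(f"{f:>12.0f} Hz: R_rad = {r_rad:.3e} ohm, P = {r_rad:.3e} W at 1 A")
```

If this picture is right, it would explain the quote: at 50 Hz the radiation resistance of a short wire is vanishingly small compared to its ohmic resistance, while at radio frequencies it becomes comparable or larger. Is that the correct way to think about it, or is the mechanism different?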