antenna power

Discussion in 'General Electronics Chat' started by gorgondrak, Nov 18, 2014.

  1. gorgondrak

    Thread Starter Member

    Nov 17, 2014
    I got this quote from online: "Ordinary electrical cables suffice to carry low frequency AC, such as mains power, which reverses direction 50 to 60 times per second. However, they cannot be used to carry currents in the radio frequency range or higher, which reverse direction millions to billions of times per second, because the energy tends to radiate off the cable as radio waves, causing power losses." I thought power losses came from heat dissipated by the resistance. How do the radio waves created by this current become a power loss, and how is that calculated?
  2. t_n_k

    AAC Fanatic!

    Mar 6, 2009
    Technology exists to provide transmission of high frequency signals along power lines concurrent with the primary role of power distribution.
    So the statement is not strictly correct.
    If radio waves propagate from a conductor they carry away a certain amount of energy as they radiate from the conductor. The energy has to come from somewhere. The primary source driving the cable and load must therefore account for the energy lost.
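    As for how it's calculated: a rough way is to model the cable (or a segment of it) as an electrically short dipole and use its radiation resistance, R_rad ≈ 20·π²·(L/λ)² ohms for length L much less than the wavelength λ. The radiated power is then I²·R_rad, just like ohmic loss but with the "resistance" representing energy carried off as radio waves. A minimal sketch (the 1 m length and 1 A current are just illustrative numbers):

```python
import math

C = 3.0e8  # speed of light in m/s

def radiation_resistance(length_m, freq_hz):
    """Radiation resistance of an electrically short dipole (L << wavelength)."""
    wavelength = C / freq_hz
    return 20.0 * math.pi**2 * (length_m / wavelength)**2

def radiated_power(i_rms, length_m, freq_hz):
    """Power lost to radiation, analogous to I^2 * R ohmic loss."""
    return i_rms**2 * radiation_resistance(length_m, freq_hz)

# 1 m of conductor carrying 1 A RMS at mains, AM, and FM frequencies:
for f in (60.0, 1e6, 100e6):
    r = radiation_resistance(1.0, f)
    p = radiated_power(1.0, 1.0, f)
    print(f"{f:>12.0f} Hz: R_rad = {r:.3e} ohm, P_rad = {p:.3e} W")
```

    The (L/λ)² dependence is the whole story here: at 60 Hz the wavelength is 5000 km, so the radiation resistance of a 1 m run is around 10^-11 ohm and the radiated power is utterly negligible, while at 100 MHz the same wire has a radiation resistance of tens of ohms and radiates strongly. (Note the short-dipole formula itself stops being valid once L approaches λ.)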
  3. Shagas

    Active Member

    May 13, 2013
    "Loss" can come from any irreversible transformation of one energy to another. That includes, electric -> heat , electric -> motion , electric -> radiation (this one includes light)
  4. alfacliff

    Well-Known Member

    Dec 13, 2013
    Because power cables are not specified for impedance, and proper impedance matching is necessary for transmission of RF energy, you get too much loss. The power lines are also not very well balanced, which causes more radiation loss than necessary when they act as antennas.
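    To put a number on the mismatch part: the reflection coefficient at a load is Γ = (Z_load − Z0)/(Z_load + Z0), and the fraction of incident power reflected back (not delivered) is |Γ|². A quick sketch, using 50 Ω as the assumed source/line impedance and a made-up 300 Ω for an uncharacterized power cable:

```python
def reflected_fraction(z_load, z0):
    """Fraction of incident power reflected at an impedance mismatch."""
    gamma = (z_load - z0) / (z_load + z0)  # reflection coefficient
    return abs(gamma) ** 2

# Matched load: everything is delivered.
print(reflected_fraction(50.0, 50.0))   # 0.0

# Badly mismatched load: over half the power bounces back.
print(reflected_fraction(300.0, 50.0))  # ~0.51
```

    So on a line with no controlled characteristic impedance, a large chunk of the RF power never reaches the load in the first place, on top of whatever the unbalanced line radiates away.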