antenna power

Thread Starter

gorgondrak

Joined Nov 17, 2014
61
I got this quote from online: "Ordinary electrical cables suffice to carry low frequency AC, such as mains power, which reverses direction 50 to 60 times per second. However, they cannot be used to carry currents in the radio frequency range or higher, which reverse direction millions to billions of times per second, because the energy tends to radiate off the cable as radio waves, causing power losses." I thought power losses came from heat dissipated by the resistance. How do the radio waves created by this current become a power loss, and how is that calculated?
 

t_n_k

Joined Mar 6, 2009
5,455
Technology exists to transmit high-frequency signals along power lines concurrently with their primary role of power distribution.
So the statement is not strictly correct.
If radio waves propagate from a conductor, they carry away a certain amount of energy as they radiate. That energy has to come from somewhere, so the source driving the cable and load must supply it, and it appears as a loss.
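To put a rough number on it: if you treat a short, unshielded cable section as an electrically short (Hertzian) dipole with uniform current, the classic radiation-resistance formula gives an order-of-magnitude estimate of the radiated power. The sketch below is purely illustrative; the 1 m length and 0.1 A current are made-up values, not a model of any particular cable.

```python
import math

def radiation_resistance_short_dipole(length_m: float, freq_hz: float) -> float:
    """Radiation resistance of an electrically short dipole with uniform current:
    R_rad = 80 * pi^2 * (L / lambda)^2. Only valid when length << wavelength."""
    c = 3.0e8                      # speed of light, m/s
    wavelength = c / freq_hz
    return 80.0 * math.pi**2 * (length_m / wavelength) ** 2

def radiated_power(i_rms_a: float, length_m: float, freq_hz: float) -> float:
    """Radiated power P = I_rms^2 * R_rad -- same form as I^2 * R heating,
    but the 'resistance' represents energy leaving as radio waves."""
    return i_rms_a ** 2 * radiation_resistance_short_dipole(length_m, freq_hz)

# Example: 1 m of exposed conductor carrying 0.1 A RMS at mains, 1 MHz, 10 MHz
for f in (50.0, 1e6, 10e6):
    r = radiation_resistance_short_dipole(1.0, f)
    p = radiated_power(0.1, 1.0, f)
    print(f"{f:>10.3e} Hz: R_rad = {r:.3e} ohm, P_rad = {p:.3e} W")
```

Running it shows why nobody worries about radiation at 50 Hz (the radiation resistance is vanishingly small) while at RF the same current radiates a measurable fraction of a watt per metre.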
 

Shagas

Joined May 13, 2013
804
"Loss" can come from any irreversible transformation of one energy to another. That includes, electric -> heat , electric -> motion , electric -> radiation (this one includes light)
 

alfacliff

Joined Dec 13, 2013
2,458
Because power cables are not specified for a controlled characteristic impedance, proper impedance matching is needed to transmit RF energy; otherwise there is too much loss. Power lines are also not very well balanced, which causes more radiation loss than necessary when they act as antennas.
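To quantify the mismatch point: with a line of characteristic impedance Z0 feeding a load Z_L, the reflection coefficient tells you how much of the incident power bounces back instead of being delivered. A minimal sketch, assuming a 50 ohm RF source and a made-up 300 ohm "power-cable-like" load impedance:

```python
def reflection_coefficient(z_load: complex, z0: complex) -> complex:
    """Voltage reflection coefficient: Gamma = (Z_L - Z0) / (Z_L + Z0)."""
    return (z_load - z0) / (z_load + z0)

# Illustrative values only -- real power cables have no controlled impedance.
z0 = 50.0
z_load = 300.0
gamma = reflection_coefficient(z_load, z0)
reflected_fraction = abs(gamma) ** 2           # fraction of incident power reflected
delivered_fraction = 1.0 - reflected_fraction  # fraction actually transferred

print(f"|Gamma| = {abs(gamma):.2f}, "
      f"reflected: {reflected_fraction:.1%}, delivered: {delivered_fraction:.1%}")
```

With these numbers roughly half the incident power is reflected, before counting whatever the unbalanced line radiates.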
 