Accounting for Energy Transfer to Electromagnetic Waves at High Frequency

Thread Starter

andrewdero

Joined Aug 9, 2019
12
When doing circuit analysis, the energy transferred to electromagnetic waves seems to be neglected (at least in the low-frequency analysis I have been introduced to). If the frequency of the circuit is increased, does the transfer of energy to electromagnetic radiation increase? For a simple example, if I had a 60 Hz source connected to a resistor, the analysis is very simple and I would assume the EM radiation losses are negligible. What is the case for operating at an extremely high frequency? I understand that as the frequency increases, the energy stored in any inductor's magnetic field would also increase and contribute to less power delivered to the load. But specifically, at these higher frequencies, is there also a greater amount of energy transferred to electromagnetic waves that would have to be accounted for in the circuit?
 

Papabravo

Joined Feb 24, 2006
16,103
At any RF frequency, a source may be connected to a "radiator" via a transmission line. At the interface between the source and the transmission line there is likely an "impedance discontinuity". There is another "impedance discontinuity" between the transmission line and the "radiator" (aka antenna). At each "impedance discontinuity" a portion of the RF energy will be transmitted through the discontinuity and a portion of the power will be reflected back to the source. The transmission line itself will introduce some loss or attenuation, and the second "impedance discontinuity" will likewise transmit some of the power and reflect some back to the source.

Good system engineering will minimize the reflections and maximize the radiated power. What is true for a longwave transmitter at 137 kHz is also true for a 47 GHz transmitter. The same principles apply.

In circuit analysis, the wavelength of low frequency signals is so huge (thousands of meters), that short bits of wire or PCB trace make really crappy radiators.

In case you are interested, the wavelength in meters is approximately equal to 300 divided by the frequency in Megahertz.
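As a quick check of that rule of thumb, here is a minimal Python sketch (the exact calculation just uses the speed of light; the 300/MHz shortcut is accurate to better than 0.1%):

```python
# Rule of thumb from the post above: wavelength (m) ~ 300 / frequency (MHz).
# The exact value uses c = 299,792,458 m/s.

C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength in meters for a given frequency in Hz."""
    return C / freq_hz

for f_hz, label in [(60.0, "60 Hz mains"), (137e3, "137 kHz longwave"), (47e9, "47 GHz")]:
    print(f"{label}: {wavelength_m(f_hz):.3g} m")
```

At 60 Hz the wavelength comes out near 5,000 km, which is why mains wiring makes such a poor radiator.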
 
Last edited:

Thread Starter

andrewdero

Joined Aug 9, 2019
12
At any RF frequency, a source may be connected to a "radiator" via a transmission line. At the interface between the source and the transmission line there is likely an "impedance discontinuity". There is another "impedance discontinuity" between the transmission line and the "radiator" (aka antenna). At each "impedance discontinuity" a portion of the RF energy will be transmitted through the discontinuity and a portion of the power will be reflected back to the source. The transmission line itself will introduce some loss or attenuation, and the second "impedance discontinuity" will likewise transmit some of the power and reflect some back to the source.

Good system engineering will minimize the reflections and maximize the radiated power. What is true for a longwave transmitter at 137 kHz is also true for a 47 GHz transmitter. The same principles apply.
What about along the portions of the line without a discontinuity? Electromagnetic waves are generated by the acceleration of electrons. I understand how this can occur at a bend or discontinuity. However, when connected to an AC source, it doesn't seem to me that a discontinuity is needed to radiate, since alternating current of course changes with respect to time. Also, do the losses caused by radiation increase with frequency?
 

Papabravo

Joined Feb 24, 2006
16,103
Discontinuities occur where there are connectors. Along a transmission line the geometry of the conductors maintains a constant characteristic impedance. For example if the transmission line is a coaxial cable, no radiation from the center conductor can penetrate the shield.

And since we are dealing with vector quantities, we can change either the magnitude or the direction to produce acceleration. The same is not true for scalar quantities.

Radiation does not need a discontinuity. The impedance of free space is approximately 377 Ω. Efficient antennas are pretty close to that value.
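That 377 Ω figure can be checked directly from the vacuum constants; a minimal sketch (constants are the standard textbook values):

```python
import math

# Impedance of free space: eta_0 = sqrt(mu_0 / epsilon_0), about 376.73 ohms,
# commonly rounded to 377 ohms as in the post above.
MU_0 = 4 * math.pi * 1e-7        # vacuum permeability, H/m (classical value)
EPSILON_0 = 8.8541878128e-12     # vacuum permittivity, F/m

eta_0 = math.sqrt(MU_0 / EPSILON_0)
print(f"Impedance of free space: {eta_0:.2f} ohms")  # ~376.73
```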
 

Thread Starter

andrewdero

Joined Aug 9, 2019
12
Discontinuities occur where there are connectors. Along a transmission line the geometry of the conductors maintains a constant characteristic impedance. For example if the transmission line is a coaxial cable, no radiation from the center conductor can penetrate the shield.

And since we are dealing with vector quantities, we can change either the magnitude or the direction to produce acceleration. The same is not true for scalar quantities.
Coaxial cables don't radiate outside of the cable because they are designed so that the fields generated by the sending and return currents cancel each other out. The same cannot be said for, say, a straight wire. A straight wire without discontinuities, excited by an AC source, will still create electromagnetic radiation as I understand it. The question is whether, at a higher frequency, a greater loss will be experienced along the wire due to electromagnetic radiation compared to a lower frequency.
 

nsaspook

Joined Aug 27, 2009
8,382
One way to look at this is to see that (in most cases) circuit energy is actually transferred by fields, as components of EM waves with time-changing fields. 'Near' (in electrical lengths) the charges and sources, the field energy behaves in a manner that keeps it constrained close to the conductors of the circuit, provided the physical length of the conductors is much smaller than the electrical length of the changing fields. We can then use a zero-length conductor assumption (no potential non-uniformity and no charge acceleration across zero-length conductors) that keeps all energy constrained to the circuit, unless that energy is coupled to another near-field component, as in a transformer. This is a consequence of the finite speed of energy (the speed of light).

At high frequency, the electrical length of the energy starts to approach the physical length of a conductor or a circuit discontinuity. This means we can no longer maintain the zero-length assumption, because currents and voltages across the circuit are non-uniform: it takes time for the energy to travel from one point to another. This gives rise to potential differences across conductors that accelerate charges, which in turn can produce far-field electromagnetic radiation and energy transfer from the circuit into free space.
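The "zero-length conductor" (lumped-circuit) assumption is often reduced to a rule of thumb: treat a conductor as lumped while it is shorter than roughly a tenth of a wavelength. A minimal sketch (the λ/10 threshold is a common convention, not a hard physical law):

```python
C = 299_792_458.0  # speed of light, m/s

def is_electrically_short(length_m: float, freq_hz: float, threshold: float = 0.1) -> bool:
    """True if the conductor is short enough for lumped (zero-length) analysis.

    Uses the common lambda/10 rule of thumb; the threshold is a convention.
    """
    wavelength = C / freq_hz
    return length_m < threshold * wavelength

# A 10 cm PCB trace: safely lumped at 60 Hz, not at 3 GHz.
print(is_electrically_short(0.10, 60.0))  # True  (wavelength ~ 5000 km)
print(is_electrically_short(0.10, 3e9))   # False (wavelength ~ 10 cm)
```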
 
Last edited:

crutschow

Joined Mar 14, 2008
27,184
Basically, the closer the wavelength of the signal (speed of the wave divided by the signal frequency) gets to the length of the unshielded conductor, the more the conductor tends to radiate.
Thus at low frequencies you need a really long conductor before there is significant radiation.
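One way to put a number on this: for an electrically short dipole, the standard small-antenna approximation R_rad = 20·π²·(l/λ)² shows the radiation resistance, and hence the radiated loss for a given current, growing with the square of frequency. A sketch (the function name is mine; the formula only holds while l ≪ λ):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def short_dipole_rrad(length_m: float, freq_hz: float) -> float:
    """Radiation resistance (ohms) of an electrically short dipole.

    Standard small-antenna approximation R = 20 * pi^2 * (l / lambda)^2,
    valid only while the dipole is much shorter than a wavelength.
    """
    lam = C / freq_hz
    return 20 * math.pi**2 * (length_m / lam) ** 2

# The same 1 m wire radiates far more effectively as frequency rises:
for f in (60.0, 1e6, 10e6):
    print(f"{f:>10.0f} Hz: R_rad ~ {short_dipole_rrad(1.0, f):.3e} ohms")
```

At 60 Hz the result is picohms (utterly negligible next to copper resistance), while at 10 MHz it is already a fraction of an ohm, doubling the frequency quadruples it.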
 

Thread Starter

andrewdero

Joined Aug 9, 2019
12
Basically, the closer the wavelength of the signal (speed of the wave divided by the signal frequency) gets to the length of the unshielded conductor, the more the conductor tends to radiate.
Thus at low frequencies you need a really long conductor before there is significant radiation.
So would a high frequency on a long conductor radiate less than a lower frequency on the same conductor, if the wavelength of the lower-frequency signal is closer to the length of the conductor?
 

nsaspook

Joined Aug 27, 2009
8,382
So would a high frequency on a long conductor radiate less than a lower frequency on the same conductor, if the wavelength of the lower-frequency signal is closer to the length of the conductor?
Your question is not very precise. There are far too many factors to give an answer in general.
 
Last edited:

Thread Starter

andrewdero

Joined Aug 9, 2019
12
Basically, the closer the wavelength of the signal (speed of the wave divided by the signal frequency) gets to the length of the unshielded conductor, the more the conductor tends to radiate.
Thus at low frequencies you need a really long conductor before there is significant radiation.
Looked a little more into this, and I found that antennas such as dipoles are typically designed to be a half wavelength long (other designs, such as quarter-wavelength antennas that take advantage of mirroring due to the ground, also work), while loop antennas can be designed around a full wavelength, acting like a folded dipole. So when you say that the closer the wavelength is to the unshielded conductor the more it radiates, do you mean the length of the conductor including the return path approaching the wavelength? Meaning problems would begin to occur when the length of the conductor feeding the source is half the wavelength, similar to a dipole?
 
Consider this question from my Doctoral boards: A capacitor "C" is charged to a voltage "V". It is connected to a second, identical capacitor through a switch. What is the total energy in both caps before and after the switch closes? If not the same, where did it go?
 

nsaspook

Joined Aug 27, 2009
8,382

Actually, another explanation assumes zero resistance but non-zero inductance in the wires. Then, half of the energy has been radiated as e-m waves during the decaying oscillations. As inductance approaches zero, the frequency approaches infinity.
Of course that can't happen, because in the real world of real physics the speed of causality is limited. In effect, this means there is resistance in the circuit in the form of antenna radiation resistance as the dissipative element. https://en.wikipedia.org/wiki/Radiation_resistance
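The energy bookkeeping behind the two-capacitor question can be checked in a few lines (a sketch; the function name is mine). Charge conservation fixes the final voltage, and exactly half the energy is gone, dissipated in resistance and/or radiated, no matter how small the loss elements are:

```python
# Two-capacitor problem: a charged cap C at voltage V is switched onto an
# identical uncharged cap. Charge Q = C*V is conserved and spreads over 2C,
# so the final voltage is V/2, and half the initial energy is lost.

def two_cap_energies(c_farads: float, v_initial: float):
    """Return (energy before, energy after) closing the switch."""
    e_before = 0.5 * c_farads * v_initial**2        # one charged cap
    v_final = v_initial / 2                         # Q conserved over 2C
    e_after = 2 * (0.5 * c_farads * v_final**2)     # two caps at V/2
    return e_before, e_after

before, after = two_cap_energies(1e-6, 10.0)
print(before, after)  # 5e-05 2.5e-05 -> exactly half remains
```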
 
Last edited:

Delta Prime

Joined Nov 15, 2019
994
Hello there :) I have something to contribute.
All transmission lines exhibit some power loss. Losses occur in the resistance that is inherent in the conductors that make up the line, and from leakage currents flowing in the dielectric material between the conductors. Real transmission lines do not extend to infinity; they have a definite length, connecting from the source and terminating in the load, the antenna. If the load is a pure resistance whose value equals the characteristic impedance of the line, the line is said to be matched. Energy traveling along a matched line from the source behaves, at the end of the line, as though there were still more transmission line of the same characteristic impedance; it is completely absorbed by the load (the antenna) and radiated into the atmosphere.

Now consider a transmission line that terminates in a load that is not equal to the characteristic impedance of the transmission line; the line is now a mismatched line. RF energy reaching the end of a mismatched line will not be fully absorbed by the load impedance (the antenna). Instead, part of the energy will be reflected back towards the source. The amount of reflected versus absorbed energy depends on the degree of mismatch between the characteristic impedance of the line and the load impedance connected to its end. That is the reason energy is reflected at an impedance discontinuity on a transmission line.

I will share the way it was taught to me, with an extreme case where the transmission line is shorted at the end. Energy flowing to the load encounters the short at the end: the voltage at that point goes to zero while the current rises to a maximum. Since the current cannot develop any power in the dead short, it will all be reflected back towards the source generator. If the short at the end of the line is replaced with an open circuit, the opposite happens: the voltage rises to a maximum and the current, by definition, goes to zero. The phase will reverse and all the energy will be reflected back towards the source. By the way, if this sounds to you like what happens at the end of a half-wave dipole antenna, you are right! I hope this helps you to see it from my perspective, and I know you're more than capable of applying the necessary calculations. ;)
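Those short/open/matched cases can be summarized by the voltage reflection coefficient Γ = (Z_L − Z_0)/(Z_L + Z_0). A small sketch (the 50 Ω line and 75 Ω load are illustrative values, not from the thread):

```python
def reflection_coefficient(z_load: complex, z0: float) -> complex:
    """Voltage reflection coefficient at a load terminating a line of impedance z0."""
    return (z_load - z0) / (z_load + z0)

Z0 = 50.0  # a common characteristic impedance, chosen for illustration

print(reflection_coefficient(0.0, Z0))    # short: -1.0 (full reflection, phase reversed)
print(reflection_coefficient(1e12, Z0))   # ~open: approaches +1 (full reflection)
print(reflection_coefficient(50.0, Z0))   # matched: 0.0 (no reflection)

# Fraction of incident power reflected by a 75-ohm load on a 50-ohm line:
print(abs(reflection_coefficient(75.0, Z0)) ** 2)  # 0.04, i.e. 4% reflected
```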
 

andrewmm

Joined Feb 25, 2011
1,464
When doing circuit analysis, the energy transferred to electromagnetic waves seems to be neglected (at least in the low-frequency analysis I have been introduced to). If the frequency of the circuit is increased, does the transfer of energy to electromagnetic radiation increase? For a simple example, if I had a 60 Hz source connected to a resistor, the analysis is very simple and I would assume the EM radiation losses are negligible. What is the case for operating at an extremely high frequency? I understand that as the frequency increases, the energy stored in any inductor's magnetic field would also increase and contribute to less power delivered to the load. But specifically, at these higher frequencies, is there also a greater amount of energy transferred to electromagnetic waves that would have to be accounted for in the circuit?

Not certain where this is going ,

But.

At lower frequencies, good old V = IR and the like works just fine.

Any current down a wire causes a field; Maxwell comes to mind.

As the frequency goes up, the simple V = IR starts to be less relevant.
At GHz frequencies, for instance, most of the power travels in the dielectric!
But the driving force is in the surface of the conductor (you can't get rid of the copper).

You have to choose the equations and thinking depending upon the frequency

For instance, my RF friends make complete circuits by just cutting different lengths and shapes in the copper;
they add active parts, but all the L & C is done in copper!
An obvious example of this is an antenna, where it's the size / shape / placement of the elements that matters.

Sorry, don't know if this helps or hinders,
 

nsaspook

Joined Aug 27, 2009
8,382
The circuit power/energy is always in the dielectric (the space surrounding the wires), no matter the frequency. At low frequencies we just use the circuit-theory model, which doesn't need the EM field equations, but the energy-transporting fields are still there. Any power (electron KE) in the copper is wasted as heat, unless you want to make a heater.
 