Why is the total power of an AM signal the sum of the sideband and carrier powers?

Thread Starter

phlstr

Joined Oct 4, 2012
4
I need help understanding why the total power delivered by an AM signal is equal to the sum of the power delivered by the carrier and the sidebands. I know that an AM signal can be decomposed into different sinusoids.

What I have trouble understanding is why the formula for the total power is \(P_{T}=\frac{(E_{c})^{2}}{R}+\frac{(E_{m})^{2}}{2R}\) and not \(P_{T}=\frac{(E_{c}+2E_{m})^{2}}{R}\), because the former formula seems to contradict the result obtained when applying the superposition principle.

Thank you for your time.
 

K7GUH

Joined Jan 28, 2011
190
For the purpose of this discussion, think of total power as the area under the curve, regardless of shape. That's not an elegant scientific answer, but it may make sense to you.
 

vk6zgo

Joined Jul 21, 2012
677
In the frequency domain, you may regard the useful output of an amplitude modulator (in the simple case of modulation with a single tone fm) as three discrete signals:

The carrier: fc

The lower sideband (LSB): fc - fm

The upper sideband (USB): fc + fm

This is, in fact, what you will see with a spectrum analyser.

If you look at the same signal with an oscilloscope, in the time domain, you will see the familiar AM modulation envelope, because the 'scope cannot tell the difference between these signals and displays them all.
The signal voltages add and cancel to give the traditional display.

You can now provide a water-cooled test load, with gauges to determine the water flow, and temperature gauges at the input and output of the water path (good 1800s technology).

With such a device, you can determine the power input to the test load, knowing the water flow rate and the temperature rise. (I forget the formula, but it is available on the 'Net.)

Assuming you have a fairly high-power transmitter, if you now feed its output to the test load, you can determine its output power.

If you remove the modulation, you will see the power from just the carrier, and if you apply modulation, you will see the total power of the modulated signal (carrier plus both sidebands).

This test is actually done in the real world with high-powered equipment.

For low-power signals, you will need a true-power meter to perform this test.

Power doesn't really care where the signal comes from; you can measure the power (within a limited spectrum) of a white-noise signal, if you have a sensitive enough instrument.
 

Thread Starter

phlstr

Joined Oct 4, 2012
4
OK, so I now know the formula has probably been verified by experiment. But I'm still confused; I want to resolve the conflicting ideas in my head and find out where I'm wrong.

So I know an AM signal can be thought of as composed of sinusoids. So something like this:


I want to understand why the formula seems to say that the total average power is the sum of the powers dissipated by the resistor with only one source connected at a time. So, in a way, I think the formula uses the superposition principle to get the total power, which is wrong.

If I were to apply what I've learned from circuit analysis, in order to get the average power dissipated in the resistor, you would need to do something like this:
(let's call the three ac sources as v1, v2, and v3; period is T)

\(\frac{1}{RT}\int_{0}^{T}(V_{1}+V_{2}+V_{3})^{2}\,dt\)

and not

\(\frac{1}{RT_{1}}\int_{0}^{T_{1}} V_{1}^{2}\,dt+\frac{1}{RT_{2}}\int_{0}^{T_{2}} V_{2}^{2}\,dt+\frac{1}{RT_{3}}\int_{0}^{T_{3}} V_{3}^{2}\,dt\)

as the formula seems to suggest. I also tried checking whether the results of the two integrals are the same using Wolfram Alpha, but they're not.
 
Last edited:

MrChips

Joined Oct 2, 2009
30,801
\((x + y)^2\) is certainly not the same as \((x^2 + y^2)\).

Also be aware that the modulated output is the product of the signal and carrier, not the sum of the two.
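To see this numerically, here is a quick sketch (all component values are arbitrary, chosen only for illustration) showing that the product form of a tone-modulated AM signal is pointwise identical to the sum of a carrier and two sidebands:

```python
import numpy as np

# Hypothetical values for illustration: carrier 10 V / 100 kHz,
# modulating tone 5 V / 1 kHz, so modulation index m = Vm/Vc = 0.5.
Vc, fc = 10.0, 100e3
Vm, fm = 5.0, 1e3
m = Vm / Vc

t = np.linspace(0, 1e-3, 200_000, endpoint=False)  # one full cycle of fm

# Product form: carrier multiplied by (1 + m*sin(wm*t))
product = Vc * (1 + m * np.sin(2 * np.pi * fm * t)) * np.sin(2 * np.pi * fc * t)

# Sum form from the trig identity: carrier plus two sidebands of peak m*Vc/2
sum_form = (Vc * np.sin(2 * np.pi * fc * t)
            + (m * Vc / 2) * np.cos(2 * np.pi * (fc - fm) * t)
            - (m * Vc / 2) * np.cos(2 * np.pi * (fc + fm) * t))

print(np.max(np.abs(product - sum_form)))  # ~0: the two forms are identical
```

The modulator multiplies, but the trig identity turns the product into a sum of three sinusoids, which is why a power-per-spectral-component bookkeeping is possible at all.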
 

vk6zgo

Joined Jul 21, 2012
677
I'm too out of practice with maths, but what you seem to keep doing is adding the voltages and then squaring them.
This will not give you the sum of the powers of the individual signals.

I'm not even too happy with this one from your first posting:


If the power from each sideband is equal, I think the "2" should be in the numerator, not the denominator, where Em is the voltage of one sideband.

Em seems like the incorrect term for the sidebands, as it would normally refer to the voltage of the modulating signal.
That is not the case here, however, because the relationship between that signal and the carrier is more complex.

As MrChips says, "the modulated output is the product of the (modulating) signal and carrier, not the sum of the two."

It is important to know what happens in practice, as then you have to find how to make the maths agree with the facts!
 

Thread Starter

phlstr

Joined Oct 4, 2012
4
The equations in my first post are actually wrong. Sorry for that. It's actually like this: (I used Ec and Em instead of Vc and Vm)


The sqrt(2)'s were added to convert the peak voltages to RMS. The other 2's are part of the amplitudes of the sinusoids from this equation:




which can be derived using trigonometric identities from the equation defining a sine-wave carrier with amplitude Vc and frequency fc modulated by another sine wave with amplitude Vm and frequency fm:
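For reference, a sketch of the standard expansion, assuming a sine carrier of amplitude \(V_c\), a single modulating tone, and modulation index \(m = V_m/V_c\):

\[
V_c\left[1 + m\sin(2\pi f_m t)\right]\sin(2\pi f_c t)
= V_c\sin(2\pi f_c t)
+ \frac{mV_c}{2}\cos\big(2\pi(f_c - f_m)t\big)
- \frac{mV_c}{2}\cos\big(2\pi(f_c + f_m)t\big)
\]

Each sideband has peak amplitude \(mV_c/2\), so its average power into a resistor \(R\) is \(\frac{(mV_c/2)^2}{2R}\).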



What I have trouble understanding is why the equation for the total power is this:


and not something like this:

\(\frac{1}{RT}\int_{0}^{T}(V_{1}+V_{2}+V_{3})^{2}\,dt\)

where V1 is the first term, V2 is the second term, etc. of this equation:
 

WBahn

Joined Mar 31, 2012
30,051
What I have trouble understanding is why the formula for the total power is \(P_{T}=\frac{(E_{c})^{2}}{R}+\frac{(E_{m})^{2}}{2R}\) and not \(P_{T}=\frac{(E_{c}+2E_{m})^{2}}{R}\), because the former formula seems to contradict the result obtained when applying the superposition principle.

Thank you for your time.
Are you sure that 2 in the first equation belongs in the denominator?

I skimmed the other responses and may have missed someone else pointing this out, but in case I didn't, the answer to your basic question is simple:

Superposition ONLY works for linear relationships. Power is not linear. Thus, superposition can't be used.
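A two-line sketch of that point (the 3 V and 4 V sources are arbitrary values for illustration): superposing the individually-dissipated powers gives the wrong total whenever both sources drive the load at once:

```python
# Two DC sources in series across the same 1-ohm resistor.
V1, V2, R = 3.0, 4.0, 1.0

p_individual = V1**2 / R + V2**2 / R   # 9 + 16 = 25 W (superposing powers)
p_actual = (V1 + V2)**2 / R            # 49 W (the real dissipation)

print(p_individual, p_actual)          # 25.0 49.0 -- superposition fails for power
```

The 24 W discrepancy is exactly the cross term 2*V1*V2/R, which superposition silently drops.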
 

WBahn

Joined Mar 31, 2012
30,051
The fact that superposition doesn't apply is one key point, but it doesn't quite cover everything in this case.

Let's say that we have a voltage signal that is the addition of two other signals:

z(t) = x(t) + y(t).

If we wanted the average power (into a 1 ohm load) of z(t), we find the average of the square of z(t).

But this is NOT the same as adding the averages of the squares of x(t) and y(t). Yet this would seem to be exactly what the rule regarding total power in an RF signal is doing. So what gives?

The basic problem is that when we compute

[z(t)]^2

we get

[x(t) + y(t)]^2

= [x(t)]^2 + 2x(t)y(t) + [y(t)]^2

We are claiming that we get just

[x(t)]^2 + [y(t)]^2

So what happened to the cross-product term?

Remember that, first, we are talking about spectral components, so x(t) and y(t) are sinusoids at different frequencies and, possibly, different phases. Second, remember that we have to take the average of the square.

So, what do we know about the average of the product of two sinusoids of different frequencies? It vanishes! So, in general, we can't neglect the cross-product terms; but in the case of signals that are made up by adding a bunch of sinusoids together, we can.
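A quick numeric sketch of this orthogonality (the frequencies and amplitudes are arbitrary choices): averaged over a common period, the cross term vanishes, so the mean-square of the sum equals the sum of the mean-squares:

```python
import numpy as np

# Two tones at different frequencies, sampled over one common period.
f1, f2 = 3.0, 5.0
T = 1.0                                    # 1 s contains whole cycles of both
t = np.linspace(0, T, 1_000_000, endpoint=False)

x = 2.0 * np.sin(2 * np.pi * f1 * t)
y = 1.5 * np.sin(2 * np.pi * f2 * t)

cross = np.mean(x * y)                     # average of the cross term
p_sum = np.mean((x + y)**2)                # mean-square of the sum
p_parts = np.mean(x**2) + np.mean(y**2)    # sum of the mean-squares

print(cross)            # ~0: the cross term averages away
print(p_sum, p_parts)   # both ~ 2^2/2 + 1.5^2/2 = 3.125
```

This is exactly why the AM power rule can add carrier and sideband powers component by component: carrier, LSB, and USB sit at three different frequencies.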
 

Thread Starter

phlstr

Joined Oct 4, 2012
4
So, what do we know about the average of the product of two sinusoids of different frequencies? It vanishes! So, in general, we can't neglect the cross-product terms; but in the case of signals that are made up by adding a bunch of sinusoids together, we can.
Thank you! I think this answers my questions.

I also tried evaluating these integrals again:
\(\frac{1}{RT}\int_{0}^{T}(V_{1}+V_{2}+V_{3})^{2}\,dt\)

\(\frac{1}{RT_{1}}\int_{0}^{T_{1}} V_{1}^{2}\,dt+\frac{1}{RT_{2}}\int_{0}^{T_{2}} V_{2}^{2}\,dt+\frac{1}{RT_{3}}\int_{0}^{T_{3}} V_{3}^{2}\,dt\)
and I got the same answer for each of them. I actually got the limits for the second integral wrong when I did it the first time.

Thanks to everyone who replied! :)
 

WBahn

Joined Mar 31, 2012
30,051
Thank you! I think this answers my questions.

I also tried evaluating these integrals again:
\(\frac{1}{RT}\int_{0}^{T}(V_{1}+V_{2}+V_{3})^{2}\,dt\)

\(\frac{1}{RT_{1}}\int_{0}^{T_{1}} V_{1}^{2}\,dt+\frac{1}{RT_{2}}\int_{0}^{T_{2}} V_{2}^{2}\,dt+\frac{1}{RT_{3}}\int_{0}^{T_{3}} V_{3}^{2}\,dt\)
and I got the same answer for each of them. I actually got the limits for the second integral wrong when I did it the first time.

Thanks to everyone who replied! :)
Be sure to understand that, for just any old V1, V2, and V3, the equality doesn't hold and you have to deal with the cross-products. Only when each of them is a sinusoid at a different frequency (or a member of some other set of orthogonal functions) do the cross-products vanish.
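A sketch of that caveat (arbitrary values again): two tones at the SAME frequency are not orthogonal, so the cross term survives and the powers no longer simply add:

```python
import numpy as np

# Two sinusoids at the same frequency, merely phase-shifted.
f = 3.0
t = np.linspace(0, 1.0, 1_000_000, endpoint=False)

x = 2.0 * np.sin(2 * np.pi * f * t)
y = 1.5 * np.sin(2 * np.pi * f * t + 0.5)   # same frequency, 0.5 rad offset

p_sum = np.mean((x + y)**2)
p_parts = np.mean(x**2) + np.mean(y**2)

# Difference = 2*mean(x*y) = 2.0*1.5*cos(0.5) ~ 2.63 W -- clearly nonzero
print(p_sum - p_parts)
```

Same-frequency components must be added as phasors (voltages) first and only then squared, which is the ordinary circuit-analysis result the thread started from.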
 