Hey guys, so I was designing an envelope detector in the lab. I also need to do a PSpice simulation of it. The situation is as follows.

Carrier signal: 1 MHz

Message signal: 1 kHz

This is the circuit I must use.

The input function generator also has a 50 Ω internal resistance, so I included that in my simulation.

Here is the circuit I simulated.

R4 is the function generator internal resistance.

So I have a small dilemma here. Let me explain.

The carrier wave, and hence the modulated signal, has a frequency of 1 MHz. That means its period is 1 µs.

The RC time constant is τ = RC. The resistance R is R4 + R6 in the Thevenin equivalent, or 60 Ω. The capacitor is 3 µF, so τ = 0.18 ms. Multiply that by 5 and you get 0.9 ms.

So it takes the capacitor about 0.9 ms (5τ) to charge fully. This is much longer than the period of the incoming signal, so the capacitor never gets fully charged.
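If it helps, here's a quick Python sanity check of those numbers (the only assumption beyond the post is that R6 = 10 Ω, so that R4 + R6 = 60 Ω as stated):

```python
# Charging time constant with the original component values.
# Assumption: R6 = 10 ohms, so the Thevenin resistance R4 + R6 = 60 ohms.
R = 50 + 10      # total series resistance in ohms (R4 + R6)
C = 3e-6         # capacitance in farads (3 uF)
tau = R * C      # time constant, about 1.8e-4 s = 0.18 ms
print(tau)       # ~0.18 ms
print(5 * tau)   # ~0.9 ms to reach essentially full charge, vs. a 1 us carrier period
```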

So okay, let's say we want to fix this problem. We can't change the 50 Ω internal function generator resistance, but we can alter R6 and the capacitor.

Let's say we make R6 100 Ω, so the total resistance R is 150 Ω, and make the capacitor 1.3 nF.

Now the time constant is about 0.2 µs. Multiply that by 5 and we get roughly 1 µs. So now the capacitor fully charges within one period of the incoming modulated signal. That's one problem solved.
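Same sanity check with the proposed new values (nothing assumed here beyond the values just stated):

```python
# Charging time constant with the proposed values: R6 = 100 ohms, C = 1.3 nF.
R = 50 + 100     # total series resistance in ohms (R4 + R6)
C = 1.3e-9       # capacitance in farads (1.3 nF)
tau = R * C      # 1.95e-7 s, i.e. about 0.2 us
print(tau)       # ~0.2 us
print(5 * tau)   # ~1 us: roughly one carrier period
```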

But there is another problem. This RC network is also a low-pass filter, whose job is to recover the message signal at 1 kHz.

The cutoff frequency of this low-pass filter is fc = 1/(2πRC) = 1/(2π × 150 Ω × 1.3 nF) ≈ 816 kHz. Some LOW-pass filter, right?
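And here's that cutoff frequency worked out in Python, using the same R and C as above:

```python
import math

# Cutoff frequency of the same RC network viewed as a first-order low-pass filter.
R = 150.0        # total resistance in ohms (R4 + R6 with the new values)
C = 1.3e-9       # capacitance in farads (1.3 nF)
fc = 1 / (2 * math.pi * R * C)
print(fc)        # about 816 kHz -- nowhere near the 1 kHz message
```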

So that is the dilemma. I have to either alter the time constant or alter the cutoff frequency, but if I change one, the other changes too. Believe me, I have tried many different combinations of resistance and capacitance and nothing works.

Here is what PSpice gives me. As you can see, the envelope detector isn't detecting much. The red trace is the voltage at the output.

Can you guys please help? What is more important here, the time constant or the cutoff frequency? Or did I get something else wrong?