I have a problem I have been trying to figure out for several days now.
If the peak signal energy for a 10 MHz signal with a rise/fall time of 1 ns begins to drop at
40 dB/decade above 320 MHz, how much will the peak energy around 800–900 MHz be reduced
when the rise/fall time is increased to 10 ns?
This is one of those questions that really wasn't covered in the class, but is related to the lesson. I just can't see the connection. I'm not looking for the answer; I'm looking for some direction on how to work the problem. Any help would be greatly appreciated.