I am trying to put together an ultra-long-distance link budget for wideband pulse communication, and I keep getting frustrated by the following noise equation:
[noise power] = [Boltzmann constant] x [receiver system temperature] x [receiver bandwidth], or Pn = kB x Tsys x B.
According to this equation, the noise power in the receiver is directly proportional to the bandwidth, i.e. the range of frequencies the receiver is listening to. If you want more bandwidth, you have to increase signal strength by the same factor, or your signal-to-noise ratio plummets, possibly leaving your signal undetectable, lost in a sea of noise.
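To put numbers on the equation (a quick sketch; the 290 K system temperature is my own assumption, not from any particular receiver):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def noise_power_dbm(t_sys_k, bandwidth_hz):
    """Thermal noise power Pn = k_B * Tsys * B, converted to dBm."""
    p_watts = k_B * t_sys_k * bandwidth_hz
    return 10 * math.log10(p_watts / 1e-3)

# Assumed Tsys = 290 K (room temperature)
print(noise_power_dbm(290, 1.0))    # about -174 dBm in a 1 Hz bandwidth
print(noise_power_dbm(290, 400e3))  # about -118 dBm in 400 kHz
```

So every factor-of-10 increase in bandwidth raises the noise floor by 10 dB, which is the proportionality that bothers me.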
The relationship between receiver bandwidth and noise does make some intuitive sense to me. A narrowband receiver can filter out most frequencies entirely, so it never has to listen to them at all. If you think of each frequency as a 'channel', the noise level is multiplied by the number of channels you have to listen to, while the signal only occupies its own channels, so the noise multiplies but the signal doesn't. I have no idea whether this picture is correct; it is just how I am thinking about the problem.
So if the problem is that the receiver has to listen to a much broader window of frequencies, is there some way to solve or mitigate this problem at least for certain applications? Or is the direct bandwidth-noise relationship a fundamental law of nature which just has to be lived with?
My potential application would involve sending a series of raised-cosine-shaped 5 μs pulses on a 9.3 GHz carrier. I believe the short pulse length amounts to unintentional amplitude modulation of the carrier, which spreads its spectrum. A 5 μs pulse corresponds to a frequency of 200 kHz, so I guess this is equivalent to mixing a 200 kHz sinusoid with the 9.3 GHz carrier, producing two sidebands at 9,300,200,000 Hz and 9,299,800,000 Hz, for a total bandwidth of 400 kHz.
This results in a factor of 200,000 more noise power in the receiver compared to an ultra-narrowband carrier modulated at 1 Hz, which only occupies 9,299,999,999 Hz to 9,300,000,001 Hz (1 Hz sidebands, 2 Hz total). Since received power in free space falls off as 1/d², range at constant SNR scales as the square root of that noise reduction: if the 400 kHz signal were strong enough to be received at 1 km, the 2 Hz signal with the same EIRP could reach roughly √200,000 ≈ 450 km. I'd call that difference non-trivial.
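Working through those numbers under a free-space path-loss assumption (received power falling as 1/d², so range for equal SNR goes as the square root of the noise-power ratio):

```python
import math

# Free-space path loss: received power falls as 1/d^2, so for the same
# SNR the achievable range scales as sqrt(noise-bandwidth ratio).
b_wide = 400e3    # ~400 kHz double-sideband width of the 5 us pulse
b_narrow = 2.0    # +/-1 Hz sidebands in the ultra-narrowband case
noise_ratio = b_wide / b_narrow       # 200,000x more noise power
range_ratio = math.sqrt(noise_ratio)  # ~447x more range
print(noise_ratio, range_ratio)
```

The range advantage is large either way, but the square root matters: a 200,000x noise-power penalty costs "only" a factor of ~447 in distance, not 200,000.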
According to the above equation this seems unavoidable, just another of nature's TANSTAAFL scenarios. If you want a higher bitrate, you need more bandwidth; if you want more bandwidth, you get proportionally more noise in your receiver to go with it. Or in my case: if you want to transmit short pulses, you get a proportional amount of spectral spreading and receiver noise to go with them. The shorter the pulse, the more noise in the receiver, and the higher the EIRP has to be to compensate.
But I have been thinking about spread-spectrum frequency-hopping (SSFH) systems. With SSFH, the transmitter and receiver constantly hop together between different (presumably narrowband) channels. During the dwell time on each channel, the scenario doesn't seem so different from traditional narrowband communication: at that instant the receiver should only need to listen to a very narrow band of frequencies to receive the full signal.
What if you had an SSFH-style system where, instead of the transmitter and receiver pair hopping between discrete channels, they smoothly swept their oscillator frequency according to some agreed function, say a raised-cosine function? The transmitter and receiver are designed to stay in sync and change frequency together according to that function. Assuming an ideal system with no frequency drift, at any one instant the receiver only has to pay attention to one and only one frequency.
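As a sanity check on the idea, here's a minimal NumPy sketch (a linear sweep standing in for the raised-cosine function, and the sample rate and sweep rate invented for illustration): if the receiver multiplies the incoming signal by the complex conjugate of the same ideal sweep, the wideband swept signal collapses to a single frequency, essentially the "dechirp" or stretch processing used in FMCW radar.

```python
import numpy as np

fs = 1e6                        # sample rate, Hz (assumed)
t = np.arange(0, 1e-3, 1 / fs)  # 1 ms observation window

# Transmitter sweeps its frequency along a function known to both ends.
# A linear chirp stands in here for the raised-cosine sweep in the text.
k = 2e8                         # sweep rate, Hz/s (assumed)
sweep_phase = 2 * np.pi * 0.5 * k * t**2
tx = np.exp(1j * sweep_phase)   # wideband swept carrier

# Receiver mixes with the conjugate of the same ideal sweep ("dechirp").
baseband = tx * np.conj(np.exp(1j * sweep_phase))

# With perfect sync, all the energy lands in a single FFT bin (DC):
spec = np.abs(np.fft.fft(baseband))
print(int(np.argmax(spec)))     # -> 0: one bin, i.e. narrowband again
```

The catch is the "assuming no frequency drift" part: any mismatch between the two sweep functions smears the energy back across multiple bins.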
Since the wideband SNR problem is a direct result of the receiver being unable to filter out as many frequencies, it would seem this kind of system could significantly reduce the bandwidth, and thus significantly increase the range possible for a given transmitter output power and receiver sensitivity. Of course, no actual information is being transmitted in this scenario: amplitude, phase, or frequency modulation of the carrier would all superimpose unpredictable dynamic changes on this hypothetical function-following, constantly changing carrier frequency.
Whether it could still be made to work for those modulation schemes (maybe by somehow measuring the deviations of the carrier frequency from the ideal function) I'm not sure, but even if you could make it work, you'd be back to sidebands in the frequency domain and a bandwidth proportional to the symbol rate you want to send at. So for CW applications this really doesn't seem to get you anywhere (except for a level of interception/jamming security comparable to SSFH). But for pulsed applications, the connection between pulse length and bandwidth would effectively be severed. I'm also thinking that pulse-position modulation might allow an effectively zero-bandwidth communication channel (resulting in a nearly infinite range from an infinitesimally small EIRP?) and shouldn't interfere with the sync function itself the way something like pulse-duration modulation would. So is the idea plausible or what?