I'm sorry sir, I think I've not quoted the problem correctly. The output voltage of some circuits decreases as the input signal's frequency increases, due to capacitances (parasitic or not) in the circuit. The attenuation of the output signal is not linear and depends upon the circuit.
Can anyone explain the relation between frequency and attenuation?
Does that relation have a linear behavior, or is it a case of individual application?
Is this in free space or in a transmission line, or what? It makes a big difference!
Eric
OK, I would like to know in both cases, but for now please explain it to me in free space.
Hi Muni:
In free space (i.e. a perfect vacuum) the attenuation is independent of frequency: it follows the inverse square law (interestingly, just like gravity!). In the real world, however, losses due to absorption, dielectric heating, etc., generally increase with frequency. (On top of that, you can have very severe attenuation near the molecular resonance frequencies of water, for example.)
In a transmission line the loss is exponential with distance: it is a fixed number of dB per unit length of line, which is why it is quoted on a logarithmic (dB) scale. That dB-per-length figure also increases (sometimes drastically) with frequency, though not by a simple relationship. However, loss in a transmission line ALWAYS goes up with frequency.
Eric
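The two regimes described above can be sketched numerically. A minimal example, assuming an isotropic point source for the free-space case; the coax attenuation figures are illustrative assumptions, not values from any datasheet:

```python
import math

def free_space_power_density(p_tx_watts, distance_m):
    """Power density (W/m^2) of an isotropic point source in a vacuum.

    Follows the inverse square law: the same radiated power is spread
    over a sphere of area 4*pi*r^2. Note that frequency does not appear.
    """
    return p_tx_watts / (4 * math.pi * distance_m ** 2)

def coax_loss_db(length_m, atten_db_per_100m):
    """Total transmission-line loss in dB for a run of coax.

    Loss is a fixed number of dB per unit length, so total dB loss
    grows linearly with length (i.e. the power ratio falls off
    exponentially with length).
    """
    return length_m * atten_db_per_100m / 100.0

# Doubling the distance quarters the power density (inverse square law):
d1 = free_space_power_density(10.0, 100.0)
d2 = free_space_power_density(10.0, 200.0)
print(d2 / d1)  # ≈ 0.25, regardless of frequency

# Assumed attenuation figures for a 50 m coax run; real values come
# from the manufacturer's datasheet and rise with frequency, though
# not by any simple law:
print(coax_loss_db(50.0, atten_db_per_100m=16.0))  # lower frequency
print(coax_loss_db(50.0, atten_db_per_100m=70.0))  # higher frequency
```

Note that frequency never enters the free-space function at all, while for the line it enters only through the attenuation figure you look up for your cable.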
Good question with a long answer!

Thank you sir. I have a doubt: VHF/UHF transmission needs only 10 watts of power to cover a range of 200 km, but I think it needs more power in the case of FM/SW/MW frequencies. Why is it so?
Eric, you know more about this than I do, but this document seems to have a different answer.
http://didier.quartier-rural.org/implic/ran/sat_wifi/sigprop.pdf
I don't know if it's correct or not, or perhaps I am interpreting it incorrectly.
EDIT: Wikipedia confirms what you are saying, and which I remember from my long-lost education.
THANK YOU SIR

Indeed, the inverse square law applies to all electromagnetic radiation. Where the rub comes in is how you INTERCEPT that radiation... and this is where the "magic formula" comes in. See if you can follow this; it's a bit obscure until you get to the "aha" part. ^_^
Let's look at a fixed amount of energy radiating from a POINT SOURCE. As the signal spreads out, that fixed energy is distributed across an ever-increasing spherical surface area, so an antenna of a given length intercepts a smaller percentage of that surface area the farther from the source you get.

Now, here's where it gets fun. A typical antenna scales in size in proportion to its wavelength (assuming a dipole, for example), so an antenna that's "cut to length" intercepts a smaller area the shorter it is; in other words, the higher the frequency, the shorter the antenna. If we kept the RECEIVING ANTENNA SIZE in continuous proportion to the DISTANCE FROM THE POINT SOURCE, we would see the signal intercepted by that antenna follow the inverse square law precisely. The fact of the matter is that we DON'T build antennas proportional to the distance... we build them proportional to the wavelength! (aha!)
Hope this helps some.
Eric