Why 50 ohms?

Discussion in 'Wireless & RF Design' started by Yakima, Sep 2, 2013.

  1. Yakima

    Thread Starter Member

    Jan 23, 2012
    35
    2
    I am wondering why 50 ohms is used to match various modules in a radio. Why not 75 or 2000 ohms? I realize the antenna is 50 or 75 ohms by some type of natural phenomenon. Why is that? Even so, why do we worry about 50 ohms between IF stages or the IF stage to the demodulator stage and the demodulator stage to the audio stage?
     
  2.

    Expert

    Nov 30, 2010
    16,278
    6,791
    That is the "A" answer.

    I don't know the details because I don't do RF; it's still magic to me. However, there are people here who know why, and I hope one of them will stop by and lay it all out in simple terms. :)
     
  3. KJ6EAD

    Senior Member

    Apr 30, 2011
    1,425
    363
  4. Papabravo

    Expert

    Feb 24, 2006
    10,140
    1,789
    The short answer, like most things in engineering, is that it is a compromise.
    Posted for those too busy or lazy to read the article.
     
    Last edited: Sep 3, 2013
  5. t_n_k

    AAC Fanatic!

    Mar 6, 2009
    5,448
    782
    One might also ask why 600Ω is commonly adopted in telephony etc.

    Nominal Impedance is also covered in Wikipedia.

    The same Wikipedia page points out that the OP's assertion ("I realize the antenna is 50 or 75 ohms by some type of natural phenomenon") is based on a "myth". Perhaps "misconception" would be a gentler way to put it.
     
  6. Yakima

    Thread Starter Member

    Jan 23, 2012
    35
    2
    Thank you KJ6EAD, I looked up one of the links and got the answer. Here it is, in brief:

    The 50-Ohm compromise

    The arithmetic mean between 30 ohms (best power handling) and 77 ohms (lowest loss) is 53.5 ohms; the geometric mean is 48 ohms. Thus the choice of 50 ohms is a compromise between power handling capability and signal loss per unit length, for air dielectric.
    Why 75 Ohms?

    For cheap commercial cables such as those that bring CATV to your home, 75 ohms is the standard. These cables don't have to carry high power, so the key characteristic that should be considered is low loss. The answer to the "why 75 Ohms?" question seems obvious. We just saw that 77 ohms gives the lowest loss for air dielectric coax, so 75 ohms might be just an engineering round-off.

    http://www.microwaves101.com/encyclopedia/why50ohms.cfm

    So the answer is, coax. But this still doesn't answer the question of why we should design for an impedance of 50 or 75 ohms between IF stages. Any thoughts?
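
    To check the article's arithmetic, here's a quick sketch (my own, not from the article) that scans the D/d conductor-diameter ratio of an air-dielectric line, using the standard proportionalities for skin-effect loss and breakdown-limited peak power:

        import numpy as np

        # Characteristic impedance of air-dielectric coax vs. the ratio
        # x = D/d (outer / inner conductor diameter): Z0 ~ 60 * ln(x) ohms.
        x = np.linspace(1.05, 10, 100_000)
        z0 = 60.0 * np.log(x)

        # For a fixed outer diameter D:
        #   skin-effect (conductor) loss  ~ (1 + x) / ln(x)
        #   breakdown-limited peak power  ~ ln(x) / x**2
        loss = (1 + x) / np.log(x)
        power = np.log(x) / x ** 2

        print(f"Lowest loss at Z0 ~ {z0[np.argmin(loss)]:.1f} ohms")           # ~76.7
        print(f"Best power handling at Z0 ~ {z0[np.argmax(power)]:.1f} ohms")  # ~30.0
        print(f"Means of 77 and 30: {(77 + 30) / 2:.1f} (arithmetic), {(77 * 30) ** 0.5:.1f} (geometric)")

    Running it reproduces the ~77 ohm and ~30 ohm optima and the 48-53.5 ohm middle ground the article describes.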
     
    Last edited: Sep 3, 2013
  7. t_n_k

    AAC Fanatic!

    Mar 6, 2009
    5,448
    782
    Perhaps it's the fact that RF test & measurement equipment is designed around the nominal impedance concept.
     
  8. Yakima

    Thread Starter Member

    Jan 23, 2012
    35
    2
    I agree. But this really means that 50 ohms is not necessary. I'm thinking of those variable-inductance cans with a capacitor and inductor that form a tank resonant at either 455 kHz or 10.7 MHz. I think these 'cans' may be designed so that the tank primary matches a BJT's output resistance at the collector (about 100 kΩ), with the secondary wound to match the next BJT stage's input resistance at its base (about 2000 ohms).
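
    If that's the arrangement, the turns ratio follows from the usual impedance-transformation relation. Here's a quick sketch of the arithmetic (my own illustration, assuming the 100 kΩ and 2 kΩ figures above):

        import math

        # A tuned IF transformer reflects impedance by the square of the
        # turns ratio: Zp / Zs = (Np / Ns)**2.
        Zp = 100e3   # assumed collector-side (primary) load, ohms
        Zs = 2e3     # assumed base-side (secondary) load, ohms

        n = math.sqrt(Zp / Zs)
        print(f"Primary-to-secondary turns ratio ~ {n:.2f} : 1")   # ~7.07 : 1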

    I'm an experimenter. My rule of thumb is that I can match stages in whatever way is most convenient, but when it's an input or an output for an entire system -- such as the whole IF amplifier or a mixer -- it has to be 50 ohms to match my test equipment. Having a standard also makes it easier to interconnect my homemade equipment. I was designing an IF amplifier last week, and that's what got me wondering about the reason for 50 ohms. I know the answer now: coax. Thanks. :)
     
    Last edited: Sep 3, 2013
  9. WBahn

    Moderator

    Mar 31, 2012
    17,743
    4,789
    In general, when working at RF, it is hard to just add more power to a signal, so you try to transfer as much power as you can from one device to the next. That happens when the input and output impedances are matched to the characteristic impedance of the transmission line connecting them.
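
    To put numbers on what a mismatch costs, here's a small sketch (the load values are just my examples) computing the reflection coefficient, VSWR, and mismatch loss for resistive loads on a 50 ohm line:

        import math

        # Reflection coefficient, VSWR, and mismatch loss for a resistive
        # load ZL on a line of characteristic impedance Z0.
        def mismatch(Z0, ZL):
            gamma = (ZL - Z0) / (ZL + Z0)               # reflection coefficient
            vswr = (1 + abs(gamma)) / (1 - abs(gamma))
            loss_db = -10 * math.log10(1 - gamma ** 2)  # power lost to reflection
            return gamma, vswr, loss_db

        for ZL in (50, 75, 100, 2000):
            g, s, l = mismatch(50, ZL)
            print(f"ZL = {ZL:4d} ohms: gamma = {g:+.3f}, VSWR = {s:5.2f}, loss = {l:5.2f} dB")

    A 75 ohm load on a 50 ohm line costs under 0.2 dB, but something like 2000 ohms reflects roughly 90% of the incident power, which is why gross mismatches matter.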
     
  10. Yakima

    Thread Starter Member

    Jan 23, 2012
    35
    2
    Yes, WBahn, we all know that. No offense. :)
     
  11. t_n_k

    AAC Fanatic!

    Mar 6, 2009
    5,448
    782
    I don't think you'll get any disagreement that there's no requirement to perform 50 Ω impedance matching within an amplifier stage itself. As you state, the requirement usually arises when there are interconnections between stages, where the connections themselves are either coaxial lines (by intention) or are of a physical scale at which they begin to behave like transmission lines at the operating frequency. I presume operation at microwave frequencies and beyond requires such consideration in the designs. Often active-device S-parameters are used for the design of such circuits, and they tend (not unexpectedly) to be measured and quoted by the manufacturer at a nominal 50 Ω.
     
  12. WBahn

    Moderator

    Mar 31, 2012
    17,743
    4,789
    I must be misreading the intent of some of what you said, because it sounded like you were still wondering why it was important to match the impedances. Sorry if I misunderstood.
     
  13. Shagas

    Active Member

    May 13, 2013
    802
    74
    Well, I didn't. :)
     
  14. vk6zgo

    Active Member

    Jul 21, 2012
    677
    85
    Actually, a halfwave dipole driven at its midpoint at its resonant frequency in free space does look like around 77 Ω, so yes, it is due to a natural phenomenon. Dear old Wiki does like to pontificate a bit!

    Before coaxial cables there were such things as 75 Ω twin feeders, which were intended to drive halfwave dipoles.

    No doubt the reasons for the choice of 50 Ω and 75 Ω for coax were ultimately determined as described, but the range of choices needed to be close to the main things driven by coax at the time: halfwave dipoles!

    Interestingly, in a separate Wiki discussion, they quote the feedpoint impedance of a halfwave dipole at resonance as 63 Ω, without any qualification as to height above ground.

    From the classic graph of feed impedance vs. height above ground, reproduced just about everywhere, 63 Ω looks like about 0.6 λ above ground.

    Some German equipment uses 63 Ω coax, so maybe that is where they took the reference from.
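
    For anyone who wants to reproduce that classic graph, here's a sketch (my own, assuming a thin resonant halfwave dipole over perfectly conducting ground, using the textbook induced-EMF mutual-resistance formula):

        import numpy as np
        from scipy.special import sici

        def mutual_r(d, l=0.5):
            """Mutual resistance (ohms) of two parallel side-by-side thin
            dipoles of length l, spaced d apart (both in wavelengths)."""
            k = 2 * np.pi
            ci = lambda u: sici(u)[1]   # cosine integral Ci(u)
            u0 = k * d
            u1 = k * (np.sqrt(d ** 2 + l ** 2) + l)
            u2 = k * (np.sqrt(d ** 2 + l ** 2) - l)
            return 29.98 * (2 * ci(u0) - ci(u1) - ci(u2))  # 29.98 = eta0 / (4*pi)

        R_FREE_SPACE = 73.1  # textbook thin halfwave dipole radiation resistance

        # A horizontal dipole at height h over perfect ground sees its own
        # current-reversed image at distance 2h, so R_in = R_self - R_mutual(2h).
        for h in (0.1, 0.25, 0.5, 0.6, 1.0, 2.0):
            print(f"h = {h:4.2f} wavelengths: R_in ~ {R_FREE_SPACE - mutual_r(2 * h):5.1f} ohms")

    On my run the input resistance oscillates about the 73 Ω free-space value (roughly 22 Ω at 0.1 λ, mid 80s at 0.25 λ), dips to around 58-61 Ω near 0.55-0.6 λ, and converges toward 73 Ω as the antenna gets higher, which puts the 0.6 λ reading at least in the neighborhood of that 63 Ω figure.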
     
  15. Yakima

    Thread Starter Member

    Jan 23, 2012
    35
    2
    No need to be sorry, you did nothing wrong. ;)
     
  16. KL7AJ

    AAC Fanatic!

    Nov 4, 2008
    2,040
    287
    It should also be noted that folks were "doing radio" long before coax cable was around. 600-ohm open-wire feedline is still very effective (and experiencing a resurgence in popularity among radio amateurs). Although a real-world dipole has a center feedpoint impedance of about 75 ohms (in free space), a center-fed dipole is not the only way of doing things. The end-fed "Zepp" antenna and the extended double Zepp (EDZ) have feedpoint impedances FAR removed from any practical coax impedance.

    Eric
     