RF oscillator vs amplifier transistor selection

Thread Starter

Halfpint786

Joined Feb 19, 2018
109
A few recent posts have inspired me to ask this question: why are the bipolar transistors used in RF oscillators usually the same ones that would make garbage RF amplifiers? I keep seeing RF oscillator schematics (FM bugs, crystal oscillators, and the like) that use jellybean transistors where I would expect an expensive RF transistor.

If I had to guess, I would say it is because oscillators run at much lower current than amplifiers, and RF transistors are designed for their lowest noise at higher currents, while typical jellybeans like the 2N2222 or 2N3904 are quieter at low currents. Am I close, or is there more to the story?
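For what it's worth, the current side of that guess can be sketched with the standard collector shot-noise relation, i_n = sqrt(2·q·I_C·Δf). A minimal back-of-envelope script (the currents and bandwidth are made-up illustration values, not datasheet figures):

```python
# Sketch: collector shot noise grows with bias current, which is part of
# why a jellybean biased at low current can be relatively quiet.
# Currents and bandwidth below are illustrative assumptions only.
from math import sqrt

q = 1.602e-19       # electron charge, C
bandwidth = 10e3    # assumed measurement bandwidth, Hz

for i_c in (0.1e-3, 1e-3, 10e-3):            # example collector currents, A
    i_noise = sqrt(2 * q * i_c * bandwidth)  # RMS shot-noise current, A
    print(f"Ic = {i_c*1e3:4.1f} mA -> shot noise = {i_noise*1e12:6.1f} pA rms")
```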

Thanks!!!!!
 

MisterBill2

Joined Jan 23, 2018
18,600
You have the answer right there: "an expensive RF transistor". An oscillator is not dealing with a microvolt signal that can easily be buried under transistor noise; most oscillators produce a signal of at least one volt, which is about 120 dB above that microvolt RF signal an amplifier has to recover. That is one reason. The other is that experimenters often use cheap transistors.
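The "about 120 dB" figure is just the voltage ratio expressed in dB. A quick sanity check (the 1 V and 1 µV levels are the round numbers from the post, not measurements):

```python
# Voltage ratio between a ~1 V oscillator output and a ~1 uV received
# RF signal, expressed in dB.
from math import log10

v_osc = 1.0    # oscillator amplitude, V (round number)
v_rf = 1e-6    # weak received signal, V (round number)

print(f"{20 * log10(v_osc / v_rf):.0f} dB")  # -> 120 dB
```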
 