AC Bias for sensitive signal rectifier?

Thread Starter

Ragwire

Joined Dec 9, 2013
36
Anyone care to at least speculate?

Using a 1N34A germanium diode in series with a 0.05 uF capacitor.

(A)----] [------(B)------|<|-------(Ground)

Various random radio frequencies from 1 MHz to 3 MHz, at several millivolts peak-to-peak, are applied across (A) and (Ground) from a 50 ohm signal generator feeding (A) through a 1 k ohm resistor.

DC voltage is measured between (B) and ground (just across the diode).

The DC voltage, as expected, varies with the RF input, but since no pre-bias is applied to the diode it is in square-law rectification mode, reading about +0.1 mV DC at (B) with an input of 5 mV P-P.
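For reference, the no-bias case has a simple closed form: with the coupling cap blocking DC and a high-impedance meter, the diode's average current must settle to zero, which gives |Vdc| = n*Vt * ln(mean(exp(v/n*Vt))), roughly A^2/(4*n*Vt) for small amplitudes. A quick Python sketch of that square-law region (the 26 mV n*Vt is an assumed round number; a real 1N34A will differ):

import numpy as np

c = 0.026                                            # assumed n*Vt for the diode, volts
t = np.linspace(0, 1e-3, 200_000, endpoint=False)    # 1 ms = 2000 whole cycles at 2 MHz

def self_bias_dc(v):
    # With the series cap blocking DC and a high-impedance meter, the diode's
    # average current must be zero, giving |Vdc| = c * ln(mean(exp(v/c))).
    return c * np.log(np.mean(np.exp(v / c)))

for mv_pp in (5, 10, 20, 40):
    a = mv_pp * 1e-3 / 2                     # peak amplitude from peak-to-peak
    v = a * np.sin(2 * np.pi * 2e6 * t)      # 2 MHz test tone
    print(f"{mv_pp:3d} mV p-p -> {self_bias_dc(v) * 1e6:7.1f} uV DC "
          f"(small-signal square law: {a**2 / (4 * c) * 1e6:7.1f} uV)")

At 5 mV P-P this lands around 60 uV, the same ballpark as the ~0.1 mV reading above.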

When a 30 kHz signal is also applied to the circuit at (A), at a level that makes the DC out at (B) read a steady +100 mV, I can add back the same RF feed (still through the 1 k resistor) and make the output go to +101 mV at an RF level of 5 mV P-P...and higher as I crank up the RF.

In short, I have tried various DC diode pre-biasing techniques in the past and never got more than a minimal improvement in germanium diode sensitivity. But with a 30 kHz AC signal feeding the diode, turning it on and off, I can overlay a very small RF signal and see a substantial and predictable output change, proportional to the RF voltage...i.e., it seems to become a very sensitive small-signal indicator.
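One plausible reading of the AC-bias result: once the 30 kHz drive switches the diode hard, the detector acts much more like a peak/envelope detector than a square-law one, and the peak of two summed tones is just the sum of their amplitudes, so the held DC shifts linearly with the RF level. A toy check (all amplitudes made up; the 145 mV bias is only a stand-in sized to suggest a ~100 mV reading):

import numpy as np

fs = 200e6                                     # sample rate for the toy waveform, Hz
t = np.arange(0, 10e-3, 1 / fs)                # 10 ms window
bias = 0.145 * np.sin(2 * np.pi * 30e3 * t)    # assumed 30 kHz bias amplitude, volts

for rf_mv_pp in (0, 5, 10, 20):
    a = rf_mv_pp * 1e-3 / 2
    rf = a * np.sin(2 * np.pi * 1.23456e6 * t)   # arbitrary RF test tone
    # An ideal peak detector holds the waveform maximum; over a long window the
    # two tones' peaks align, so the held value approaches bias + RF amplitude.
    print(f"RF {rf_mv_pp:2d} mV p-p -> held peak {np.max(bias + rf) * 1e3:8.3f} mV")

In this idealization the output shifts one-for-one with the RF amplitude, which matches the proportional behavior described above far better than the unbiased square law does.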

I tried different and odd frequencies of RF to see if there was any harmonic adding of the two signal sources, but this does not seem to be the case at all.

Just experimenting. Just curious.
 

alfacliff

Joined Dec 13, 2013
2,458
I have seen some detectors linearized by using a high-value load resistor for the detector. You might use a 50 ohm load on the signal coming in and a 1 megohm resistor for the DC load on the 1N34. Not perfect, but pretty linear.
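A rough sketch of why the big DC load helps (component values assumed, not measured): with an ideal coupling cap the operating point solves <i_diode> = V/R_load, and a larger R_load lets the diode settle closer to true peak detection, which is the linear regime:

import numpy as np
from scipy.optimize import brentq

i_s, c = 1e-6, 0.026   # assumed 1N34A saturation current (A) and n*Vt (V)

def detector_dc(a, r_load):
    # DC output of a capacitively coupled diode detector with DC load r_load,
    # driven by a sine of peak amplitude a; diode oriented for positive output.
    # Time-average of exp(sine/c) over a cycle is the Bessel function I0(a/c).
    i0 = np.i0(a / c)
    f = lambda v: i_s * (np.exp(-v / c) * i0 - 1) - v / r_load
    return brentq(f, 0, c * np.log(i0) + 1e-9)   # root of the current balance

for mv_pp in (10, 20, 40, 80, 160):
    a = mv_pp * 1e-3 / 2
    print(f"{mv_pp:4d} mV p-p -> 10k load: {detector_dc(a, 10e3) * 1e3:7.3f} mV, "
          f"1M load: {detector_dc(a, 1e6) * 1e3:7.3f} mV")

With the 1 megohm load the output climbs toward the waveform peak as drive increases; with a low DC load it stays compressed well below it.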
 