Auto Attenuator design

Discussion in 'General Electronics Chat' started by second_ed, Dec 5, 2011.

  1. second_ed

    Thread Starter New Member

    Dec 5, 2011
    2
    0
    I need to design an auto attenuator. I would appreciate any tips. Here are the details/requirements:

    A previously designed system can take a maximum voltage of 10 V RMS. The signal going into the system can go beyond 10 V at certain points, so this attenuator has to attenuate the signal, independent of frequency and without distortion, whenever the signal is over 10 V.
     
  2. tommydyhr

    Active Member

    Feb 3, 2009
    39
    4
    This very well explained circuit should most likely suit your needs: http://sound.westhost.com/project53.htm . It's designed to be inserted in line with your audio signal. It delivers very low distortion levels thanks to its use of an LDR as its gain-control element.

    Edit: I may have misunderstood the topic. This circuit serves to limit the output of an audio amplifier by compressing the signal.
     
    Last edited: Dec 5, 2011
    second_ed likes this.
  3. crutschow

    Expert

    Mar 14, 2008
    13,009
    3,233
    You need an AGC (Automatic Gain Control) circuit. A low distortion AGC can be built using an analog multiplier. Google "AGC Circuit" and you will get many hits.
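    A rough numerical sketch of that feedback-AGC idea, with made-up levels and time constants (the target level, loop time constant, and test signal are all assumptions, not a finished design): the output is rectified and smoothed into an envelope, the level error is integrated, and the result sets the gain of a multiplier stage.

        import numpy as np

        fs = 48_000                                   # sample rate (Hz), assumed
        t = np.arange(0, 0.5, 1 / fs)
        vin = (5 + 10 * (t > 0.25)) * np.sin(2 * np.pi * 1_000 * t)   # 5 V peak, then 15 V peak

        target = 7.0                                  # desired output level (V), assumed
        tau = 0.01                                    # loop time constant (s), assumed
        alpha = 1 / (tau * fs)

        gain, env = 1.0, 0.0
        vout = np.empty_like(vin)
        for i, x in enumerate(vin):
            y = gain * x                              # the "analog multiplier": output = gain * input
            vout[i] = y
            env += alpha * (abs(y) - env)             # rectify-and-smooth envelope detector
            gain += alpha * (target - env) / target   # integrate the level error to steer the gain
        print(f"final gain ~ {gain:.2f}, output envelope ~ {env:.2f} V")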
     
    second_ed likes this.
  4. second_ed

    Thread Starter New Member

    Dec 5, 2011
    2
    0
    crutschow: I tried using AGC, but I do not want to amplify the low voltages; I just want to attenuate whenever the signal into the system is too high.
    tommydyhr: I looked at the circuit you suggested; it sounds promising, but I need to do a little more research on parts and such.

    thanks
     
  5. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,678
    2,737
    Ummm...the act of dynamically attenuating the signal is non-linear through time. Therefore, distortion *will* be introduced! The faster the rate-of-change of attenuation, the more distortion. Not sure if this is important, just thought I'd point it out...
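    For what it's worth, here is a quick numerical illustration of that point (the tone frequency, gain step, and ramp times are arbitrary assumptions): the same 2:1 gain reduction applied to a 1 kHz tone spreads more of the output energy away from the tone the faster the gain changes.

        import numpy as np

        fs = 48_000
        t = np.arange(0, 1.0, 1 / fs)
        tone = np.sin(2 * np.pi * 1_000 * t)           # clean 1 kHz test tone

        def off_tone_fraction(ramp_time):
            # gain drops 1.0 -> 0.5 at t = 0.4 s and recovers at t = 0.6 s,
            # each transition ramping linearly over ramp_time seconds
            gain = 1.0 - 0.5 * (np.clip((t - 0.4) / ramp_time, 0, 1)
                                - np.clip((t - 0.6) / ramp_time, 0, 1))
            power = np.abs(np.fft.rfft(gain * tone)) ** 2
            f = np.fft.rfftfreq(len(t), 1 / fs)
            near_tone = np.abs(f - 1_000) < 5          # bins within 5 Hz of the tone
            return power[~near_tone].sum() / power.sum()

        for ramp in (0.1, 0.01, 0.001):
            print(f"{ramp * 1e3:6.1f} ms gain transition -> "
                  f"{100 * off_tone_fraction(ramp):.3f} % of output power off-tone")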
     
  6. crutschow

    Expert

    Mar 14, 2008
    13,009
    3,233
    Then you put a threshold on the AGC control voltage so that it provides no control (constant gain of 1) until the input voltage nears 10 V. That could be done with a comparator on the input to detect when the signal reaches 10 V; when it does, you turn on the AGC control. The trick is to get a smooth transition from below 10 V to above 10 V input.
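    A minimal digital sketch of that control law, just to show the shape of the behaviour (the attack/release time constants and the test signal are assumptions, and an analog build would do this with a rectifier/comparator and a variable-gain element rather than sample-by-sample math): the gain stays at unity until the detected envelope nears the 10 V limit, then drops just enough to hold it, and the gain is slewed rather than switched so the transition is smooth.

        import numpy as np

        fs = 48_000
        limit = 10.0 * np.sqrt(2)   # the 10 V RMS ceiling expressed as a peak value (sine assumed)
        attack_tau = 0.001          # seconds; how quickly the gain can be pulled down (assumed)
        release_tau = 0.050         # seconds; how quickly it recovers toward unity (assumed)

        def auto_attenuate(vin):
            """Unity gain until the envelope nears the limit, then just enough attenuation to hold it."""
            env, gain = 0.0, 1.0
            vout = np.empty_like(vin)
            for i, x in enumerate(vin):
                # peak-style envelope detector: fast attack, slow release
                tau = attack_tau if abs(x) > env else release_tau
                env += (abs(x) - env) / (tau * fs)
                # target gain: unity below the limit, limit/envelope above it (never amplifies)
                g_target = 1.0 if env <= limit else limit / env
                # slew the working gain toward the target so the transition stays smooth
                gain += (g_target - gain) / (release_tau * fs)
                vout[i] = gain * x
            return vout

        # quick check with the scenario from the first post: the signal exceeds 10 V RMS partway through
        t = np.arange(0, 0.5, 1 / fs)
        vin = np.where(t < 0.25, 8.0, 20.0) * np.sqrt(2) * np.sin(2 * np.pi * 1_000 * t)
        vout = auto_attenuate(vin)
        rms_end = np.sqrt(np.mean(vout[-fs // 10:] ** 2))
        print(f"output RMS over the last 100 ms ~ {rms_end:.1f} V")

    The single release_tau used for the gain slew is the crude part: a shorter value holds the limit more tightly but, per the previous reply, changes the gain faster and so adds more distortion.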
     