Increase Gain on Audio Amplifier

Discussion in 'The Projects Forum' started by majbb, May 7, 2011.

  1. majbb

    Thread Starter New Member

    Mar 9, 2011
    5
    0
    I have attached my current circuit below. I'm trying to increase the gain of the amplifier without clipping the signal. Currently my simulations show a maximum gain of about 17 without clipping, but I would like to get that up to 20 or a bit more if possible.

    This is meant to be USB powered, so it runs off a 5V supply and drives a 4 ohm speaker. My input signal is a 100mV pk-pk sine wave.

    Any help would be appreciated.
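[A quick feasibility sketch of the numbers in this post. The supply, input level, and target gain come from the post; the ~0.7 V of headroom lost at each rail is a generic assumption, not a figure from the schematic.]

```python
# Feasibility check: can a gain of 20 fit on a 5 V single supply?
V_SUPPLY = 5.0       # V, USB rail (from the post)
V_IN_PP = 0.100      # V peak-to-peak input sine (from the post)
TARGET_GAIN = 20     # desired voltage gain (from the post)

# Required output swing for the target gain
v_out_pp = TARGET_GAIN * V_IN_PP
print(f"Required output swing: {v_out_pp:.2f} V pk-pk")

# Assume ~0.7 V of headroom is lost near each rail (hypothetical,
# typical of a single saturating/Vbe drop per side)
headroom_pp = V_SUPPLY - 2 * 0.7
print(f"Available swing under that assumption: {headroom_pp:.2f} V pk-pk")
```

On paper 2 V pk-pk fits inside the rail, so the limit the simulations show comes from the circuit's internal losses rather than the supply alone.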
     
  2. marshallf3

    Well-Known Member

    Jul 26, 2010
    2,358
    201
    The first thing I'd try is slightly changing the values of RP1 and RP2 in the feedback loop, but that could increase the distortion by an unknown amount.
     
  3. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    The input transistor has its base biased at 0V (which is only appropriate when the power supply is dual, positive and negative), so the output of the amplifier will also sit at 0VDC and it will rectify the signal. But you want a linear amplifier.
    The input should be biased at half the supply voltage so the output can swing symmetrically as much as possible.
    But the circuit has so many voltage losses that its max output will be only 0.9V p-p, which is 318mV RMS. The max output at clipping is then only about 25mW into 4 ohms.
    Like an earphone lying on a chair.
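[The arithmetic in the post above can be checked in a couple of lines. The 0.9 V p-p swing and 4 ohm load are from the post; the formulas are the standard sine RMS and power relations.]

```python
import math

# Reproduce the post's arithmetic: 0.9 V p-p max swing into 4 ohms
v_pp = 0.9                            # V, max output before clipping
r_load = 4.0                          # ohms, speaker impedance

v_rms = v_pp / (2 * math.sqrt(2))     # sine: Vrms = (Vpp/2) / sqrt(2)
p_max = v_rms ** 2 / r_load           # power at the onset of clipping

print(f"Vrms = {v_rms * 1000:.0f} mV")   # ~318 mV
print(f"Pmax = {p_max * 1000:.1f} mW")   # ~25 mW
```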
     
  4. Adjuster

    Well-Known Member

    Dec 26, 2010
    2,147
    300
    As well as the input bias voltage not being at half-supply, the feedback is taken from after the output capacitor, so the DC bias point will not be set correctly. Frankly, I am surprised your simulation gives any result as-is.

    The input DC voltage should be set to half-supply by a potential divider, and the feedback resistor RP1 should be fed from the junction of R1 and R2.
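[A minimal sketch of the half-supply bias divider suggested above. The 10k values are hypothetical placeholders, not taken from the schematic; in practice pick them low enough that the transistor's base current doesn't pull the bias point down noticeably.]

```python
# Half-supply bias divider (resistor values are assumptions)
V_SUPPLY = 5.0
R_TOP = 10e3      # hypothetical: Vcc to base
R_BOT = 10e3      # hypothetical: base to ground

v_bias = V_SUPPLY * R_BOT / (R_TOP + R_BOT)
print(f"Bias voltage: {v_bias:.2f} V")   # 2.50 V, half the 5 V rail
```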
     
  5. majbb

    Thread Starter New Member

    Mar 9, 2011
    5
    0
    Thanks for your advice.

    I've changed my circuit to reflect your comments, and with a 100mV pk-pk sine wave at the input I can get about 1.75V pk-pk at the output by varying RP2.

    What would be a good way to divide my 5V source into Vcc and Vee? A simple resistor divider doesn't work.

    I have attached my updated circuit and LTSpice schematic.
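[Working out what the updated circuit achieves, using only the numbers quoted in this post: 100 mV pk-pk in, 1.75 V pk-pk out, into a 4 ohm load.]

```python
import math

# Achieved performance of the revised circuit (figures from the post)
v_in_pp = 0.100      # V pk-pk input
v_out_pp = 1.75      # V pk-pk output reported from simulation
r_load = 4.0         # ohms

gain = v_out_pp / v_in_pp                 # voltage gain: 17.5
v_rms = v_out_pp / (2 * math.sqrt(2))     # output RMS voltage
p_out = v_rms ** 2 / r_load               # output power into the speaker

print(f"gain = {gain:.1f}, Pout = {p_out * 1000:.0f} mW")
```

So the revision roughly quadruples the output power over the ~25 mW estimated earlier, though the gain is still just short of the target of 20.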
     
  6. Adjuster

    Well-Known Member

    Dec 26, 2010
    2,147
    300
    You don't need to divide the power rail. All that is needed is a potential divider to bias the amplifier input to half the supply voltage.
     
  7. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    Instead of a dual-polarity supply, here is a potential divider to give the input transistor a reference voltage at half the supply voltage.

    Did you calculate the voltage losses from all the emitter-follower transistors in your circuit? They reduce the output power to almost nothing, because the supply voltage is too low.
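[A generic loss-tally sketch of the point being made here: each stage gives up some headroom, and on a 5 V rail there is very little left. The individual loss figures below are purely illustrative, chosen only so the total reproduces the ~0.9 V p-p quoted earlier in the thread; they are not measurements from the schematic.]

```python
# Hypothetical per-stage headroom losses (illustrative values only)
V_SUPPLY = 5.0
losses = [0.7, 0.7, 0.7, 0.7, 1.3]   # V; NOT from the actual schematic

swing_pp = V_SUPPLY - sum(losses)
print(f"Remaining swing: {swing_pp:.1f} V pk-pk")   # ~0.9 V, as quoted above
```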
     