Audio clipping detector

Discussion in 'General Electronics Chat' started by chunkmartinez, Dec 7, 2012.

  1. chunkmartinez

    Thread Starter Senior Member

    Jan 6, 2007
    180
    1
    Okay, I am trying to design a simple clipping detector circuit. It will be used with up to very-high-output car audio amplifiers as a tool for setting gain: I connect the amplifier's output terminals to it and an LED lights as soon as it detects clipping. Basically, I turn the gain knob up until it clips and then back off slightly.

    Amplifier power is so cheap these days that I can be dealing with up to around 300 volts from the amp's output, or at least that is where I'm going to set my limit as far as compatibility goes.

    So, I have studied a lot; I have reached the ADC/DAC section of the ebook, but I don't have a whole lot of experience and I don't have all of the material down, so I'm a newb.

    So far I have been modeling a common-mode comparator that filters out any ripple voltage. There is a capacitor on one of the inputs to block the DC, so that only the AC is the same at both inputs and doesn't get "amplified" at the output. Any DC that is present passes through and lights an LED at the output.

    My issue is the op amp's input voltage range. I was thinking of using a 741 or TL082, so my input needs to be under 15 V, IIRC. Well, I modeled a simple resistive voltage divider to proportionally reduce the input voltage into range (say, from 150 V at the amplifier output down to 6 V), but the LED doesn't light at a low DC voltage of, say, 1 volt because of how much I attenuated it. I'm not sure if I should have looked at changing the comparator's resistor values, because I'm a newb with comparators; they sort of confuse me.

    Anyway, I wanted to ask: can I use another comparator to reduce the amplifier's 100+ possible volts when I need to? Basically I can expect anywhere from about 35 V (from the amplifier) up to the 300 V I mentioned, so I am assuming a simple resistive voltage divider won't work? And it would be inefficient? Maybe I'm thinking too hard about this; any ideas for the circuit I'm trying to implement? The crazy high voltage from the amplifier is what is difficult for me to deal with.

    Oh, by the way, the amplifier will be outputting a 50 Hz sine wave for the detector to work with.
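The fixed-divider problem above can be sketched with a couple of lines of arithmetic (all voltages here are illustrative assumptions, not recommendations): a divider sized for the worst-case peak squashes small signals down to almost nothing.

```python
# Sketch of the fixed-divider problem: a divider sized for the
# worst-case peak makes small signals tiny. Values are assumed.

V_MAX_IN = 300.0    # worst-case peak expected from the amp output (V)
V_OPAMP_MAX = 6.0   # target maximum at the op-amp input (V)

ratio = V_OPAMP_MAX / V_MAX_IN   # fixed divider ratio, 6/300 = 0.02

v_small = 1.0                    # a 1 V level at the amp output...
v_at_opamp = v_small * ratio     # ...arrives as only about 20 mV

print(ratio, v_at_opamp)
```

This is why a single fixed divider struggles across a 35 V to 300 V range: the ratio that protects the op amp at the top end leaves the bottom end far below any practical LED-drive threshold.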
     
    Last edited: Dec 7, 2012
  2. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    I think the amplifier power is so high that listeners are deafened and can't hear the clipping distortion. Therefore they need a light to tell them.

    Maybe the speaker voice coils are slamming into the magnet structure (bottoming out) but a clipping detector will not detect it.

    Use a pair of resistors as a voltage divider to reduce the 300V (!) from the amplifier to a lower voltage that the comparator can handle without blowing up.
    Then the comparator lights the LED when the divided voltage exceeds the DC reference voltage.
     
    PackratKing likes this.
  3. chunkmartinez

    Thread Starter Senior Member

    Jan 6, 2007
    180
    1
    Actually, I was wanting to design a detector that senses the presence of DC for indication, instead of the usual method of measuring and comparing to a preset voltage. I would like it to take the input and literally sense whether clipping is present.

    Can I use a 555 or something to indicate if it senses DC? Something that could detect if the voltage stays at the same value for more than a certain amount of time, to separate an AC sine (or music) from quick-to-longer DC bursts? Could I use comparators to do this?
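For what it's worth, the "stays at the same value for too long" idea can be sketched in software. This is only an illustration of the timing logic, not a 555 circuit, and the band and time thresholds below are guesses:

```python
import math

def has_flat_top(samples, dt, band=0.02, max_flat_s=0.003):
    """Flag clipping if the signal stays within +/- band*peak of one
    value for longer than max_flat_s seconds (thresholds are guesses)."""
    peak = max(abs(s) for s in samples) or 1.0
    limit = int(max_flat_s / dt)          # run length that counts as "DC"
    anchor, run = samples[0], 0
    for s in samples[1:]:
        if abs(s - anchor) <= band * peak:
            run += 1
            if run >= limit:
                return True
        else:
            anchor, run = s, 0            # signal moved on; restart the run
    return False

fs = 50_000                               # sample rate (Hz)
t = [i / fs for i in range(fs // 50)]     # one cycle of 50 Hz
clean = [math.sin(2 * math.pi * 50 * x) for x in t]
clipped = [max(-0.7, min(0.7, 2 * s)) for s in clean]   # hard-clipped sine

print(has_flat_top(clean, 1 / fs), has_flat_top(clipped, 1 / fs))
```

The trick is that a clean 50 Hz sine only dwells near its peak for a millisecond or two, while a hard-clipped one sits at the rail for much longer, so a dwell-time threshold between the two separates them.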
     
    Last edited: Dec 16, 2012
  4. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    Why are you talking about DC?
    Clipping is usually symmetrical, above (+) and below (−) 0 V.
     
  5. bountyhunter

    Well-Known Member

    Sep 7, 2009
    2,498
    507
    In the old days, clip detectors were comparators that watched the output signal swing and lit an LED when it got within a predetermined range of the supply voltage, like within 4 V, which is the ballpark where a Darlington output driver stage starts to clip. The voltage rails were not regulated, so you always want to compare the signal level to whatever the rail voltage is and warn when it gets too close. Just use a resistive divider down from the positive rail to one input of the comparator and a divider from the output to the other. Adjust the resistors to get it to trip where you want.
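The threshold arithmetic behind that rail-tracking scheme can be sketched like this (the resistor values and rail voltages are made-up examples):

```python
def divider(r_top, r_bot):
    """Fraction of the input voltage appearing across r_bot."""
    return r_bot / (r_top + r_bot)

# Divide the amp output by 10, and the positive rail a little harder,
# so the comparator trips when the output reaches 90% of the rail.
r_sig = divider(90_000, 10_000)   # 0.10 on the signal side
r_ref = divider(91_000, 9_000)    # 0.09 on the rail side

def trip_voltage(v_rail):
    # The comparator flips when v_out * r_sig > v_rail * r_ref,
    # i.e. when v_out exceeds v_rail * (r_ref / r_sig).
    return v_rail * r_ref / r_sig

print(trip_voltage(40.0))   # trips around 36 V on a 40 V rail
print(trip_voltage(35.0))   # threshold sags along with the rail
```

Because both inputs are ratios of the same rail, the trip point automatically tracks an unregulated supply as it sags under load, which is exactly the behavior described above.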
     
    Last edited: Dec 16, 2012
  6. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    I think you are trying to detect hard clipping, i.e. where the voltage is high, but unchanging. Is this realistic for all amplifiers? I'm pretty sure tube amps don't clip this way.
     
  7. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    I don't know what it is called. Not deafness.
    Many people hear everything the same. All pitches sound like the same frequency.
    A smooth sine-wave hum sounds like a harsh square-wave buzz. They hate music.
    Maybe they are a different kind of animal (reptile?) than us.
     
    GopherT likes this.
  8. chunkmartinez

    Thread Starter Senior Member

    Jan 6, 2007
    180
    1
    Maybe you understand. I am thinking of clipping as there being a flat DC voltage at a peak. I want something that can detect a zero rate of voltage change, so it can tell when the voltage stays the same for too long. I don't want to build a detector where I have to do anything but connect the output to it.
     
  9. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    Are you wanting to be able to use this on vacuum tube (valve) amplifiers? AFAIK, they don't generally clip as abruptly as solid-state amps.
     
  10. chunkmartinez

    Thread Starter Senior Member

    Jan 6, 2007
    180
    1
    No... just class D, AB, and A amplifiers.

    Is there something that will trigger when the signal has flat DC points? I've tried low-pass filters and other reactive ideas, but it doesn't seem to work.
     
  11. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    Simply compare the output to the input. When they differ, clipping at the output is the cause. :)
     
  12. bountyhunter

    Well-Known Member

    Sep 7, 2009
    2,498
    507
    You will get a ton of false readings doing that. You wouldn't believe how many audio tracks I have bought that had severe clipping on the tops when they were digitized from the master. I see it all the time when I process the tracks in my software. They look like somebody took hedge clippers to the top of them.
     
  13. chunkmartinez

    Thread Starter Senior Member

    Jan 6, 2007
    180
    1
    I realize how tracks can be... but when I use the tester to check for clipping, I will play a pure sine wave that was generated and recorded specifically for the test.
     
  14. chunkmartinez

    Thread Starter Senior Member

    Jan 6, 2007
    180
    1
    Guys, I was thinking about using a differentiator to detect when there is zero rate of change at any time, but it doesn't work in simulation. I realize that a differentiator only outputs a voltage when there is a rate of change, not when there isn't, but I figured I could invert that before lighting the diode... as long as I have something to detect the clipping in a sine wave or music. It seems like going digital with a microcontroller would be kind of simple, but I haven't ventured into them yet.
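A software version of the differentiator idea, for illustration only (the slope and fraction thresholds below are guesses): rather than inverting the differentiator's output in hardware, count how many samples have near-zero slope. A clean sine only "pauses" briefly at its peaks, while a clipped one sits still for a large part of every cycle.

```python
import math

def looks_clipped(samples, flat_fraction=0.1, slope_band=0.05):
    """Flag clipping when an unusually large fraction of samples
    have near-zero slope relative to the steepest part of the wave."""
    diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    max_slope = max(diffs) or 1.0
    flat = sum(1 for d in diffs if d < slope_band * max_slope)
    return flat / len(diffs) > flat_fraction

fs = 50_000
t = [i / fs for i in range(fs // 50)]     # one cycle of 50 Hz
clean = [math.sin(2 * math.pi * 50 * x) for x in t]
clipped = [max(-0.7, min(0.7, 2 * s)) for s in clean]   # hard-clipped sine

print(looks_clipped(clean), looks_clipped(clipped))
```

With these assumed thresholds, the clean sine spends only a few percent of the cycle near zero slope (right at the peaks), while the hard-clipped one spends most of it flat, so the fraction test separates them; a microcontroller sampling the attenuated output could run essentially this loop.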
     
  15. MrChips

    Moderator

    Oct 2, 2009
    12,439
    3,360
    Just use an analog comparator or an LM3915 LED VU meter.
     
  16. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    I thought you were going to be testing with music. I think you could do it with a sine wave. In fact, I have a half-finished design using a differentiator that I was simulating. I had concluded that it wouldn't work with music, but I think it might with a single tone.
    What frequency are you thinking of using, or will there be multiple frequencies, one at a time?
     
  17. chunkmartinez

    Thread Starter Senior Member

    Jan 6, 2007
    180
    1
    50 Hz for the subwoofer amp, and 1 kHz for upper-range amplifiers.
     
  18. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    What are the minimum and maximum peak clipping voltages that you would want to detect?
     
  19. chunkmartinez

    Thread Starter Senior Member

    Jan 6, 2007
    180
    1

    By that, do you mean the amount of voltage that is clipped, or the total wave voltage? I know I definitely need to use a voltage divider, since the voltage can possibly go as high as 300 V on class D amps (very extreme, but to be cautious).
     
  20. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    But what is the low end of your peak voltage range? A differentiator becomes more problematic as the voltage is reduced, because dv/dt of the normal signal is reduced.
    Do you want to have one attenuator work for the entire range of peak voltage, or can the attenuator be switched? I suppose it could adapt to the peak voltage, but that gets pretty complicated.
     