Audio clipping detector

Thread Starter

chunkmartinez

Joined Jan 6, 2007
180
But what is the low end of your peak voltage range? A differentiator becomes more problematic as the voltage is reduced, because dv/dt of the normal signal is reduced.
Do you want to have one attenuator work for the entire range of peak voltage, or can the attenuator be switched? I suppose it could adapt to the peak voltage, but that gets pretty complicated.
Umm... I'm not sure what I would use as the low-voltage attenuation. I was thinking of really lowering the voltage, to a max of around 13 V to suit a common op amp's operating voltage. Right? I'm not sure what the lowest would come out to, but I figured I could maybe use some sort of circuitry to switch between different attenuations. In my original post I mentioned something like using an additional op amp for this task.

What do you think? What low-end voltage is troublesome for the task?
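To put numbers on Ron H's point about dv/dt: for a sine v = A·sin(2πft), the peak slope is A·2πf, so turning the level down scales the slope a differentiator sees by the same factor. A quick sketch (the 13 V and 0.5 V figures are just illustrative):

```python
import math

def max_slew(amplitude_v, freq_hz):
    """Peak dv/dt of A*sin(2*pi*f*t) is A * 2*pi*f volts/second."""
    return amplitude_v * 2.0 * math.pi * freq_hz

# The same 1 kHz tone at full level vs. turned well down:
print(max_slew(13.0, 1000.0))   # ~81,700 V/s
print(max_slew(0.5, 1000.0))    # ~3,100 V/s
```

A fixed dv/dt threshold tuned for the 13 V case is 26 times too high for the 0.5 V case, which is why one attenuator setting can't cover the whole range.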
 

tubeguy

Joined Nov 3, 2012
1,157
Originally Posted by Ron H
I thought you were to be testing with music. I think you could do it with a sine wave. In fact, I have a half-finished design, using a differentiator, that I was simulating. I had concluded that it wouldn't work with music. I think it might with a single tone.
What frequency are you thinking of using, or will there be multiple frequencies, one at a time?
Making a 'universal' clipping detector will be very difficult for the reason pointed out above.
As mentioned, most of the LED 'clipping' indicators you see in amplifiers are simply comparators set to detect voltage levels slightly below actual clipping in that particular amp.

It seems that the best bet would be running a single low-frequency tone through the amp and detecting the flat, DC-like clipped top.

Maybe that would be workable in a circuit comparing input signal to output signal.

Or, maybe a low-pass filter that would trigger a comparator with hysteresis, when DC was present for a certain time period.
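That dwell-time idea can be sketched numerically. This is only an illustration of the principle, not a circuit: `detect_clipping_dwell` and all of its thresholds are made-up names and values. It flags clipping when the waveform stays flat for longer than a clean tone ever would.

```python
import numpy as np

def detect_clipping_dwell(signal, fs, flat_tol=0.001, min_dwell_s=0.001):
    """Flag clipping when the waveform stays nearly flat (a clipped
    'top') for longer than min_dwell_s seconds."""
    dv = np.abs(np.diff(signal))
    flat = dv < flat_tol * np.max(np.abs(signal))
    min_dwell = int(min_dwell_s * fs)
    run = 0
    for is_flat in flat:
        run = run + 1 if is_flat else 0
        if run >= min_dwell:
            return True
    return False

fs = 48_000
t = np.arange(960) / fs                     # 20 ms record
clean = np.sin(2 * np.pi * 100 * t)         # 100 Hz test tone
clipped = np.clip(1.5 * clean, -1.0, 1.0)   # overdriven, hard-limited
```

With these numbers the clean tone is near-flat for only about a dozen samples around each peak, while the clipped tone sits dead flat for over a hundred, so a 1 ms dwell requirement separates the two cleanly.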
 

Thread Starter

chunkmartinez

Joined Jan 6, 2007
180
Making a 'universal' clipping detector will be very difficult for the reason pointed out above.
As mentioned, most of the LED 'clipping' indicators you see in amplifiers are simply comparators set to detect voltage levels slightly below actual clipping in that particular amp.

It seems that the best bet would be running a single low-frequency tone through the amp and detecting the flat, DC-like clipped top.

Maybe that would be workable in a circuit comparing input signal to output signal.

Or, maybe a low-pass filter that would trigger a comparator with hysteresis, when DC was present for a certain time period.
That last part is what I'm thinking about doing...
 

Audioguru

Joined Dec 20, 2007
11,248
Simply attenuate the output so it is at the same level as the input. Invert if required and compare them. When they differ, the amplifier is clipping.
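A numeric sketch of this attenuate-and-compare idea (`clip_indicator`, the 20× gain, and the ±15 V rails are all assumed for illustration):

```python
import numpy as np

def clip_indicator(v_in, v_out, amp_gain, margin=0.001):
    """Scale the amp output back down to input level, then flag any
    sample where the two waveforms disagree by more than `margin`
    (as a fraction of the input peak)."""
    v_out_scaled = v_out / amp_gain
    return np.abs(v_out_scaled - v_in) > margin * np.max(np.abs(v_in))

gain = 20.0
v_in = 0.8 * np.sin(np.linspace(0, 2 * np.pi, 1000))
v_out = np.clip(gain * v_in, -15.0, 15.0)   # amp rails at +/-15 V
flags = clip_indicator(v_in, v_out, gain)
```

Here `flags` picks up the clipped peaks because 0.8 V in times 20 would be 16 V, past the rail; with a smaller input the scaled output matches the input and nothing is flagged.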
 

Audioguru

Joined Dec 20, 2007
11,248
So by the input, would this be the signal coming from the pre-amp output before the amp?
The input of the power amp since its level is always a percentage of the output level. There might be a volume control between the output of the preamp and the input of the power amp.
 

Thread Starter

chunkmartinez

Joined Jan 6, 2007
180
The input of the power amp since its level is always a percentage of the output level. There might be a volume control between the output of the preamp and the input of the power amp.
I was originally planning to use nothing but the output terminals, but this is a good idea. I can use an RCA splitter to parallel the signal to the device, right? Then I could use a comparator to compare the signals. I figure that to attenuate the signal in exact proportion to the input, I could use some sort of comparator design for that too, right? Any ideas on how to automatically follow the input amplitude?

I figure the comparator comparing the two can use a simple open-loop design, since I don't have to worry about saturation; saturation will be fine for driving the LED indicator, right?

After thinking about it, wouldn't there be things that cause a false indication, like any little difference the amplifier itself adds to the signal, such as noise that isn't present in the input before it gets to the amp?
 

MrChips

Joined Oct 2, 2009
30,802
A comparator will have too much gain. Try using an op amp to take the difference of the two signals. Then you can adjust the gain as you wish.
 

Audioguru

Joined Dec 20, 2007
11,248
Distortion and noise for a hifi amplifier are at most 0.1% of the output when it is not clipping, so simply make the attenuated output level about 0.1% higher than the input level. Then the comparator will be triggered only when the clipping distortion exceeds 0.1%.

Short-duration clipping will flash an LED too briefly for your eyes to register, so you might need a peak detector to keep the LED lit for at least 30 ms.
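For the ≥30 ms flash, a peak detector's hold time with a bleed resistor follows t = RC·ln(Vpeak/Vmin). A quick check with assumed example values (10 V captured peak, ~2 V useful LED-drive threshold, 100 kΩ and 330 nF):

```python
import math

def hold_time(v_peak, v_min, r_ohms, c_farads):
    """Decay time from v_peak down to v_min through a bleed resistor:
    t = R * C * ln(v_peak / v_min)."""
    return r_ohms * c_farads * math.log(v_peak / v_min)

# Assumed example values, not a recommendation:
t_hold = hold_time(10.0, 2.0, 100e3, 330e-9)
print(round(t_hold * 1000, 1), "ms")   # ~53 ms, comfortably over 30 ms
```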
 

Thread Starter

chunkmartinez

Joined Jan 6, 2007
180
Distortion and noise for a hifi amplifier are at most 0.1% of the output when it is not clipping, so simply make the attenuated output level about 0.1% higher than the input level. Then the comparator will be triggered only when the clipping distortion exceeds 0.1%.

Short-duration clipping will flash an LED too briefly for your eyes to register, so you might need a peak detector to keep the LED lit for at least 30 ms.
I planned on compensating for the sharp, quick peaks with something like a peak detector, as you said, but when the LED is connected directly to the output of the op amp, the peak detection doesn't work. What can I do, use a transistor between the output and the peak detector or something?
 

Audioguru

Joined Dec 20, 2007
11,248
I planned on compensating for the sharp, quick peaks with something like a peak detector, as you said, but when the LED is connected directly to the output of the op amp, the peak detection doesn't work. What can I do, use a transistor between the output and the peak detector or something?
I guess you do not know about a peak detector and do not know how to drive an LED.
 

Thread Starter

chunkmartinez

Joined Jan 6, 2007
180
I guess you do not know about a peak detector and do not know how to drive an LED.
I know what the peak detector does... it's a diode and capacitor... the DC appears across the capacitor, and it wasn't working the way I was trying it. I don't know about driving LEDs, though, per se.
 

MrChips

Joined Oct 2, 2009
30,802
What they mean to say is that you cannot draw much current off the peak detector, otherwise it is no longer a peak detector; the voltage would rapidly fall to zero. What you want to do with the diode and capacitor is place a resistor across the capacitor in order to get a predetermined time constant.

Then you use a high-input-impedance op amp, such as a FET-input op amp, to measure the voltage at the peak detector without draining the capacitor. From there you can drive the LED with a suitable driver circuit.
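A discrete-time sketch of the diode-plus-capacitor-plus-bleed-resistor behavior described here (component values are arbitrary examples; the diode and buffer are idealized):

```python
import numpy as np

def peak_detector(signal, fs, r_ohms, c_farads):
    """Ideal diode charges the cap instantly to any new peak; the
    bleed resistor then discharges it with time constant R*C."""
    decay = np.exp(-1.0 / (fs * r_ohms * c_farads))  # per-sample decay
    out = np.empty_like(signal)
    v = 0.0
    for i, s in enumerate(signal):
        v = max(s, v * decay)   # diode charges; resistor bleeds
        out[i] = v
    return out

fs = 48_000
sig = np.zeros(4800)          # 100 ms of silence...
sig[0] = 5.0                  # ...with one brief 5 V spike at t = 0
env = peak_detector(sig, fs, 100e3, 330e-9)   # R*C = 33 ms
```

Ten milliseconds after the spike (sample 480) the held voltage is still about 5·e^(-10/33) ≈ 3.7 V, which is the point of the buffer: an LED driver reading `env` stays on long after the spike itself is gone.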
 

thatoneguy

Joined Feb 19, 2009
6,359
Off-the-wall idea:

SPL rated microphone feeding one input, pre-amp feeding the other.

Phase shift in the amp and the distance to the mic would cause this to false-alarm a lot, but it's an entirely different track that would also detect speaker limits in addition to amp limits.

Essentially, a THD analyzer, but in a simplified form.
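The simplified-THD idea can be sketched with an FFT: take the ratio of the harmonic magnitudes to the fundamental. `thd_percent` and its parameters are illustrative, and it assumes the record holds a whole number of cycles:

```python
import numpy as np

def thd_percent(signal, fs, fundamental_hz, n_harmonics=5):
    """THD from an FFT: RSS of harmonic magnitudes over the
    fundamental's magnitude, as a percentage."""
    spec = np.abs(np.fft.rfft(signal))
    k = int(round(fundamental_hz * len(signal) / fs))  # fundamental bin
    harmonics = [spec[k * n] for n in range(2, 2 + n_harmonics)]
    return 100.0 * np.sqrt(sum(h * h for h in harmonics)) / spec[k]

fs = 48_000
t = np.arange(4800) / fs                    # 100 ms -> whole cycles of 1 kHz
clean = np.sin(2 * np.pi * 1000 * t)
clipped = np.clip(1.5 * clean, -1.0, 1.0)   # hard clipping adds odd harmonics
```

The clean tone measures essentially 0% while the hard-clipped one jumps to double digits, which is the contrast a simplified analyzer would key on.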
 

Thread Starter

chunkmartinez

Joined Jan 6, 2007
180
Can anyone simulate and help with the design of an op-amp-based circuit for this? I've done some simulating... I tried using an op amp with negative feedback to attenuate the amp output... It has two inputs: the inverting input has the amp output on it, so the negative feedback can match it to the voltage from the non-inverting input (the Vref, or HU output)... Then it runs to a comparator. The output goes to one input while the other input also has the HU output.

So first, the negative-feedback op amp matches the level and attenuates it enough that it's under the 15 V limit of the op amp. Then the output is compared with the HU output again WITHOUT negative feedback, so it acts as a comparator... I just need a little help with resistor values, because I figure the comparator will need a VEEEERY low input so it doesn't saturate due to the high gain... any ideas? Max voltage from the amp will be like 200 V, with 13 V as the MAX value at the input... HU output max is just 4 V.
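For the attenuator side at least, the resistor values are simple arithmetic: the divider ratio must keep the 200 V worst case under the op amp's swing, or scale it straight to the 4 V HU level. A sketch with example values (the specific resistors here are just illustrations, not a recommendation):

```python
def divider_ratio(r_top, r_bottom):
    """Output fraction of a two-resistor voltage divider."""
    return r_bottom / (r_top + r_bottom)

# Keep a 200 V worst-case peak under ~13 V: need a ratio <= 13/200 = 0.065.
print(divider_ratio(150e3, 10e3) * 200.0)   # 12.5 V -- inside the limit
# Or scale straight down to the 4 V HU level: ratio = 4/200 = 0.02.
print(divider_ratio(49e3, 1e3) * 200.0)     # 4.0 V
```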
 

Ron H

Joined Apr 14, 2005
7,063
You can't do it with just negative feedback. There has to be a variable gain element somewhere in the loop.
Will the reference input be constant amplitude? If not, what is the range?

What does HU mean?
 

Thread Starter

chunkmartinez

Joined Jan 6, 2007
180
You can't do it with just negative feedback. There has to be a variable gain element somewhere in the loop.
Will the reference input be constant amplitude? If not, what is the range?

What does HU mean?
HU is the head unit (pre-amp), i.e. the input to be compared to the amp output, which is going to be the reference voltage. It will not be constant, since it will be adjusted by the volume of the head unit. I'm not sure what the lowest voltage will be, but the max would be 4 V.

I thought negative feedback was like a variable gain? Can't I use negative feedback to attenuate the amp output down to the input (reference voltage)?
 