Okay, I am trying to design a simple clipping-detector circuit. It will be used with very high-output car audio amplifiers as a gain-setting tool: I connect the amp's output terminals to it, and an LED lights as soon as it detects clipping. Basically I turn the gain knob up until it clips, then back off slightly.
Amplifier power is so cheap these days that I could be dealing with up to around 300 volts at the amp's output, or at least that's where I'm setting my limit as far as compatibility goes.
So, I have studied a lot. I've reached the ADC/DAC section of the ebook, but I don't have a whole lot of experience and I don't have all of the material down, so I'm a newb.
So far I have been modeling a common-mode comparator that filters out any ripple voltage. There is a capacitor on one of the inputs to block the DC from that input, so that only the AC is the same at both inputs, causing it to not be "amplified" at the output. Any DC, if present, will pass through and light an LED at the output.
My issue is the op amp's input voltage range. I was thinking of using a 741 or TL082, so my input needs to be under 15 V, IIRC. Well, I modeled a simple resistor voltage divider to proportionally reduce the input voltage into range (from, say, 150 V at the amplifier output down to 6 V, for example), but the LED doesn't light at a low DC voltage of, say, 1 volt because of how much I attenuated it. I'm not sure whether I should have looked at changing the comparator's resistor values instead; I am a newb with comparators, and they sort of confuse me.
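To show the problem with the fixed divider, here's a quick sketch of the math. The 25:1 ratio comes from my 150 V → 6 V example; the actual resistor values are just ones I picked to get that ratio, not from a real design:

```python
# Voltage divider: Vout = Vin * R2 / (R1 + R2)
# Example values giving a 25:1 ratio (150 V in -> 6 V out)
R1, R2 = 240_000, 10_000
ratio = R2 / (R1 + R2)      # 0.04, i.e. 25:1 attenuation

v_full = 150 * ratio        # 6.0 V -- safely inside the op amp's range
v_low = 1 * ratio           # 0.04 V -- only 40 mV
print(v_full, v_low)
```

So a 1 V DC offset at the amp shrinks to about 40 mV at the comparator, which is far too small to trip it or light an LED directly. That's the trade-off I'm stuck on.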
Anyway, what I wanted to ask is: can I use another comparator to reduce the amplifier's 100+ possible volts when I need to? Basically, I can expect anywhere from as little as 35 V, give or take, from the amplifier up to the 300 V I mentioned, so I am assuming a simple resistor voltage divider won't work? And that it would be inefficient? Maybe I'm overthinking this. Any ideas for the circuit I'm trying to implement? The crazy-high voltage from the amplifier is what's difficult for me to deal with.
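On the efficiency worry: a quick back-of-the-envelope check (using the same assumed 25:1 example resistors as before, not values from a real design) suggests a plain high-resistance divider dissipates very little even at 300 V:

```python
# Worst-case dissipation in a 250k total divider at 300 V
R1, R2 = 240_000, 10_000       # assumed 25:1 example values
R_total = R1 + R2
V_peak = 300.0

P_peak = V_peak**2 / R_total   # 0.36 W at the 300 V peak
P_avg = (V_peak / 2**0.5)**2 / R_total  # 0.18 W average for a sine wave
print(P_peak, P_avg)
```

So if my math is right, the divider itself only burns a fraction of a watt even in the worst case; efficiency may be less of a problem than the dynamic-range issue of covering 35 V to 300 V with one fixed ratio.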
Oh, BTW, the amplifier will be outputting a 50 Hz sine wave for the detector to work with.
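Since the test tone is 50 Hz, I figure the DC-blocking capacitor on the comparator input needs a high-pass corner well below 50 Hz so the sine passes through undistorted. A quick sanity check with assumed R and C values (not from my actual model):

```python
import math

# High-pass corner frequency: f_c = 1 / (2 * pi * R * C)
R = 100_000   # assumed input-side resistance, 100k
C = 1e-6      # assumed 1 uF blocking capacitor
f_c = 1 / (2 * math.pi * R * C)
print(f_c)    # about 1.6 Hz, well below the 50 Hz test tone
```

With a corner around 1.6 Hz, the 50 Hz signal goes through essentially untouched while DC is blocked, which I think is what my capacitor input is supposed to do.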