Input protection of dual polarity signal with minimal signal distortion

ronsimpson

Joined Oct 7, 2019
3,206
Why is it so important to keep the ADC inputs below 4 V? Is it running from a 3.3 V supply?
I see you don't want the signal at the ADC to distort, so you want to distort it at the diff amp instead. Is one place better than the other?
 

MisterBill2

Joined Jan 23, 2018
19,409
I’ve drawn the fast diodes as Schottkies even though they are probably not, so you know which ones have to be fast and which ones don’t.
The simple solution to avoid ALL diode-induced distortion is to select the clamping voltage so that the diodes do not start conducting until the input voltage exceeds the normal operating level. Of course, if the plan is to routinely operate right up to the damage threshold with no buffer zone, then there is no simple solution available. There are certainly some solutions, but none of them simple.
So additional information is needed: is the voltage to be solidly limited to ±2.000 volts to prevent damage, or is ±2.000 volts the normal operating range?
Limiting a signal at exactly the edges of the regular operating range is rather more complex, and if exceeding that level immediately causes damage, it is also poor design!
So an additional explanation is needed to determine whether there is a range in which the signal amplitude can be clamped solidly but without the distortion-producing problems.
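To put rough numbers on the diode-knee problem described above, here is a minimal Python sketch using the Shockley equation. The diode parameters (Is = 5 nA, emission coefficient 2, generic silicon small-signal diode) and the 1.4 V clamp reference are assumptions for illustration, not values from any circuit in this thread; the reference is chosen so that hard conduction lands near a 2.0 V signal level.

```python
import math

# Rough Shockley-diode sketch of why clamping at exactly the edge of the
# operating range distorts in-range signals. Parameter values below are
# assumptions (generic silicon small-signal diode), not measured data.
IS = 5e-9              # saturation current, assumed
N_VT = 2.0 * 0.02585   # emission coefficient times thermal voltage (room temp)

def diode_current(v_forward):
    """Forward current for a given forward voltage (Shockley equation)."""
    return IS * math.expm1(v_forward / N_VT)

# Cathode tied to a hypothetical 1.4 V reference so hard conduction
# lands near a 2.0 V signal level.
V_REF = 1.4
for v_sig in (1.6, 1.8, 2.0):
    i = diode_current(v_sig - V_REF)
    print(f"signal {v_sig:.1f} V -> clamp diode draws {i:.2e} A")
```

With, say, a 1 kΩ source impedance, the roughly 11 µA drawn at a 1.8 V signal level already produces an error on the order of 11 mV, well inside the intended operating range; that soft knee is exactly the distortion being warned about.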
 

LadySpark

Joined Feb 7, 2024
194
View attachment 317989
This is my exact application. This IC converts the single-ended signal to a differential one and feeds it to the ADC. My ADC can only take 0–4 V with reference to ground, so I am expecting to limit my single-ended signal to −2/+2 V with reference to ground, apply a 2 V common-mode voltage to the LMH6550, and feed that differential signal to the ADC.

The only relevant part is what happens before the LMH6550. The diff amp will have ±5 V supplies, but I want to limit the signal itself to ±2 V to prevent distortion.

In other words, all the diff amp and ADC stuff is not really important; I just want to preserve the signal content as much as possible in the ±2 V range and clip everything beyond it. The single-ended signal will be coming from another op amp, so it won't have very large energy or current-sourcing capability, but during transients and other events I expect my signal to shoot up to ±8 or 9 V, say, and under normal large-signal conditions maybe ±3 or 4 V.
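The level mapping described here can be checked with quick arithmetic. The sketch below assumes a unity-gain fully differential stage where each output leg swings half the input signal around the output common-mode voltage; the actual gain depends on the feedback network, so treat this as an illustration, not the exact circuit.

```python
# Sketch of the single-ended-to-differential level mapping, assuming a
# unity-gain fully differential stage: each output leg swings half the
# input signal around the common-mode voltage (Vocm = 2 V here).
VOCM = 2.0                    # common-mode voltage applied to the diff amp
ADC_MIN, ADC_MAX = 0.0, 4.0   # ADC input range with respect to ground

def diff_outputs(v_in, gain=1.0):
    """Return (v_out_plus, v_out_minus) for a single-ended input."""
    half_swing = gain * v_in / 2.0
    return VOCM + half_swing, VOCM - half_swing

for v_in in (-2.0, 0.0, 2.0):
    vp, vm = diff_outputs(v_in)
    ok = ADC_MIN <= vp <= ADC_MAX and ADC_MIN <= vm <= ADC_MAX
    print(f"Vin={v_in:+.1f} V -> V+={vp:.1f} V, V-={vm:.1f} V, in range: {ok}")
```

Under these assumptions a ±2 V single-ended input lands each output leg between 1 V and 3 V, comfortably inside the 0–4 V ADC window.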
So looking at this, it looks like you don't have a good +2 V/−2 V supply to do the diode clamp limiting I posted earlier.

But a limiter circuit that develops its own clamping voltage might work.

Below is a capture of one I constructed in a sim using the existing 75 ohm resistors, simulating the ADC load with a 10 k resistor. This would be inserted between the 75 ohm resistors and the ADC input:
Edit: The transistor I used was a BC517, but any Darlington transistor with a current gain greater than 10,000 can be used, as long as it can handle the potential current.
Screenshot_2024-03-21_20-36-57.jpg
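Since the simulated schematic itself is only in the screenshot, here is a purely behavioral sketch of what a limiter like this does to the transfer curve: linear for small signals, compressing smoothly toward a clamp level for large ones. The tanh shape and the 2 V clamp level are illustrative assumptions, not a model of the BC517 circuit above.

```python
import math

# Behavioral soft-limiter sketch: NOT the BC517 circuit, just an
# illustration of a transfer curve that develops its own clamp level.
V_CLAMP = 2.0   # assumed clamp level

def soft_limit(v_in):
    """Smoothly compress the input toward +/- V_CLAMP."""
    return V_CLAMP * math.tanh(v_in / V_CLAMP)

for v in (0.1, 1.0, 2.0, 4.0, 9.0):
    print(f"{v:4.1f} V in -> {soft_limit(v):.3f} V out")
```

The trade-off this shows: the output never exceeds the clamp level even for a 9 V transient, but the soft knee also compresses the signal slightly before it reaches the clamp point, which is the distortion concern raised earlier in the thread.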
 

MisterBill2

Joined Jan 23, 2018
19,409
What is still missing is the supply voltage for that amplifier. The reason that matters is that the amplifier will normally not be able to deliver any output, distorted or not, beyond the supply voltage. In addition, most amplifiers tend to produce a fair amount of distortion as the output approaches the supply voltage.
So once again I ask: is the A/D converter's rated input range simply limited to ±2.000 volts, or is that level the "Absolute Maximum Without Damage" value? There is a great deal of difference between the meanings of those two limits.

There is a great deal of information available about many A/D converters, and it appears that it has not been adequately considered here.

Clamping an input level to avoid damage and adjusting the input gain to keep the measured input within the full-scale range are two entirely different considerations.
 

LadySpark

Joined Feb 7, 2024
194
What is still missing is the supply voltage for that amplifier. The reason that matters is that the amplifier will normally not be able to deliver any output, distorted or not, beyond the supply voltage. In addition, most amplifiers tend to produce a fair amount of distortion as the output approaches the supply voltage.
So once again I ask: is the A/D converter's rated input range simply limited to ±2.000 volts, or is that level the "Absolute Maximum Without Damage" value? There is a great deal of difference between the meanings of those two limits.

There is a great deal of information available about many A/D converters, and it appears that it has not been adequately considered here.

Clamping an input level to avoid damage and adjusting the input gain to keep the measured input within the full-scale range are two entirely different considerations.
From what the original poster said in post #17, it's set up for 5 V single-supply operation, using the VRef voltage regulator to set the offset virtual ground for the signal to swing around. They should be able to just use a standard diode clamp to the ADC + and − supplies, which would limit the signal.

I really don't know why people try to use these dual-supply op amps on a single supply. It just complicates the circuit more than it's worth.
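An idealized model of the standard rail clamp suggested here: two diodes from the ADC input, one to the positive supply and one to ground (or the negative supply), limit the node to one diode drop outside the rails. The 5 V supply and 0.6 V forward drop below are assumed round numbers, not values from the actual circuit.

```python
# Idealized rail-clamp model: the input node cannot go more than one
# diode drop (VF) outside the supply rails once the diodes conduct.
VDD = 5.0   # assumed positive supply
VF = 0.6    # assumed diode forward drop

def rail_clamp(v_in):
    """Voltage at the ADC pin after the clamp diodes conduct."""
    return min(max(v_in, -VF), VDD + VF)

for v in (-3.0, 2.5, 9.0):
    print(f"{v:+.1f} V -> {rail_clamp(v):+.1f} V")
```

Note this protects against damage but, as discussed earlier in the thread, the real diodes have a soft knee, so the clamp level must sit outside the normal signal range if distortion is to be avoided.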
 

MisterBill2

Joined Jan 23, 2018
19,409
From what the original poster said in post #17, it's set up for 5 V single-supply operation, using the VRef voltage regulator to set the offset virtual ground for the signal to swing around. They should be able to just use a standard diode clamp to the ADC + and − supplies, which would limit the signal.

I really don't know why people try to use these dual-supply op amps on a single supply. It just complicates the circuit more than it's worth.
Not everybody is gifted with adequate insight as to what works and what does not work well. Experience teaches in a hard classroom, but some folks will learn in no other. Plus, some folks never learn.
The fact is that not all op amps are created equal; some of them are much less equal. In many cases it is very important to read and understand the data sheet, including all of the pages.
 

ronsimpson

Joined Oct 7, 2019
3,206
What is the part number of the ADC?
I think we are looking for a very complex solution to a simple problem.
The problem may not be a problem at all. I have designed audio recording machines that automatically adjusted the gain so there was no distortion.
Many of us have done this project without diodes, and don't understand the need.
Most of the circuits shown have a very bad temperature drift problem. Post #23 is an example.
 

MisterBill2

Joined Jan 23, 2018
19,409
@Ron: Audio recording systems strive to have low distortion and excellent frequency response. But they do not need to capture the exact amplitude accurately. That is why compression is used to reduce the dynamic range a bit.
An instrumentation system is required to accurately capture the amplitude of a signal, and so it must remain linear. There are instrument systems that adjust the gain and also record that the gain was changed. They are not normally found in most systems because that is a complex function.
 

ronsimpson

Joined Oct 7, 2019
3,206
MisterBill2, I do not understand what "limited" and "no distortion" really mean here. For instruments it is different than for audio, and I cannot see the entire picture, so my comments are more to find out what the OP wants.

It is of concern that "diodes before the amp are OK, but diodes after cause clipping and that is bad." I don't get that at all.

I have built clippers/limiters that are temperature stable. I can build circuits that don't have the diode knee problem.

Because we will not be told what the circuit really is or what the part numbers are, I cannot help. I am out of here. Good luck.
 

MisterBill2

Joined Jan 23, 2018
19,409
Now I have read a recent article on this same site about allowable op-amp inputs, and I am wondering if the TS has misread the datasheet of the op amp shown in the post today.
The fact is that there is a great deal to understand about A/D converters, and quite a bit of it is indeed fairly complex. That is why I suggested reading and understanding the A/D data sheet.
I have designed industrial systems that utilized a number of different analog input boards with different A/D converters, and it was always important to understand the different requirements and capabilities. Over the years they certainly changed a great deal. Instrumentation and control systems demand correct amplitude measurements, while audio entertainment systems mostly care about sound quality and lack of distortion; the exact gain of the amplifiers does not matter very much, as long as they sound good.
 

LadySpark

Joined Feb 7, 2024
194
Now I have read a recent article on this same site about allowable op-amp inputs, and I am wondering if the TS has misread the datasheet of the op amp shown in the post today.
The fact is that there is a great deal to understand about A/D converters, and quite a bit of it is indeed fairly complex. That is why I suggested reading and understanding the A/D data sheet.
I have designed industrial systems that utilized a number of different analog input boards with different A/D converters, and it was always important to understand the different requirements and capabilities. Over the years they certainly changed a great deal. Instrumentation and control systems demand correct amplitude measurements, while audio entertainment systems mostly care about sound quality and lack of distortion; the exact gain of the amplifiers does not matter very much, as long as they sound good.
Reading all this makes me wonder whether the application is audio, video, or something else. A 2–6 ns spike corresponds to roughly 166 MHz to 500 MHz, according to the problem in post #1. But it also makes me wonder whether they built the offset voltage they are buffering correctly, as usually you are supposed to apply a shunt capacitor before going into an op amp's common-mode pin, to strip out any trace of the signal.
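For reference, those frequency figures follow from treating the spike width as one signal period (f ≈ 1/t); that conversion is an assumption about how the numbers in the post were derived, and other rules of thumb (e.g. 0.35/rise-time) give different values. A quick sanity check:

```python
# Convert a spike width to an equivalent frequency, treating the
# width as one full period (f = 1/t). Assumed rule of thumb.
def spike_frequency_hz(width_s):
    return 1.0 / width_s

for ns in (2e-9, 6e-9):
    mhz = spike_frequency_hz(ns) / 1e6
    print(f"{ns * 1e9:.0f} ns spike ~ {mhz:.0f} MHz")
```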

Audio equipment got cheaper in more ways than one. That is why they soft-clip music for CD mastering: a lot of equipment has poor signal-to-noise ratio. I find the dynamic range spec doesn't really matter once you learn how much dynamic-range reduction is done during the production process.
 

MisterBill2

Joined Jan 23, 2018
19,409
The amazing thing about when CDs first arrived for broadcast radio is how much better they sounded through a $10 transistor radio. At the time I knew there was some monkey business going on, because clearly the cheap radio was the limiting factor. But CDs were pushed so very hard because nobody could copy them, at least for a couple of years. Cassette tapes were so easy to copy that the RIAA hated them.
 

LadySpark

Joined Feb 7, 2024
194
The amazing thing about when CDs first arrived for broadcast radio is how much better they sounded through a $10 transistor radio. At the time I knew there was some monkey business going on, because clearly the cheap radio was the limiting factor. But CDs were pushed so very hard because nobody could copy them, at least for a couple of years. Cassette tapes were so easy to copy that the RIAA hated them.
What was unfair to the musician/DIY recording people was that their equipment was purposely sabotaged so they would never get pro-style recordings. I dated a guy who was into it, recording bands and pulling his hair out. One Christmas I decided to give him an ADAT converter that I had modified, after stealthily peeking inside one he already had: a Behringer unit into which he was plugging other mic preamps, because he told me the built-in ones sounded bad. So I bought him one, removed the front PCB that had the mic preamps, installed jacks, and fed the converters directly; they were the same converters used in a "mothership" unit he yammered about from time to time, wishing he had one. After that, he never said anything about wanting a "mothership" converter box again, and he was very happy that he was finally getting pro results.
 