How to clip a voltage level on the input channel of ADC?

Thread Starter

Vishnu_AG

Joined Nov 1, 2021
2
I am using an 8-channel ADC for data acquisition. The ADC's supply voltage is +3.3 V. The first input channel of the ADC is connected to the output of an opamp (3V3_CURRENT). However, the opamp runs from a +6 V supply, which means this first channel could see up to +6 V. So I need to limit the voltage on this channel to +3.3 V. How can I achieve this using a diode?
 

Papabravo

Joined Feb 24, 2006
21,159
No. Use an opamp with an appropriate gain setting to attenuate the input in a linear fashion.
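As a rough back-of-the-envelope sketch of what that gain would need to be (the resistor values below are only illustrative, not from this thread):

```python
# Required attenuation to map a 6 V worst-case signal into a 3.3 V ADC range.
V_OPAMP_MAX = 6.0   # worst-case output of the 6 V-supplied opamp
V_ADC_MAX = 3.3     # full-scale input of the 3.3 V ADC

gain = V_ADC_MAX / V_OPAMP_MAX          # about 0.55
print(f"required gain <= {gain:.3f}")

# Hypothetical inverting-amplifier resistor pair giving roughly that gain
Rf, Rin = 5.49e3, 10e3                  # assumed values
print(f"inverting-stage gain = {Rf / Rin:.3f}")
```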
 

ScottWang

Joined Aug 23, 2012
7,397
Maybe you could use two 1 kΩ resistors as a voltage divider, so every voltage the opamp sends to the ADC is halved; then just double the value when you calculate the voltage from the ADC reading.
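A minimal sketch of that idea, assuming two equal 1 kΩ resistors and a software correction factor of 2:

```python
# 2:1 resistive divider in front of the ADC, undone in firmware.
R_TOP = 1_000.0     # 1 kΩ between opamp output and ADC pin
R_BOTTOM = 1_000.0  # 1 kΩ between ADC pin and ground

def divider_out(v_in: float) -> float:
    """Voltage actually seen by the ADC after the divider."""
    return v_in * R_BOTTOM / (R_TOP + R_BOTTOM)

def reconstruct(v_adc: float) -> float:
    """Undo the divider in software (here: multiply by 2)."""
    return v_adc * (R_TOP + R_BOTTOM) / R_BOTTOM

print(divider_out(6.0))   # 3.0 V, inside the 3.3 V ADC range
print(reconstruct(3.0))   # 6.0 V reported back to the application
```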
 

crutschow

Joined Mar 14, 2008
34,285
You could connect a small Schottky diode from the input (anode) to the +3.3 V supply of the ADC (cathode).
Add a 1 kΩ to 10 kΩ resistor in series with the input, before the diode connection.
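A rough sanity check of the clamp current for that arrangement, assuming a ~0.3 V Schottky forward drop and a 1 kΩ series resistor (both numbers are assumptions, not from this post):

```python
# Clamp-current estimate for the Schottky-to-rail approach.
V_IN_MAX = 6.0      # worst-case opamp output
V_RAIL = 3.3        # ADC supply the diode clamps to
V_F_SCHOTTKY = 0.3  # typical Schottky forward drop (assumed)
R_SERIES = 1_000.0  # series resistor between opamp and clamp node (assumed)

# While the diode conducts, the ADC pin sits near V_RAIL + V_F.
v_clamped = V_RAIL + V_F_SCHOTTKY
i_clamp = (V_IN_MAX - v_clamped) / R_SERIES  # current pushed into the 3.3 V rail

print(f"clamped input  ~ {v_clamped:.2f} V")
print(f"clamp current  ~ {i_clamp * 1e3:.2f} mA")
```

Note the clamp current flows into the +3.3 V rail, so the rail must be able to sink it (or have enough load to absorb it).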
 