Adapt 0-5V sine wave signal (centered on +2.5V) to 3.3V analog read input

Thread Starter

fe7565

Joined Aug 13, 2016
24
I have an analog sine wave signal that is centered at +2.5VDC and swings between 0V and +5V. I need to adapt it to the analog read input pin of a 3.3VDC microcontroller (nRF52840, roughly 10K-16K internal input resistance). I built a voltage divider that moved the +2.5V center to a +1.65VDC center (half of 3.3V), but I am still getting sine wave clipping at +4.5V and below zero volts (negative on the oscope).

So I changed the voltage divider to center at +1.2VDC, which brought the upper clipping down to +2.88V, but it looks like I am still getting negative clipping. I cannot change the source setup of the 0-5V signal, nor its 2.5V center; I can only modify the signal after it leaves the source.

Do I need to reduce the signal amplitude with an op amp, centering it on +1.65V and limiting the sine wave swing to stay within the 3.3V rail to prevent clipping? Is there a passive-components fix, or is an op amp the only way? I cannot do a software fix, because I need to capture the entire intact sine wave to do an FFT on it.
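As a sanity check on the divider approach: an ideal, unloaded divider scales the DC offset and the AC swing by the same ratio, so a ratio of about 3.3/5 = 0.66 maps the full 0-5V span onto 0-3.3V and the 2.5V center onto roughly 1.65V in one step. A quick numeric sketch in Python, using illustrative 5.1k/10k values (these are example parts, not a recommendation):

```python
# Sanity check: an ideal, unloaded resistive divider scales both the
# DC offset and the AC swing of the signal by the same ratio.
def divide(v_in, r_top, r_bottom):
    """Output of an unloaded divider: Vout = Vin * Rb / (Rt + Rb)."""
    return v_in * r_bottom / (r_top + r_bottom)

# Illustrative values: 5.1k over 10k gives a ratio of ~0.662.
R_TOP, R_BOTTOM = 5_100, 10_000
for v_in in (0.0, 2.5, 5.0):
    print(f"{v_in:.1f} V in -> {divide(v_in, R_TOP, R_BOTTOM):.2f} V out")
```

An ideal divider therefore maps 0V to 0V and cannot by itself produce a negative output; note also that the ADC's 10K-16K input loading will pull the real ratio down somewhat.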
 

B-JoJo-S

Joined Jan 3, 2026
247
How do you have a sine wave that is positively biased at 2.5VDC? That would give you a wave bottoming out at zero and topping out at 5V. Are you sure you have a 2.5V positive shift in the waveform? If so, how did you come across that value?

Oh, and a voltage divider isn't the way to get a regulated 3.3VDC.

What you need is a full-wave bridge rectifier. 5V minus two forward diode voltage drops (1.2 to 1.4V total) would reduce your DC to 3.8 to 3.6VDC. But that's without filtering. Filtering will increase the voltage to about 5.4VDC (accounting for the forward diode drops). Using a 3.3V regulator would then mean dropping 2.1V across the regulator. It shouldn't get too hot.

How many amps do you need at 3.3VDC?
 
Last edited:

B-JoJo-S

Joined Jan 3, 2026
247
If you're drawing 2 amps, then that's 4.2W. Not cold, but not hot. Still, it's wasted energy.

Another possibility: after rectifying and filtering your 5V to DC, you might be able to find a buck converter to step the output down to 3.3V. That will waste less energy.
 

schmitt trigger

Joined Jul 12, 2010
2,056
Please show what you have actually built and tested, because otherwise your post doesn't make sense.
You mentioned a voltage divider to lower the offset from 2.5 to 1.65 volts, meaning roughly a 1.5:1 ratio.
Well… that same ratio should drop the 5V maximum to 3.3V, and zero volts to, well, zero.

What is it?
 

eetech00

Joined Jun 8, 2013
4,704
I have an analog sine wave signal that is centered at +2.5VDC and swings between 0V and +5V. I need to adapt it to the analog read input pin of a 3.3VDC microcontroller (nRF52840, roughly 10K-16K internal input resistance). I built a voltage divider that moved the +2.5V center to a +1.65VDC center (half of 3.3V), but I am still getting sine wave clipping at +4.5V and below zero volts (negative on the oscope).

So I changed the voltage divider to center at +1.2VDC, which brought the upper clipping down to +2.88V, but it looks like I am still getting negative clipping. I cannot change the source setup of the 0-5V signal, nor its 2.5V center; I can only modify the signal after it leaves the source.

Do I need to reduce the signal amplitude with an op amp, centering it on +1.65V and limiting the sine wave swing to stay within the 3.3V rail to prevent clipping? Is there a passive-components fix, or is an op amp the only way? I cannot do a software fix, because I need to capture the entire intact sine wave to do an FFT on it.
Yes. You'll need to rescale the sine wave so its center sits at VDD/2 to get the best ADC resolution. So if the ADC full-scale input is 3.3V, the center should be 1.65V. One way to do that is to block the upstream 2.5V DC bias with a coupling capacitor, then re-bias the signal at 1.65V.
Or just run the MCU/ADC at 5VDC and keep the 2.5V bias.
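The block-and-re-bias suggestion amounts to the mapping Vout = (Vin - 2.5V) x k + 1.65V; with a gain k of 3.3/5 = 0.66, the 0-5V input lands exactly on 0-3.3V. A minimal numeric sketch of that arithmetic (k = 0.66 is an assumed design choice, not a value from the post):

```python
def rescale(v_in, gain=0.66, old_bias=2.5, new_bias=1.65):
    """Model of 'block the 2.5 V bias, attenuate, re-bias at 1.65 V'.
    Electrically: a coupling cap removes old_bias, a divider sets the
    gain, and a bias network re-centers the signal at new_bias."""
    return (v_in - old_bias) * gain + new_bias

for v_in in (0.0, 2.5, 5.0):
    print(f"{v_in:.1f} V -> {rescale(v_in):.2f} V")
```

With these numbers the 0V trough maps to 0V, the 2.5V center to 1.65V, and the 5V peak to 3.3V; a slightly smaller gain would leave headroom against the rails.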
 

Thread Starter

fe7565

Joined Aug 13, 2016
24
Thank you for the inputs. The sine wave is coming from a nerf radar that operates at 10 GHz. The device amplifies the +/-200uV radar sensor signal to 0-5V and then uses a voltage divider to center it at +2.5V. So it's basically a sine-wave AC signal coming out of a two-stage op amp (see the wave shape, which represents the moving object's Doppler reflection).

My original circuit adds a voltage divider after that to bring the center down to +1.65V. It did do that, but when triggered the sine wave went into the 4+ volt range and also went a couple of hundred mV negative.

Then I changed the voltage divider and moved the center to about +1.2V, which limited the upper clipping to +2.6V max, but I still had -240mV on the bottom.

So, for lack of a better term, I need to squeeze the sine wave down to a smaller amplitude while keeping the same shape, so that the FFT can process it properly.

One option is to use a Schmitt trigger of sorts with a roughly 2V threshold and a built-in 3.3V limit, and convert the sine wave into a square wave...

Attached is an oscope snapshot of the current voltage-divider setup.

EDIT: my MCU/ADC uses 3.3V. I need to leave some headroom below 3.3V and above 0V so the sine waves can form properly and not get clipped. Not all the signals reflected from the object have the same amplitude or attenuation, because the object can vary in size, shape, density and, of course, speed.


IMG_1504.jpeg
 

Attachments

Last edited:

eetech00

Joined Jun 8, 2013
4,704
My original circuit adds a voltage divider after that to bring the center down to +1.65V. It did do that, but when triggered the sine wave went into the 4+ volt range and also went a couple of hundred mV negative.
That's why I suggested blocking the 2.5v bias, then re-scaling the sine wave to 3.3v centered at 1.65v.
 

BobTPH

Joined Jun 5, 2013
11,487
A voltage divider will not create a negative output from a positive input. The negative voltage must already be present at the input.

Have you perhaps not connected the two grounds?
 

Thread Starter

fe7565

Joined Aug 13, 2016
24
I think I may have found the problem. Let me check: I had moved the incoming 2.5V-centered sine wave signal to the top of my voltage divider, where the 3.3V supply should be, and was reading the middle of the divider with the oscope.
 

BobTPH

Joined Jun 5, 2013
11,487
I think I may have found the problem. Let me check: I had moved the incoming 2.5V-centered sine wave signal to the top of my voltage divider, where the 3.3V supply should be, and was reading the middle of the divider with the oscope.
How was it originally connected?
 

Thread Starter

fe7565

Joined Aug 13, 2016
24
See my post above about my possible rookie error on the voltage divider. Both grounds (radar device and MCU analog input) are connected together, and also to the oscope.
 

Thread Starter

fe7565

Joined Aug 13, 2016
24
Let me sketch a circuit diagram. I think I need two voltage dividers: one to bring the 2.5V-centered signal down to 1.65V, and a second one, biased at 1.65V, to feed that signal into. Because when I feed the 2.5V signal directly into the center of the 1.65V-biased divider, it bumps everything up to 2.5V and clips over 4 volts.
 

Thread Starter

fe7565

Joined Aug 13, 2016
24
I am sure there is an easier way. I cannot change anything inside the box labeled as "device".

On second look, I do not think the second divider accomplishes anything; the sine wave will still swing above 4V.

EDIT: I may need an op amp to attenuate the signal to fit between 0 and 3.3V.

IMG_1506.jpeg
 
Last edited:

Ian0

Joined Aug 7, 2020
13,114
Yes. You'll need to rescale the sine wave so its center sits at VDD/2 to get the best ADC resolution. So if the ADC full-scale input is 3.3V, the center should be 1.65V. One way to do that is to block the upstream 2.5V DC bias with a coupling capacitor, then re-bias the signal at 1.65V.
Or just run the MCU/ADC at 5VDC and keep the 2.5V bias.
I'd agree with that, especially if the 5V supply isn't precisely 5V. However, you need to be careful with phase shift if the phase of the waveform matters: a coupling capacitor still introduces appreciable phase shift until the signal frequency is at least a decade above the 1/(2πRC) corner.
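To put a number on that caveat: a first-order RC high-pass (the coupling cap into the bias network's resistance) contributes a phase lead of arctan(fc/f), which is 45° at the corner, about 5.7° one decade above it, and about 0.6° two decades above. A small sketch, with an assumed 100 Hz corner purely for illustration:

```python
import math

def highpass_phase_deg(f_hz, fc_hz):
    """Phase lead (degrees) of a first-order RC high-pass at f_hz,
    where fc_hz = 1/(2*pi*R*C) is the -3 dB corner frequency."""
    return math.degrees(math.atan(fc_hz / f_hz))

FC = 100.0  # assumed corner frequency in Hz, for illustration only
for f in (100.0, 1_000.0, 10_000.0):
    print(f"{f:>8.0f} Hz: {highpass_phase_deg(f, FC):5.2f} deg lead")
```

For an FFT that only needs magnitudes this phase lead is harmless; it matters if you compare phase between channels or against a reference.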
 

Thread Starter

fe7565

Joined Aug 13, 2016
24
Or, you might try a single divider:
View attachment 364511

Vin = 5Vpp, centered at 2.5VDC
ADC = 3.3Vpp, centered at 1.65VDC
R1 = 5.1k
R2 = 10k
This seems the easiest. Can I use 1K for R1 and 2K for R2 in case I want to use signals in the 200 kHz range?

If I go with the "DC-block and shift down to a 1.65V center" option outlined above, is it safe to use a 1uF cap with 1K and 2K resistors to prevent phase shift at 200 kHz?
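For what it's worth, those proposed values can be checked numerically: 1K over 2K gives a ratio of 2/3, which puts the 5V peak at about 3.33V, marginally above the 3.3V rail, and a 1uF cap driving the divider's roughly 667 Ohm Thevenin resistance gives a corner near 239 Hz, far below 200 kHz. A sketch assuming ideal, unloaded parts:

```python
import math

R1, R2, C = 1_000.0, 2_000.0, 1e-6  # values proposed in the post

# Unloaded divider ratio and the resulting 5 V peak
ratio = R2 / (R1 + R2)
print(f"ratio = {ratio:.3f}; 5 V peak -> {5 * ratio:.2f} V")

# High-pass corner seen by the coupling cap driving R1 || R2
r_thevenin = R1 * R2 / (R1 + R2)
f_corner = 1.0 / (2 * math.pi * r_thevenin * C)
print(f"corner = {f_corner:.0f} Hz ({200_000 / f_corner:.0f}x below 200 kHz)")
```

So phase shift at 200 kHz is negligible with these values; the tighter constraint is the 3.33V peak sitting slightly above the rail.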

EDIT: I think max current is 50mA-200mA for the entire device, and for the signal-reading side maybe 2-3mA.
 

Ian0

Joined Aug 7, 2020
13,114
If you are capacitively coupling, you will need two resistors of equal value to re-bias it to the centre. And anything over 10nF will work for a 1k load at 200kHz.
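That rule of thumb checks out numerically: 10nF into a 1k load puts the high-pass corner near 16 kHz, leaving a 200 kHz signal more than a decade above it. A quick check, assuming a simple first-order RC model:

```python
import math

def corner_hz(r_ohms, c_farads):
    """-3 dB corner frequency of a first-order RC high-pass."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads)

f_corner = corner_hz(1_000, 10e-9)  # 1k load, 10 nF coupling cap
print(f"corner ~ {f_corner / 1e3:.1f} kHz; "
      f"200 kHz is {200e3 / f_corner:.1f}x above it")
```

A larger capacitor only pushes the corner lower, further reducing attenuation and phase shift at 200 kHz.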
 

Thread Starter

fe7565

Joined Aug 13, 2016
24
Thank you. I went with the two-resistor version for now and will test it. But I like the DC-isolation and offset-removal idea too.
 