Good day everyone. I have recently designed a 10th-order Butterworth low-pass filter to be used as an anti-aliasing filter for an STM microcontroller, with a cutoff frequency of 20 kHz and an attenuation of 60 dB at 40 kHz. I used the normalized low-pass filter tables and the unity-gain KRC (Sallen-Key) equations in Sergio Franco's textbook, "Design with Operational Amplifiers and Analog Integrated Circuits", to obtain the values for my capacitors and resistors. I believe my calculations are correct.

What intrigues me is that when I simulate this circuit in LTspice, there is about 3 dB of peaking (ripple) just before the cutoff, and it appears only after the last stage of the filter. I have not been able to find an explanation for why this occurs or how to minimize it. I chose the Butterworth specifically to avoid passband ripple of the kind a Chebyshev has.

To rule out a simple arithmetic slip, I ran the two short sanity checks below; after those I have posted the circuit and the simulated results. Any and all help will be appreciated. Thank you for your time.
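First, a check of the target response itself. This is a minimal SciPy sketch of my own (assuming ideal op amps and exact component values; it is not taken from Franco's book):

```python
import numpy as np
from scipy import signal

# Ideal 10th-order Butterworth low-pass, fc = 20 kHz (analog prototype).
fc = 20e3
b, a = signal.butter(10, 2 * np.pi * fc, btype='low', analog=True)

# Magnitude response from 1 kHz to 100 kHz.
f = np.logspace(3, 5, 500)
_, h = signal.freqs(b, a, worN=2 * np.pi * f)
mag_db = 20 * np.log10(np.abs(h))

print(f"Attenuation at 40 kHz: {np.interp(40e3, f, -mag_db):.1f} dB")  # ~60.2 dB
print(f"Maximum passband gain:  {mag_db.max():.2f} dB")               # ~0 dB, no peaking
```

This confirms that the specification is met and that the ideal response is monotonic, so the 3 dB bump in my simulation should not be inherent to the Butterworth approximation.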

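Since the peaking shows up only after the last stage, I also computed the Q of each pole pair from the standard Butterworth pole formula (again my own sketch; the values should agree with Franco's tables):

```python
import numpy as np

# Pole-pair Q values of an even-order Butterworth low-pass:
# Q_k = 1 / (2*sin(theta_k)),  theta_k = (2k - 1)*pi/(2n),  k = 1..n/2.
n = 10
for k in range(1, n // 2 + 1):
    theta = (2 * k - 1) * np.pi / (2 * n)
    q = 1.0 / (2.0 * np.sin(theta))
    if q > 1 / np.sqrt(2):
        # Standalone peak gain of a second-order section (exists only for Q > 1/sqrt(2)).
        peak = 20 * np.log10(q / np.sqrt(1.0 - 1.0 / (4.0 * q * q)))
        print(f"Q = {q:.4f}: peaks by {peak:+.1f} dB on its own")
    else:
        print(f"Q = {q:.4f}: monotonic on its own")
```

The highest-Q section (Q of about 3.196) peaks by roughly +10 dB by itself and is only flattened by the droop of the other four stages, so I can see why that stage would be the sensitive one; what I would still like to understand is what in my circuit produces the residual 3 dB bump and how to get rid of it.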
[schematic and simulated frequency response]