Selecting filter topology for anti-aliasing

Thread Starter

MasterSnow

Joined Jan 18, 2009
22
Greetings all,

I'm designing the interface circuits for a DAQ system to accurately capture the output response of a DC motor to a controlled input. Therefore, I don't want any of my interface circuits to have a noticeable effect on the output characteristic and mislead my system identification software later in the process.

My first impulse is to use a Butterworth filter topology as an anti-aliasing filter because of its maximally flat gain characteristic. However, I've noticed that unless I raise the cutoff frequency well above my operating frequencies, the phase shift is harsh and creates overshoot. Of course, my input could be to blame, since it's a square wave with 100 µs rise times...

So I pose this question - what is the best anti-aliasing filter topology? I have a feeling that they all have their strengths and weaknesses (the Bessel filter looks interesting from a phase point of view). Also, is what I'm doing actually the right approach - that is, use a filter with low gain distortion and raise the bandwidth until the phase problems disappear?
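The overshoot difference between the two topologies is easy to see numerically. A minimal sketch using SciPy (the 4th-order filters and 1 kHz cutoff are illustrative choices, not values from your circuit):

```python
import numpy as np
from scipy import signal

fc = 2 * np.pi * 1000.0  # 1 kHz cutoff in rad/s (illustrative)
order = 4

# Analog prototypes: Butterworth (maximally flat gain) vs Bessel (linear phase)
butter = signal.butter(order, fc, analog=True, output='ba')
bessel = signal.bessel(order, fc, analog=True, output='ba', norm='mag')

t = np.linspace(0, 5e-3, 2000)  # 5 ms window, enough for both to settle
for name, (b, a) in (('Butterworth', butter), ('Bessel', bessel)):
    _, y = signal.step(signal.lti(b, a), T=t)
    overshoot = 100.0 * (y.max() - 1.0)
    print(f'{name}: step overshoot = {overshoot:.1f} %')
```

A 4th-order Butterworth overshoots a unit step by roughly 11 %, while the Bessel stays under about 1 % - which is exactly the trade-off the question is circling around.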

Thanks!
 

hgmjr

Joined Jan 28, 2005
9,027
Take a look at the Bessel filter's response. Such filters are noted for their linear phase response.
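To put a number on that phase linearity, one can compare the passband group delay variation of the two designs (a sketch; the 4th-order, 1 kHz-cutoff values are illustrative):

```python
import numpy as np
from scipy import signal

fc = 2 * np.pi * 1000.0        # 1 kHz cutoff in rad/s (illustrative)
w = np.linspace(1.0, fc, 500)  # evaluate up to the cutoff

spread_us = {}
for name, design in (('butter', signal.butter), ('bessel', signal.bessel)):
    b, a = design(4, fc, analog=True)
    _, h = signal.freqs(b, a, worN=w)
    # Group delay = -d(phase)/d(omega); flat delay means linear phase
    gd = -np.gradient(np.unwrap(np.angle(h)), w)
    spread_us[name] = (gd.max() - gd.min()) * 1e6
    print(f'{name}: group delay varies by {spread_us[name]:.1f} us in the passband')
```

The Bessel's group delay is nearly constant across the passband, while the Butterworth's peaks near the cutoff - that delay variation is what smears the square-wave edges into overshoot.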

hgmjr
 

t06afre

Joined May 11, 2009
5,934
A problem with a square wave is the harmonics. As a rule of thumb, I would say a square wave needs at least 10x the fundamental frequency in bandwidth to be reproduced properly.
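That rule of thumb can be checked by summing the square wave's Fourier series up to a given harmonic and measuring how far the band-limited version is from the ideal (a sketch; the 1 kHz fundamental is an illustrative value):

```python
import numpy as np

f0 = 1000.0  # fundamental frequency in Hz (illustrative)
t = np.linspace(0, 2 / f0, 4000, endpoint=False)
square = np.sign(np.sin(2 * np.pi * f0 * t))

def band_limited_square(max_harmonic):
    """Fourier series of a square wave, keeping harmonics up to max_harmonic * f0."""
    y = np.zeros_like(t)
    for k in range(1, max_harmonic + 1, 2):  # a square wave has odd harmonics only
        y += (4 / np.pi) * np.sin(2 * np.pi * k * f0 * t) / k
    return y

for n in (1, 3, 9):
    rms_err = np.sqrt(np.mean((band_limited_square(n) - square) ** 2))
    print(f'harmonics up to {n}x f0: RMS error = {rms_err:.3f}')
```

With only the fundamental, the RMS error is over 40 % of full scale; keeping harmonics through 9x f0 (i.e. roughly 10x the fundamental in bandwidth) cuts it below half of that, which is why an anti-aliasing cutoff near the fundamental mangles the edges.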
 