What exactly is an anti-aliasing filter?

Thread Starter

picstudent

Joined Feb 3, 2009
91
I am doing a project with a 13-bit ADC for data acquisition.

In my setup, a small voltage across a shunt resistor is captured, amplified, and offset by op-amp circuitry, then fed to the 13-bit ADC via a multiplexer. I have read that I need to use an anti-aliasing filter here, basically to remove noise.

May I know what this term 'noise' exactly means? What is its range of frequencies? How can we reduce its generation in our circuit?

Thanks a lot

Roy Thomas
 

DedeHai

Joined Jan 22, 2009
39
Whenever you measure something, there is always noise.
Noise consists of more or less random, unwanted frequency components mixed into your signal. It comes from external sources (radio, cell phones, and all sorts of power electronics), from switching operations or high currents in your own circuit, and there is also thermal noise. The smaller the voltage you try to amplify, the more noise you will have relative to your signal.
You can reduce noise by applying filters that only let the desired frequencies pass. The simplest is a low-pass filter, which attenuates frequencies above a certain cutoff frequency.
Aliasing is an effect that happens in analog-to-digital conversion. If you use a sampling rate of, let's say, 10 kHz, the maximum signal frequency that can still be correctly represented is 5 kHz (the Nyquist criterion). So you have to make sure your signal only contains frequencies below half your sampling frequency, or you will get phantom frequencies in your digital signal because of aliasing (see Wikipedia for an explanation).
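As an aside, the alias frequency produced by under-sampling can be predicted with a one-line folding formula. A small sketch (not from the thread; the 10 kHz sampling rate and 7 kHz tone are just illustrative numbers):

```python
fs = 10_000.0    # sampling rate, Hz
f_sig = 7_000.0  # input frequency above Nyquist (fs/2 = 5 kHz)

# Sampling folds any input at f onto |f - round(f/fs)*fs|,
# which always lands in the 0 .. fs/2 band.
f_alias = abs(f_sig - round(f_sig / fs) * fs)
print(f_alias)  # 3000.0 -> a 7 kHz tone masquerades as 3 kHz
```

This is exactly the "phantom frequency" effect: after sampling, the 7 kHz component is indistinguishable from a real 3 kHz signal, which is why it must be filtered out before the ADC.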

To reduce the generation of noise: separate signal-processing areas from power areas, use a big ground plane, reduce resistance and/or inductance in signal paths, and synchronize the time of sampling with your switching such that you only sample when no switching is happening and all switching transients have dissipated.
The solution to your problem could simply be a low-pass filter with a cutoff frequency of, let's say, 0.3 times your sampling frequency. If that cuts into your actual signal frequency, increase the sampling frequency!
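Picking component values for such a filter is a one-formula job: for a simple RC low-pass, fc = 1/(2πRC). A sketch (the 10 kHz sampling rate and the 10 nF capacitor are assumptions for illustration, not values from the thread):

```python
import math

fs = 10_000.0  # assumed sampling rate, Hz
fc = 0.3 * fs  # cutoff at 0.3 * fs, as suggested above -> 3 kHz

C = 10e-9  # pick a convenient standard capacitor, 10 nF
# Solve fc = 1 / (2*pi*R*C) for R:
R = 1.0 / (2 * math.pi * fc * C)
print(round(R))  # ~5305 ohm -> use the nearest standard value, e.g. 5.1k
```

In practice you would round R (or C) to a standard value; the cutoff only needs to sit comfortably between the signal band and fs/2.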
 

Thread Starter

picstudent

Joined Feb 3, 2009
91
Thanks a lot, understood.
Just a few points, please.

You said to sample when no switching is being done. In my case, since I am using a multiplexer, does that mean I should select the channel, wait some time for the switching transients to settle (say 5 µs, OK?), then enable the multiplexer, wait again if possible, and then sample? I should also make sure that no pins are switched (in an interrupt or otherwise) during that window. Is that correct?

Also, since my signal is pure DC, how do I decide the cutoff frequency?

Thank you

Roy Thomas
 

DedeHai

Joined Jan 22, 2009
39
When I say switching, I mean switching of power signals only; I was thinking of a digitally controlled power supply or a motor application. So, for instance, if you have a switched DC-to-DC converter, a buck converter for example, you might see noise on your output signal during every switching operation of the power FET. The time each switching operation takes depends strongly on the transistor used, but also on the current flowing through it (since it takes more time to "close the valve" on a high current).
If your signal is purely DC with no switching present (as when monitoring a battery voltage with a constant load), the cutoff frequency can be as low as 0.1 Hz or even lower. But be aware that a low-pass filter also filters out dynamic changes in your voltage. So if your voltage changes quickly, say it steps within milliseconds from one level to another (a so-called step function), the output of the filter still changes slowly: with a cutoff as low as 0.1 Hz, it takes the output more than 10 seconds to reach the new voltage.
So set the cutoff frequency such that you still have good dynamics where needed. A cutoff frequency of 100 Hz is commonly used for DC signals with low dynamics; even if you sample only once every second, a 100 Hz cutoff is still fine. Since your signal is DC, the Nyquist limit is not the constraint here. Just make sure (e.g. with an oscilloscope) that your noise is below a certain level, depending on the accuracy you want (1% maybe). You could also measure your unfiltered signal with an oscilloscope, determine the lowest frequency of the noise present, and set your cutoff frequency slightly below that. That gives you the maximum dynamic response.
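The settling times mentioned above follow directly from the first-order step response, output = 1 − e^(−t/τ) with τ = 1/(2πfc). A small sketch, assuming an ideal single-pole RC filter:

```python
import math

def settle_time(fc_hz, fraction):
    """Time for a first-order RC low-pass step response to reach
    `fraction` of its final value: t = tau * ln(1 / (1 - fraction)),
    with tau = 1 / (2*pi*fc)."""
    tau = 1.0 / (2 * math.pi * fc_hz)
    return tau * math.log(1.0 / (1.0 - fraction))

print(settle_time(0.1, 0.999))  # ~11 s: a 0.1 Hz filter settling to 99.9%
print(settle_time(100, 0.999))  # ~11 ms: the 100 Hz case is 1000x faster
```

This matches the "more than 10 seconds" figure for a 0.1 Hz cutoff, and shows why raising the cutoff to 100 Hz restores reasonable dynamics.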
To answer your question: no, it's not correct. Just select your channel, set your multiplexer, and sample away! Signal switching does not produce any significant noise; only the switching of power parts (currents in the range of 0.1 A or more) does.
 

Thread Starter

picstudent

Joined Feb 3, 2009
91
Thanks for your time, I got the idea. But dynamic changes in my voltage are an issue for me.
I use the ADC reading to monitor the current here. This current is actually switched by a high-side P-channel MOSFET. Apart from current logging with 0.1 mA resolution, I need to shut down the MOSFET in case of a load short circuit (possible in my case). So my filter should not slow the response so much that the MOSFET can no longer be protected.
Is such effective over-current protection possible? If so, how fast should I read and respond to a current change?

Thanks again for your time
Roy Thomas
 

DedeHai

Joined Jan 22, 2009
39
It now depends on how fast you need to switch off that short circuit and how you are detecting that a short circuit has actually occurred.
How fast a short-circuit current destroys your FET depends on how big the short-circuit current is and what the rated current (and overcurrent) of your FET is. Maybe 0.1 seconds is fast enough. A cutoff frequency of 1 kHz should still work; then your filter output should be fast enough (in the range of 1 ms).
By the way, the filter does not add a pure delay (as in the output staying constant and then jumping later); the voltage just rises more slowly. So if you trigger on a sudden, unexpected increase in current, you can actually switch off your FET with a delay of less than 0.1 ms even with a 1 kHz filter.
If you detect a short circuit only when the voltage exceeds a certain (higher) level, it might take almost the whole 1 ms.
I would need a lot more insight into your circuit to tell you exactly what you need.
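For an ideal single-pole RC filter these delays can be computed exactly; real filters differ somewhat, but as a sketch of why a low threshold trips so much faster than waiting for the output to settle:

```python
import math

fc = 1_000.0                     # filter cutoff, Hz
tau = 1.0 / (2 * math.pi * fc)   # time constant, ~159 us

def crossing_time(x):
    """Time for the filtered step output to cross fraction x of the
    final value: solve 1 - exp(-t/tau) = x for t."""
    return tau * math.log(1.0 / (1.0 - x))

print(crossing_time(0.02) * 1e6)  # ~3.2 us to cross a 2% threshold
print(crossing_time(0.90) * 1e3)  # ~0.37 ms to reach 90% of the step
```

So a trip threshold set far below the fault current's final value crosses in microseconds, well under the 0.1 ms figure above, while waiting for the output to settle costs most of a millisecond.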
 

DedeHai

Joined Jan 22, 2009
39
Your FET can hold up to 70 A for a maximum time of 0.3 ms. The MOSFET itself takes only about 300 ns to switch off after it gets the signal. So there seems to be enough time to switch it off once you have detected the over-current. Your normal current should not exceed, say, 0.6 A, right? So you can set your over-current detection to the corresponding voltage level, or the corresponding ADC value respectively.
A filter with a cutoff of 1 kHz needs about 1 ms to reach about 90% of the final output voltage. BUT: you don't need 90%, you only need 2% (at most), since you are already detecting at 0.6 A.
See here: http://upload.wikimedia.org/wikipedia/commons/a/a3/Sprungantwort_eines_RC-Systems.png

The problem is: your short-circuit current could exceed those 70 A, since your total resistance (assuming your electrolysis probe has less than 0.1 Ω) is probably below 0.8 Ω. So either you raise the cutoff frequency to 10 kHz (leaving you with all that nasty noise), or you increase the inductance in your current path. By putting a coil anywhere in the main path you make the current rise more slowly, giving you more time to switch off your FET. It should not affect normal operation, which is just a DC current. You could use an air-core coil, because it does not saturate at higher currents.
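The extra time a series inductor buys can be estimated from di/dt = V/L: into a short, the current ramps roughly linearly. A sketch, where the 12 V supply and the 100 µH trial value are assumptions, not figures from the thread:

```python
V = 12.0         # assumed supply voltage driving the shorted path
L = 100e-6       # trial series inductance, 100 uH
I_limit = 70.0   # FET short-time current limit quoted above

# With the output shorted, di/dt = V/L, so the current ramps linearly
# and reaches I_limit after t = L * I_limit / V.
t_to_limit = L * I_limit / V
print(t_to_limit * 1e6)  # ~583 us available to detect and switch off
```

Compared to the FET's 0.3 ms survival window, even a modest inductor stretches the current rise enough that the detect-and-switch-off chain has comfortable margin.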
 

Thread Starter

picstudent

Joined Feb 3, 2009
91
Thanks a lot for the analysis.
> Your FET can hold up to 70 A for a maximum time of 0.3 ms. The MOSFET itself takes only about 300 ns to switch off after it gets the signal. So there seems to be enough time to switch it off once you have detected the over-current. Your normal current should not exceed, say, 0.6 A, right? So you can set your over-current detection to the corresponding voltage level, or the corresponding ADC value respectively.
> A filter with a cutoff of 1 kHz needs about 1 ms to reach about 90% of the final output voltage. BUT: you don't need 90%, you only need 2% (at most), since you are already detecting at 0.6 A.
So the timing is OK. If I drive my PIC MCU at 20 MHz, I think I have enough time to track all 12 channels.
Also, I plan to use a hardware comparator cutoff driven by the output of the first op-amp amplifier on the current shunt: the op-amp output can pull down the drive for the opto LED. What is your opinion? But that will then create an ON/OFF sequence for as long as the short remains, right? How can we overcome that? Maybe some RC delay to postpone turning the MOSFET on again?
> The problem is: your short-circuit current could exceed those 70 A, since your total resistance (assuming your electrolysis probe has less than 0.1 Ω) is probably below 0.8 Ω. So either you raise the cutoff frequency to 10 kHz (leaving you with all that nasty noise), or you increase the inductance in your current path. By putting a coil anywhere in the main path you make the current rise more slowly, giving you more time to switch off your FET. It should not affect normal operation, which is just a DC current. You could use an air-core coil, because it does not saturate at higher currents.
I think noise is to be avoided at any cost since I am using a 13-bit ADC, so I will introduce an inductor. How can we arrive at an optimum value for the inductor?
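One rough way to size it: require that the current ramp V/L cannot reach the FET's 70 A limit within the total reaction budget (filter threshold crossing + ADC/comparator response + FET turn-off). A sketch, assuming a 12 V supply and using the FET's 0.3 ms survival time as the budget (the voltage is an assumption, not stated in the thread):

```python
V = 12.0          # assumed supply voltage across the shorted path
t_react = 0.3e-3  # reaction budget: the 0.3 ms the FET survives 70 A
I_trip = 0.6      # over-current detection threshold from above, A
I_max = 70.0      # absolute FET limit, A

# The inductor must keep di/dt = V/L from carrying the current
# from the trip level to I_max before the FET is actually off:
L_min = V * t_react / (I_max - I_trip)
print(L_min * 1e6)  # ~52 uH minimum -> round up to a standard 68 or 100 uH
```

Rounding up to the next standard value adds margin for a supply voltage or reaction time worse than assumed; with an air-core coil, saturation is not a concern.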
Thanks again
 