What makes my question different from the average one is that the intended application is reading what has proved to be a rather noisy mechanical encoder, not a manual switch. In fact, said encoder consists of a pair of standard magnetic reed switches. And no, I cannot change that component; adapting to its behavior is a must within the scope of my project. The reeds in question can switch a 3.3 V signal at up to 200 Hz. In reality, I doubt they'll ever exceed 100 Hz, but I'm designing with some margin here.
As an aside, I found this very immersive (and entertaining) series of articles by Max Maxfield to be quite enlightening on what proved to be a far more sophisticated subject than I first thought:
At this point, I've decided that a hardware solution would be best for my application. This is because my project is based on the Microchip PIC16LF1825 MCU, which has interrupt-on-change pin monitoring. The only debounce algorithm I know of works by monitoring the historical values of the switch state, so it requires a polling technique rather than an interrupt-driven one; and my project's architecture relies heavily on monitoring the encoder's state through interrupts, not polling. Of course, I could be very wrong in this assessment, so I invite anyone with greater knowledge than mine to explain an alternate technique for properly implementing an interrupt-on-change-driven debounce algorithm.
Anyway, among the hardware options, I found:
- Custom dedicated logic gate driven circuits
- RC (and a diode) filters
- Specialized IC's
Options #1 and #3 usually require sourcing additional power for that specific function, and since my project is battery-powered, I'm trying to avoid that scenario. That said, I've found circuits out there that draw no more than 20–40 µA and look rather tempting, so I'm keeping an open mind in this regard.
Option #3 is probably the worst of all, because every chip I found has a fixed sampling period of 20 ms or more built in, and my circuit has to work at frequencies of up to 200 Hz (a 5 ms period).
That leaves me with the primitive, but perhaps most effective, RC filter — except that I'm going to have to be very careful with its component values. My main concern is that the reed switch sits up to 5 m away from the main circuit and I'd like to feed it a 3.3 V signal, so inductive kickback and voltage drop could be a problem.
And finally, just so you know all there is to know about my circuit: I could use a 12 V signal for the reed switch if I wanted to (that's the system's main battery voltage) and then connect it to my MCU through a voltage divider plus the RC filter. The restriction would be that no more than a few µA may be drawn in that scenario, and I don't know whether there's a significant advantage over simply using 3.3 V. Also, this is not an industrial application, so ambient EMI should be rather small.
This is the circuit that I'm currently considering:
My MCU already has a Schmitt-trigger buffer at its input, so there's no need to add the one shown in the image above. The MCU also has an internal pull-up resistor that can be enabled/disabled on demand, except that it is placed at the gate's input, not at the switch's output as shown.