I am not a student; I am just new to messing around with circuits and have a newbie question for a hobby project.
I programmed a microcontroller to talk to an external device over an asynchronous serial interface at CMOS levels. My program needs to detect when the cable to the external device is disconnected.
My first idea was to put a pull-down resistor from RX to ground, so that when the cable is disconnected I would receive all logical spaces and therefore a "modem break" on my serial port. The problem with that approach: if the resistor value is too high (>500 Ohm), the microcontroller does not see the spaces (I guess the input is still read as high?). On the other hand, some external devices don't like it if the resistance is too low (<1000 Ohm) and refuse to operate.
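On the firmware side, assuming the pull-down works and the UART reports framing errors, the break idea could be turned into a disconnect detector like this. This is only a sketch of the approach, not code for any specific part: the structure, threshold, and the idea of requiring several consecutive break-like events (a 0x00 data byte with a framing error) before declaring a disconnect are all my assumptions.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical disconnect detector: a UART "break" typically shows up as a
 * received 0x00 byte with the framing-error flag set.  Requiring several in a
 * row avoids false alarms from a single corrupted byte.  The threshold value
 * is an assumption; tune it for your baud rate and polling interval. */
#define BREAK_THRESHOLD 3

typedef struct {
    unsigned consecutive_breaks;
    bool disconnected;
} link_monitor_t;

void link_monitor_init(link_monitor_t *m) {
    m->consecutive_breaks = 0;
    m->disconnected = false;
}

/* Call this from your UART receive handler with each received byte and the
 * framing-error flag your UART hardware reports for that byte. */
void link_monitor_on_rx(link_monitor_t *m, uint8_t data, bool framing_error) {
    if (framing_error && data == 0x00) {
        /* Looks like a break: the line is stuck at space (low). */
        if (++m->consecutive_breaks >= BREAK_THRESHOLD)
            m->disconnected = true;
    } else {
        /* Any valid traffic means the cable is (back) in place. */
        m->consecutive_breaks = 0;
        m->disconnected = false;
    }
}
```

The hardware-specific part (reading the data register and framing-error bit) depends entirely on which microcontroller you use, so it is left out here.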
Am I missing something obvious, or is there a better way to detect a disconnected serial connection?
thanks,
MGX