Hey RF experts, I hope I am not re-posting a similar question, but I couldn't find any answers when I searched. I am primarily a firmware programmer with limited exposure to RF, and I am running into RF performance issues with a garage-door-opener type of application. The product uses the ~433 MHz band, and we are using SoCs that handle the modulation and demodulation for us. I am currently using FSK with Manchester coding at 8k symbols/sec.
I found that this works very well when there is line of sight between the transmitter and receiver, but whenever there is non-line-of-sight in an enclosed environment, like a garage with reflecting walls and ceiling, the data stream gets clobbered beyond what conventional forward error correction codes such as Hamming can repair. I did not observe small bursts of corrupted bits; rather, the whole packet is so badly corrupted that correctly preserved bits are the exception rather than the norm. I am guessing severe intersymbol interference?
I am not able to disclose the modem chips used due to NDAs, but I am wondering if there are any settings I can play with to improve the bit error rate. I tried to see whether I could configure the sampling window of the receiver so that it uses the cleanest part of each symbol and avoids the part spilled over from the previous symbol, but I can't find any setting that allows this. To begin with, I don't fully understand intersymbol interference. All the literature I found describes ISI of binary pulses, and I don't really understand how two adjacent RF symbols blend into each other.
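To make my ISI question concrete, here is a toy baseband sketch of what I think is happening (all numbers made up; bipolar +/-1 pulses stand in for the FSK symbol stream, and the channel is just a direct path plus attenuated echoes; real FSK and a real channel are of course more complicated):

```python
# Toy illustration of intersymbol interference from multipath echoes.
# Symbols are rectangular bipolar pulses; the "channel" adds delayed,
# attenuated copies of the transmit waveform on top of the direct path.
SAMPLES_PER_SYMBOL = 8

def modulate(bits):
    """Each bit becomes SAMPLES_PER_SYMBOL samples of +1 or -1."""
    return [1.0 if b else -1.0 for b in bits for _ in range(SAMPLES_PER_SYMBOL)]

def multipath(samples, echoes):
    """Direct path plus echoes, each given as (delay_in_samples, gain)."""
    out = list(samples)
    for delay, gain in echoes:
        for i, s in enumerate(samples):
            if i + delay < len(out):
                out[i + delay] += gain * s
    return out

def demodulate(samples):
    """Decide each bit from the sign of the mid-symbol sample."""
    mid = SAMPLES_PER_SYMBOL // 2
    return [samples[i + mid] > 0
            for i in range(0, len(samples), SAMPLES_PER_SYMBOL)]

bits = [1, 0, 1, 1, 0, 0, 1, 0]

# Short echo (delay well under one symbol): the spill-over lands inside
# the same symbol, so every mid-symbol decision still comes out right.
short = demodulate(multipath(modulate(bits), [(1, 0.3)]))

# Delay spread near a full symbol time: the echoes arriving at the
# mid-symbol sample carry the PREVIOUS symbol, and when their combined
# gain exceeds the direct path, every bit that follows a transition flips.
severe = demodulate(multipath(modulate(bits), [(7, 0.7), (9, 0.6)]))
```

If this model is roughly right, it would explain why my errors are not small bursts: with delay spread comparable to the 125 us symbol time, every symbol transition gets hit, not just an occasional bit.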
These settings are available according to the data sheets:
Transmitter side:
- Gaussian shaping
- pre-emphasis
Receiver side:
- channel bandwidth
Any way of improving the bit error rate? Thanks folks.