I just began designing a circuit for digital serial communication between two PIC18F452 microcontrollers, using bit RB0 for receive and RB1 for transmit. The information is in one-byte packets, but with a start bit and a parity check bit, so one frame of information is 10 bits long, two of those bits being overhead as far as the transmitted or received data is concerned.
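For concreteness on the transmit side, here is one way such a ten-bit frame could be assembled. This is a minimal sketch in C under two assumptions of mine: even parity, and a start bit of 1 to match the on/off-keyed scheme described further down (note that standard RS-232 idles at the mark level and uses a space, i.e. a 0, as the start bit). The helper names are hypothetical, not from any PIC library.

```c
#include <stdint.h>

/* Even parity over the eight data bits: returns 1 when the
   byte holds an odd number of 1s, so that the data bits plus
   the parity bit together hold an even number of 1s. */
static uint8_t even_parity(uint8_t data)
{
    uint8_t ones = 0;
    for (uint8_t i = 0; i < 8; i++)
        ones += (data >> i) & 1;
    return ones & 1;
}

/* Pack the 10-bit frame into a 16-bit word, bit 0 sent first:
   bit 0 = start bit, bits 1..8 = data (LSB first),
   bit 9 = parity. */
static uint16_t make_frame(uint8_t data)
{
    return (uint16_t)1                          /* start bit */
         | ((uint16_t)data << 1)                /* data bits */
         | ((uint16_t)even_parity(data) << 9);  /* parity    */
}
```

The transmit routine then just shifts this word out on RB1, one bit per bit period.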
My master clock is 4 MHz, so the instruction clock is 1 MHz and one instruction cycle is therefore one microsecond long. I am emulating what I know of the RS-232 protocol, using the first bit sent as the start bit and the tenth bit sent as the parity bit.
When I began to think through the rate of data transmission I saw a problem I had not really been aware of before. I realized that I was converting my microcontroller's 1 MHz instruction clock to a desired rate of 9600 baud, which for a two-level signal is 9600 bits per second. This confused me. The instruction clock ticks so that the rising edge of each pulse is one microsecond removed from the next, but my data stream runs at 9600 bits per second, and a run of identical bits produces no highs and lows in between. So how do I calculate the delay between successive samples of the state of RB0 such that I sample in the middle of each bit period after the start bit has been sent?
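To make the timing concrete: at 9600 baud each bit lasts 1/9600 s ≈ 104.17 µs, i.e. roughly 104 instruction cycles at 1 µs per cycle. The instruction clock and the bit clock don't need to match; the instruction clock just has to be fast enough to count out the bit period. The usual software-UART trick is to wait one and a half bit periods after the start-bit edge so the first sample lands in the middle of data bit 0, then one full bit period between each subsequent sample. Here is a minimal sketch in C, assuming hypothetical helpers delay_us() and read_RB0(); in practice you would shave the few cycles of loop overhead off the delay constants.

```c
#include <stdint.h>

#define BIT_US      104   /* one bit period at 9600 baud (~104.17 us) */
#define HALF_BIT_US  52   /* half a bit period                        */

/* Hypothetical helpers: a cycle-counted delay and a read of
   PORTB bit 0.  Replace with your own routines. */
extern void    delay_us(uint16_t us);
extern uint8_t read_RB0(void);

/* Receive the rest of a 10-bit frame.  Call from the middle
   of the start bit; if you have only just detected the
   start-bit edge, delay_us(HALF_BIT_US) first -- that half
   bit plus the first full-bit delay below is the classic
   1.5-bit-period wait. */
uint8_t receive_frame(uint8_t *parity_ok)
{
    uint8_t data = 0, ones = 0;

    for (uint8_t i = 0; i < 8; i++) {
        delay_us(BIT_US);            /* mid previous bit -> mid this bit */
        uint8_t bit = read_RB0();
        data |= (uint8_t)(bit << i); /* LSB first */
        ones += bit;
    }

    delay_us(BIT_US);                /* middle of the parity bit */
    /* Even parity: data bits plus parity bit should together
       contain an even number of 1s. */
    *parity_ok = (((ones + read_RB0()) & 1) == 0);
    return data;
}
```

Rounding 104.17 µs down to 104 µs costs only about 1.7 µs of drift by the tenth bit, far inside the 52 µs half-bit margin, so every sample stays comfortably mid-bit.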
I have more to say on this, and I have taken a different approach to the problem, but I'll save that for when I have received some intelligent replies. Also, this will be a radio link in the future. There will be no modulation scheme beyond a carrier switched on and off, like Marconi or Morse: on is 1 and off is 0. I realize that Morse code also encodes information in pulse duration (long is a dash, short is a dot), but with digital information I don't see why that is necessary when on can be one, off can be zero, and the data stream is periodic and so is sampled at regular intervals. A start bit of zero means "end of transmission"; as long as each frame has its start bit equal to one, we just continue the algorithm of receiving data.
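Tying that together, the framing decision could sit in an outer loop like the sketch below, reusing receive_frame() and the delay constants from the earlier sketch. It assumes frames are sent back to back, so one bit period after the middle of a frame's parity bit lands in the middle of the next frame's start-bit slot; the buffer handling is mine, purely for illustration.

```c
/* Receive frames until a start bit of 0 (carrier off) signals
   end of transmission.  On entry we are mid start bit: for the
   first frame, detect the start-bit edge and delay_us(HALF_BIT_US)
   before calling this. */
void receive_message(uint8_t *buf, uint8_t max_len, uint8_t *len)
{
    *len = 0;
    while (*len < max_len) {
        if (read_RB0() == 0)       /* start bit 0: transmission over */
            break;
        uint8_t parity_ok;
        buf[(*len)++] = receive_frame(&parity_ok);
        /* A real implementation would flag or discard frames
           where parity_ok == 0. */
        delay_us(BIT_US);          /* mid parity bit -> mid next start bit */
    }
}
```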