What's the difference between RS232 and UART in the context of programming? #2

Thread Starter

ZimmerJ

Joined Dec 9, 2020
55
Ummm... no.


That part is true.


A UART is the physical piece of hardware that handles the sending and receiving of the serial data. It can either be on a chip by itself, such as a 16650 or 8250 or numerous others, or implemented as an on-chip peripheral on a microcontroller chip. But a UART is not a communication protocol; it implements one.


RS-232 is a physical interface specification that ensures devices can be interconnected and successfully send and receive signals, without letting out any of the Magic Smoke.
I am currently working on a Pyboard, with an STM32 chip offering a UART. The datasheet does not mention any protocols, interfaces, or voltage levels. Since it works (I've tried it) but no particular protocol is specified, I figured the UART kind of has its own "default" protocol? Or does it use a specific protocol that just isn't mentioned?
Appreciate any answers
 

Thread Starter

ZimmerJ

Joined Dec 9, 2020
55
New thread created from older thread.
I thought I was supposed to engage in already-created threads rather than creating new ones, oh well.

regarding your answer in older thread:
"UART is a chip that implements part of the RS-232 standard protocol, not the electrical part. Thus it is common practice to refer to the UART as a protocol even though it is just a piece of hardware."

So this is that "default" protocol I am referring to, then? As in, it isn't a full-blown RS-232 protocol, but because it implements parts of it, it works?
 

MrChips

Joined Oct 2, 2009
24,233
The term "protocol" has different meanings depending on interpretation and context.

TTL, RS-232, RS-485, etc. primarily refer to hardware voltage levels.

A UART is a piece of hardware that implements a standard method of transmitting serial data using NRZ (non-return-to-zero) encoding.
The basic UART interface on microcontrollers such as the STM32 MCU transmits data at 0-3.3V "TTL" levels.
UART serial communication itself has its own hardware protocol, which defines the number of data bits, the parity bit, the stop-bit length, and the transmission rate (baud).

These are all hardware-level protocols.

Then there are data-encoding protocols, where specific meanings are assigned to the actual data bits being transmitted. For example, data can be encoded as single-byte binary, multi-byte binary, decimal, hexadecimal, printable text, control characters, etc.

At a higher level, there are various software protocols such as MODBUS, CANBUS, OPTOMUX.

At an even higher level, data can be sent encapsulated in packets, e.g. USB, internet, etc.
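To make that framing layer concrete, here is a small illustrative sketch (not from any particular UART's documentation) of the bit sequence a UART puts on the line for one byte - start bit, data bits LSB first, optional parity, stop bit(s):

```python
def frame_byte(byte, data_bits=8, parity=None, stop_bits=1):
    """Return the logic-level bit sequence a UART would emit for one byte.

    Idle level is 1 (mark); the start bit is 0 (space); data goes LSB first.
    `parity` may be None, 'even', or 'odd'.
    """
    bits = [0]                                            # start bit
    data = [(byte >> i) & 1 for i in range(data_bits)]    # LSB first
    bits += data
    if parity is not None:
        p = sum(data) % 2          # bit that makes the 1-count even
        if parity == 'odd':
            p ^= 1
        bits.append(p)
    bits += [1] * stop_bits        # stop bit(s) return the line to idle
    return bits

# 'A' (0x41) with 8N1 framing: start, 8 data bits LSB-first, one stop bit
print(frame_byte(0x41))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```

Changing `data_bits`, `parity`, or `stop_bits` corresponds directly to the settings both ends must agree on before communicating.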
 

Thread Starter

ZimmerJ

Joined Dec 9, 2020
55
The term "protocol" has different meanings depending on interpretation and context.

TTL, RS-232, RS-485, etc. primarily refer to hardware voltage levels.

A UART is a piece of hardware that implements a standard method of transmitting serial data using NRZ (non-return-to-zero) encoding.
The basic UART interface on microcontrollers such as the STM32 MCU transmits data at 0-3.3V "TTL" levels.
UART serial communication itself has its own hardware protocol, which defines the number of data bits, the parity bit, the stop-bit length, and the transmission rate (baud).

These are all hardware-level protocols.

Then there are data-encoding protocols, where specific meanings are assigned to the actual data bits being transmitted. For example, data can be encoded as single-byte binary, multi-byte binary, decimal, hexadecimal, printable text, control characters, etc.

At a higher level, there are various software protocols such as MODBUS, CANBUS, OPTOMUX.

At an even higher level, data can be sent encapsulated in packets, e.g. USB, internet, etc.
Just amazing, it never ends. But this is great, thanks.
 

Thread Starter

ZimmerJ

Joined Dec 9, 2020
55
The term "protocol" has different meanings depending on interpretation and context.

TTL, RS-232, RS-485, etc. primarily refer to hardware voltage levels.

A UART is a piece of hardware that implements a standard method of transmitting serial data using NRZ (non-return-to-zero) encoding.
The basic UART interface on microcontrollers such as the STM32 MCU transmits data at 0-3.3V "TTL" levels.
UART serial communication itself has its own hardware protocol, which defines the number of data bits, the parity bit, the stop-bit length, and the transmission rate (baud).

These are all hardware-level protocols.

Then there are data-encoding protocols, where specific meanings are assigned to the actual data bits being transmitted. For example, data can be encoded as single-byte binary, multi-byte binary, decimal, hexadecimal, printable text, control characters, etc.

At a higher level, there are various software protocols such as MODBUS, CANBUS, OPTOMUX.

At an even higher level, data can be sent encapsulated in packets, e.g. USB, internet, etc.
Just for clarity, one more question:

If you want to implement RS-232, is the UART hardware always combined with RS-232, or can you set up RS-232 communication with some other type of NRZ encoding? In other words, if RS-232 is present, does that imply there is a UART "beneath" it?
 

MrChips

Joined Oct 2, 2009
24,233
The answer is no. You can use RS-232 to control a relay, for example.

Strictly speaking, RS-232 is an electrical standard that specifies voltage, switching speeds, rise time, fall time, capacitance and resistance on the driver and receiver, maximum voltages, etc.

A UART is a chip or hardware circuit that implements a specific serial format. You do not have to have a UART; the same thing can be done in software using a technique called bit-banging.

You do not have to implement the UART's NRZ protocol either. For example, you may choose to implement a phase-encoded signal, SPI, I2C, 1-Wire, etc. What you do with the signal is then up to you. You can leave it as bare-metal 0-3V or you can add any kind of receiver/driver such as:
  1. open-collector, open-drain
  2. tri-state
  3. 4-20mA current loop
  4. RS-232
  5. RS-422
  6. RS-485
  7. IrDA
  8. fibre-optic
  9. RF
  10. smoke and mirrors:)
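The bit-banging idea above can be sketched in a few lines. This is illustrative only - `set_pin` and `bit_delay` are placeholder callbacks standing in for whatever GPIO write and timing primitives your platform actually provides:

```python
def bitbang_tx(byte, set_pin, bit_delay):
    """Bit-bang one 8N1 byte: drive set_pin(level) once per bit period.

    set_pin(level) -> writes the TX pin; bit_delay() -> waits one bit time.
    """
    set_pin(0); bit_delay()               # start bit
    for i in range(8):                    # data bits, LSB first
        set_pin((byte >> i) & 1); bit_delay()
    set_pin(1); bit_delay()               # stop bit: line back to idle/mark

# Demo with a recording "pin" instead of real hardware:
trace = []
bitbang_tx(0x55, trace.append, lambda: None)
print(trace)  # [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
```

On real hardware, `bit_delay` would busy-wait for 1/baud seconds (about 104 µs at 9600 baud), and interrupts would need to be masked so the bit timing stays accurate.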
 
A UART transmits 7 or 8 bits of information. The receiver can know if one bit flipped based on parity, if parity is used.
The receiver can pause the sender using software or hardware flow control.
The devices are classified as DTE (Data Terminal Equipment) and DCE.
It kinda goes like:
DTR - I have power
DSR - I have power (the DCE side)
RI - Someone is calling (ring indicator)
RTS - I have information (answer the phone)
DCD - Data Carrier Detect (the modem answered)
CTS - It's OK to send it
TxD/RxD/SG - the information to send and receive, and signal ground
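The parity point is worth seeing in miniature. A toy sketch (not tied to any particular UART) of how an even-parity bit lets the receiver notice a single flipped bit:

```python
def even_parity(bits):
    """Parity bit value that makes the total count of 1s even."""
    return sum(bits) % 2

data = [1, 0, 0, 0, 0, 0, 1, 0]     # 0x41, LSB first
p = even_parity(data)               # parity bit the transmitter appends

received = data.copy()
received[3] ^= 1                    # one bit flips in transit
ok = (even_parity(received) == p)
print(ok)  # False: the receiver flags a parity error
```

Note parity only detects an odd number of flipped bits; two flips in the same byte cancel out and go unnoticed, which is why higher-level protocols add checksums or CRCs.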

 
Last edited:

Yaakov

Joined Jan 27, 2019
3,520
The devices are classified as DTE (Data Terminal Equipment) and DCE
DCE = Data Communications Equipment

DTR - Data Terminal Ready (handshaking: the terminal is prepared to handle data)
DSR - Data Set Ready (handshaking: the DCE is prepared to handle data)
RTS - Request to Send (DTE request to DCE for it to send data)
CTS - Clear to Send (DCE telling DTE it can send data)
RTR - Ready to Receive (from DTE to DCE saying it is ready to receive data; overrides RTS when used)
 
My first experience with RS232 was the Bell 103 modem (300 baud). You just hooked up the minicomputer with a 25-foot cable and it worked. Then came the IBM PC and all sorts of weirdness happened. Flow control was based on DTR/DSR instead of RTS/CTS. The roles were reversed. 9-pin connectors. I had an Amiga which had a totally non-standard printer cable compared to IBM's.

Generally, it was instrumentation that had the issues.

Then there was that weird baud rate of 134.5.
 

MrChips

Joined Oct 2, 2009
24,233
Yes, this was one of the weird things that IBM did with the PC.

From the aspects of data communications the IBM PC was not a computer but a computer terminal.
One connects DTE to DCE which is connected to a mainframe. Hence the IBM PC COM port was assigned as DTE, i.e. it was a terminal.:confused:
 

djsfantasi

Joined Apr 11, 2010
7,842
Yes, this was one of the weird things that IBM did with the PC.

From the aspects of data communications the IBM PC was not a computer but a computer terminal.
One connects DTE to DCE which is connected to a mainframe. Hence the IBM PC COM port was assigned as DTE, i.e. it was a terminal.:confused:
Because IBM thought of the PC as a terminal, not a computer. Their initial product run was ~8,000 units because IBM drastically underestimated demand. I worked on a project that created an intelligent terminal out of a PC and we were going to order several hundred (800?) units. We were treated royally, including being flown out to White Plains for a strategy meeting. At the time we didn't understand why, but as time passed, our questions were answered.
 
I was tasked to upgrade PDP-11/2 equipment to "something". My conclusion was, "The technology isn't ready yet". The response: "Tough, the money is available". I picked the Macintosh because of longer-than-8.3 filenames and a flat memory model, and because LabVIEW was developed on the Mac first. It was transitioning to the PC, so development was a "moving target".

It was slower than the x-y recorder based technology. That upgrade lasted 17 years until it was upgraded to a PC running LabVIEW and the SMU units which I had wanted to use initially and was told no.

The initial assumption was that we could print after each test to a LaserJet 4. We had to batch print at the end of the day.
The program was cool because it was developed using no hardware, just simulations, and it could read in a file and re-write it with area-adjusted data or other changes. Everything was correctable.

Another system was getting upgraded too, and there was no interfaceable monochromator, so I had to assume I was building one. It was going to control the stepper motors directly with a Rorze controller and had a filter wheel of 4 filters and 2 shutters. Eventually a monochromator became available at a huge cost. The shutters stayed.
 
RS232 is a specification for serial communications between a DTE and DCE (e.g., computer and modem); it defines the electrical characteristics, the 25-way 'D' connector, and the functions of the various signal lines.
The protocol is described as asynchronous because no clock is transmitted at all; instead, a different method of bit synchronization is used.

At the beginning of each transmission a start bit is sent, indicating to the receiver that a byte of data is about to follow. Since the idle state of the RS232 line is low (-12V), the start condition is signalled by driving the line high (+12V) for one bit period. This guarantees a transition on the line, so the receiver always knows when the first edge of the data burst occurs.

The start bit lets the receiver synchronize to the data bits, since it can see the rising edge of the signal on the line. This means the receiver can time its own sample point to the middle of each bit, to decide whether the bit is a data zero or a data one.

A UART is a Universal Asynchronous Receiver and Transmitter - an electronic circuit which handles communication over an asynchronous serial interface, very often an RS232 interface. The UART chip is responsible for just what its name implies: transferring data to and from the serial port. The 8250 is quite old and has been almost entirely replaced (the 8250 UART shipped WITH the original IBM PC - and I mean the original). Its first replacement was the 16450 UART, which had the same general architecture but was somewhat faster and supported higher baud rates. The 16450 was replaced by the 16550, a UART which featured a built-in 16-byte receive FIFO. A close cousin of the 16550 is the 16650, a chip which sports a 32-byte receive FIFO.

That's all I remember about the difference between RS232 and UART
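The "sample at the middle of each bit" idea can be shown with a toy software receiver. This sketch works at TTL logic levels (idle = 1, start = 0; the RS232 line voltages described above are simply the inverted version of this), with an invented 16x oversampling rate:

```python
OVERSAMPLE = 16  # receiver samples per bit period (illustrative choice)

def decode_byte(samples):
    """Recover one 8N1 byte from an oversampled logic-level capture.

    Finds the falling edge of the start bit (idle=1 -> start=0), then
    samples each data bit at the middle of its bit period.
    """
    edge = next(i for i in range(1, len(samples))
                if samples[i - 1] == 1 and samples[i] == 0)
    byte = 0
    for bit in range(8):
        # centre of data bit `bit`: half a bit past the edge, plus
        # one bit period for the start bit and `bit` periods of data
        mid = edge + OVERSAMPLE // 2 + (bit + 1) * OVERSAMPLE
        byte |= samples[mid] << bit   # LSB first
    return byte

# Build a capture of 'A' (0x41): idle, start, 8 data bits, stop, idle
bits = [1, 1, 0] + [(0x41 >> i) & 1 for i in range(8)] + [1, 1]
samples = [b for b in bits for _ in range(OVERSAMPLE)]
print(hex(decode_byte(samples)))  # 0x41
```

Real UARTs do essentially this in hardware, which is why a small baud-rate mismatch between the two ends is tolerable: the sample point only drifts off the bit centre gradually over the 10-bit frame.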
 

BobaMosfet

Joined Jul 1, 2009
1,850
I am currently working on a Pyboard, with an STM32 chip offering a UART. The datasheet does not mention any protocols, interfaces, or voltage levels. Since it works (I've tried it) but no particular protocol is specified, I figured the UART kind of has its own "default" protocol? Or does it use a specific protocol that just isn't mentioned?
Appreciate any answers
You asked about this in the context of programming- so, let's make it simple.

A UART/USART is normally a TTL/logic-level, two-wire (for full duplex) Tx/Rx (transmit/receive) peripheral that most MCUs and many CPUs have. It manages the signaling for you, so all you have to do is simple coding to put data into a register or read data from a register, once you have the UART/USART configured (rate, parity, etc.).

This is far simpler than writing your own RS232, RS485, or other handler. What most people do is use the UART/USART to talk to an adjunct IC like a MAX232CPE (RS232), letting it do the heavy lifting for the higher voltage levels and the more complex RS232 signaling.
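From the programming side, that register-level interaction looks roughly like the sketch below. The register names here are invented for illustration - a fake object stands in for the memory-mapped peripheral (real STM32 register names and status flags differ; see the reference manual):

```python
class FakeUART:
    """Toy stand-in for a memory-mapped UART (register names invented)."""
    def __init__(self):
        self.tx_log = []
        self.rx_fifo = list(b'ok')        # pretend these bytes arrived

    @property
    def rx_ready(self):
        return bool(self.rx_fifo)         # "RX not empty" status flag

    def write_data_reg(self, byte):
        self.tx_log.append(byte)          # hardware would serialize this

    def read_data_reg(self):
        return self.rx_fifo.pop(0)

def send(uart, payload):
    for b in payload:
        uart.write_data_reg(b)            # real code also polls "TX empty"

def recv_all(uart):
    out = bytearray()
    while uart.rx_ready:                  # poll the status flag
        out.append(uart.read_data_reg())
    return bytes(out)

u = FakeUART()
send(u, b'hi')
rx = recv_all(u)
print(rx)  # b'ok'
```

On a Pyboard the same pattern is wrapped for you by MicroPython's `machine.UART` class (`uart.write(...)`, `uart.read(...)`), so you rarely touch the registers directly.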
 

MrChips

Joined Oct 2, 2009
24,233
RS232 is a specification for serial communications between a DTE and DCE (e.g., computer and modem); it defines the electrical characteristics, the 25-way 'D' connector, and the functions of the various signal lines.
The protocol is described as asynchronous because no clock is transmitted at all; instead, a different method of bit synchronization is used.

At the beginning of each transmission a start bit is sent, indicating to the receiver that a byte of data is about to follow. Since the idle state of the RS232 line is low (-12V), the start condition is signalled by driving the line high (+12V) for one bit period. This guarantees a transition on the line, so the receiver always knows when the first edge of the data burst occurs.

The start bit lets the receiver synchronize to the data bits, since it can see the rising edge of the signal on the line. This means the receiver can time its own sample point to the middle of each bit, to decide whether the bit is a data zero or a data one.

A UART is a Universal Asynchronous Receiver and Transmitter - an electronic circuit which handles communication over an asynchronous serial interface, very often an RS232 interface. The UART chip is responsible for just what its name implies: transferring data to and from the serial port. The 8250 is quite old and has been almost entirely replaced (the 8250 UART shipped WITH the original IBM PC - and I mean the original). Its first replacement was the 16450 UART, which had the same general architecture but was somewhat faster and supported higher baud rates. The 16450 was replaced by the 16550, a UART which featured a built-in 16-byte receive FIFO. A close cousin of the 16550 is the 16650, a chip which sports a 32-byte receive FIFO.

That's all I remember about the difference between RS232 and UART
That is a reasonable description of RS232 and UART, except that there is no recoverable clock in the signal with which to determine the middle of a bit. Both devices (transmitting UART and receiving UART) must agree on the baud rate in advance.

There are auto-baud detection schemes, which require that certain characters be transmitted first.
If the byte pattern is odd (i.e. its LSB is 1), this guarantees that the start bit ends with a signal edge, so the receiver can measure the start bit's width and derive the baud rate from it.

https://www.st.com/resource/en/appl...ic-baud-rate-detection-stmicroelectronics.pdf
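A toy version of the measurement that app note describes - if the first character's LSB is 1, the low pulse of the start bit is exactly one bit long, so its width gives the baud rate (the 16x sample rate here is an invented example):

```python
def detect_baud(samples, sample_rate):
    """Estimate baud from the start-bit width.

    Assumes the first character's LSB is 1, so the low pulse between
    the falling and rising edges is exactly one bit period long.
    """
    fall = next(i for i in range(1, len(samples))
                if samples[i - 1] == 1 and samples[i] == 0)
    rise = next(i for i in range(fall, len(samples)) if samples[i] == 1)
    return sample_rate / (rise - fall)

# Simulate 9600 baud sampled at 153600 Hz (16 samples per bit):
bits = [1, 0, 1]                 # idle, start bit, data LSB = 1
samples = [b for b in bits for _ in range(16)]
print(detect_baud(samples, 153600))  # 9600.0
```

This is why auto-baud schemes ask for a specific first character (often 0x55 or a carriage return): it guarantees edges in known places to measure against.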
 

Delta Prime

Joined Nov 15, 2019
1,150
Hello
:)
smoke and mirrors
I got that one and the wonderful sentiment behind it.
With the utmost respect, quoting a lost mentor, "OBW0549": "RS-232 is a physical interface specification that ensures devices can be interconnected and successfully send and receive signals, without letting out any of the Magic Smoke.":)
 