I've looked at datasheets for various Ethernet cards and the like, and googled phrases similar to the title of this thread, but found no answers. I found plenty of bits/sec specs, but I want to know how fast the hardware actually scans the input. I figure 10 Gb/s maxed out equates to a 5 GHz fundamental (square wave, 101010...), so the card would (wild guess) need to sample at least 10x faster than that, at 50 GHz. But that sounds very high, so I question myself. Am I incorrect in assuming that a NIC samples the signal like an oscope would? How does it work?
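
For what it's worth, here's the back-of-envelope math I'm doing, written out as a little Python sketch. The 10x oversampling factor is purely my own assumption, not something I pulled from a datasheet:

    # Rough estimate of the sample rate I'd expect a 10 Gb/s NIC to need,
    # assuming it digitizes the line like a scope (which is exactly what I'm asking).
    line_rate_bps = 10e9                  # 10 Gb/s link speed
    fundamental_hz = line_rate_bps / 2    # 101010... pattern -> one full cycle per 2 bits = 5 GHz
    oversampling = 10                     # my wild-guess oversampling factor
    sample_rate_hz = fundamental_hz * oversampling
    print(f"fundamental: {fundamental_hz/1e9:.0f} GHz, sample rate: {sample_rate_hz/1e9:.0f} GS/s")
    # prints: fundamental: 5 GHz, sample rate: 50 GS/s

Is that 50 GS/s figure anywhere near how the receiver actually works, or is my scope-style mental model wrong from the start?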