Processing crazy high frequency signals (like HD video)

Discussion in 'Embedded Systems and Microcontrollers' started by diamondman, Oct 23, 2010.

  1. diamondman

    Thread Starter · New Member · Joined May 23, 2010

    I am probably jumping in too deep with this project, but here goes. I am working on a project that will decode HDMI (using an IC such as the NXP TDA19977, http://www.nxp.com/documents/data_sheet/TDA19977A_TDA19977B.pdf), modify the video stream with a microcontroller, and then re-encode the whole stream back to HDMI (using another chip I have found).

    As I researched what I was getting into, I found that HDMI can easily reach nearly 4 gigabits/s.
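The "nearly 4 gigabits/s" figure checks out with a quick back-of-the-envelope calculation. This is a sketch assuming standard 1080p60 timing (2200 × 1125 total pixels including blanking) and HDMI's 10-bits-per-byte TMDS encoding:

```python
# Back-of-envelope HDMI bandwidth for 1080p60.
# Assumed timing: 2200 x 1125 total pixels (incl. blanking) at 60 Hz.
pixel_clock = 2200 * 1125 * 60        # 148,500,000 Hz (148.5 MHz)

bits_per_pixel_per_lane = 10          # TMDS sends 10 bits per 8-bit byte
lanes = 3                             # one TMDS data lane per color channel

per_lane = pixel_clock * bits_per_pixel_per_lane   # 1.485 Gbit/s per lane
total = per_lane * lanes                           # 4.455 Gbit/s aggregate

print(per_lane)   # 1485000000
print(total)      # 4455000000
```

So the aggregate serial rate really is about 4.5 Gbit/s even before audio and auxiliary data are considered.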

    I can't understand this! Obviously some of this is overhead, and I understand why it has to go this fast (supporting 1080p and 8 channels of audio), but how can anything read those signals? Looking at some logic gates online, I see parts with very short response times (reliably a few hundred megahertz for good parts), but they can't come close to that speed (obviously the miniaturized elements inside chips are faster, but there is a limit)! And the high-performance chips like CPUs that do reach those speeds produce enormous amounts of heat.

    So my confusion is how a $9 chip can decode this bandwidth into video channels while using hardly any power and producing almost no heat. And furthermore, how would I be able to use a microcontroller to read, let alone modify, the stream when no microcontroller comes ANYWHERE near that speed?
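Part of the answer is that only a tiny front-end circuit ever sees the full serial bit rate: a deserializer (the "D" in SerDes) collects bits into parallel words, and everything downstream runs at the much slower word rate. Here is a toy model of that idea (illustrative only; real deserializers are dedicated analog/digital hardware, not software):

```python
# Toy model of a SerDes front end: only a small shift register runs at
# the serial bit rate; everything after it sees 10-bit parallel words
# at one tenth that rate (the pixel clock, for HDMI).
def deserialize(bits, word_width=10):
    """Group a serial bit stream into parallel words (LSB first)."""
    words = []
    usable = len(bits) - len(bits) % word_width
    for i in range(0, usable, word_width):
        word = 0
        for j, bit in enumerate(bits[i:i + word_width]):
            word |= bit << j          # shift each bit into place, LSB first
        words.append(word)
    return words

stream = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1] * 2   # 20 serial bits
print(deserialize(stream))                     # [845, 845]
```

The microcontroller (or the logic inside the $9 chip) never touches individual serial bits; it works on these parallel words at a clock rate ordinary logic can handle.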

    TL;DR: HOW CAN HDMI RUN AT SUCH A HIGH FREQUENCY, AND HOW CAN CHEAP, LOW-POWER HARDWARE READ IT?
     
  2. bitrex

    Member · Joined Dec 13, 2009

    It isn't compression - HDMI video is uncompressed. The trick is that the TMDS *clock* lane only runs at up to 340 MHz (HDMI 1.3), which is UHF territory - vacuum tube TVs from the 1950s handled signals at those frequencies without much trouble! The multi-gigabit figure comes from each data lane serializing 10 bits per clock period. Only a tiny dedicated serializer/deserializer circuit runs at the full bit rate; everything behind it works on parallel words at the clock rate.
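The clock-to-bit-rate relationship is easy to work out. This sketch assumes the HDMI 1.3 single-link ceiling of 340 MHz on the TMDS clock lane:

```python
# Relationship between the TMDS clock lane and the per-lane serial rate.
# The clock lane toggles at the pixel clock (up to 340 MHz in HDMI 1.3),
# while each data lane shifts out 10 bits per clock period.
tmds_clock_max = 340_000_000                       # Hz, HDMI 1.3 ceiling
bits_per_clock = 10                                # one 10-bit TMDS symbol
max_lane_rate = tmds_clock_max * bits_per_clock    # 3.4 Gbit/s per lane

print(max_lane_rate)   # 3400000000
```

So the "frequency" a receiver's ordinary logic has to track is the 340 MHz clock, not the 3.4 Gbit/s bit stream.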
     