I am probably jumping in too deep with this project, but here it goes. I am working on a project that will decode HDMI (using an IC such as http://www.nxp.com/documents/data_sheet/TDA19977A_TDA19977B.pdf), modify the video stream with a microcontroller, and then re-encode the whole stream back to HDMI (using another chip I have found).
As I looked into what I was getting myself into, I found that HDMI can easily run at nearly 4 gigabits per second.
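For context, that figure checks out with a back-of-the-envelope calculation for 1080p at 60 Hz. The numbers below are standard 1080p/TMDS parameters (not specific to any chip), and the estimate ignores blanking intervals, so the actual on-the-wire rate is somewhat higher:

```python
# Rough estimate of the HDMI data rate for 1080p60 video.
width, height = 1920, 1080      # active pixels per frame
fps = 60                        # frames per second
bits_per_pixel = 24             # 8 bits each for R, G, B

# Active video payload, in bits per second
pixel_data = width * height * fps * bits_per_pixel

# TMDS uses an 8b/10b-style encoding, adding 25% line overhead
tmds_rate = pixel_data * 10 // 8

print(pixel_data / 1e9)  # ~2.99 Gbit/s of raw pixel data
print(tmds_rate / 1e9)   # ~3.73 Gbit/s across the three TMDS data lanes
```

So even before audio and blanking are counted, 1080p60 video alone gets you most of the way to 4 Gbit/s.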
I can't understand this! Obviously some of this is overhead, and I understand why it has to go this fast (supporting 1080p and 8 channels of audio), but how can anything read those signals? Looking at some logic gates online, I see some with very short response times (reliably a few hundred megahertz for good parts), but they can't come anywhere close to that speed (obviously the miniaturized elements inside chips are faster, but there is a limit). And the high-performance chips like CPUs that do reach those speeds produce enormous amounts of heat.
So my confusion is over how a $9 chip can decode this bandwidth into video channels while using hardly any power and producing barely any heat. Furthermore, how would I be able to use a microcontroller to read, let alone modify, the stream when no microcontroller comes anywhere near that speed?
TL;DR: How can HDMI run at such a high frequency, and how can cheap, low-power hardware exist that can read it?