Greetings all, this is my first post. I've used the website off and on for a while, but today I signed up for an account, and I hope to contribute as well as receive help.
I have been following the saga of the USB Killer (USB Kill.com), observing folly after folly in its evolution. Its use is controversial, but its purpose of encouraging OEMs to include TVS protection in new designs is probably valid.
My question involves TVS diodes. They've been around for a long time; early CMOS chips had them built in, but those didn't offer much protection, for many reasons. Probably the most significant is the voltage limit imposed by the substrate structure that is built into all CMOS devices.
So today we have really small discrete TVS packages that are not built into (onboard) the device needing protection.
There is plenty of data on the web, but are TVS diodes all they're cracked up to be? They have to be installed in parallel, since any resistance in series with a CMOS device severely reduces its throughput due to the time constant formed by the chip's input capacitance and the series resistance.
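To put rough numbers on that (illustrative values only, not from any datasheet), here is a quick sketch of the RC penalty a series resistor would impose on a fast input:

```python
# Rough illustration of why series resistance hurts a fast CMOS input.
# Both values below are assumptions for the example, not datasheet figures.
import math

R_series = 100.0     # ohms, hypothetical series protection resistor
C_input = 10e-12     # farads, ballpark CMOS input capacitance (assumed)

tau = R_series * C_input              # RC time constant
f_3db = 1.0 / (2.0 * math.pi * tau)   # -3 dB bandwidth of the resulting low-pass

print(f"tau   = {tau * 1e9:.2f} ns")       # 1.00 ns
print(f"f_3dB = {f_3db / 1e6:.1f} MHz")    # ~159 MHz
```

Even a hundred ohms turns a fast input into a roughly 159 MHz low-pass filter, which is marginal for something like USB 2.0 high speed, so the protection really does have to sit in parallel.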
The makers of TVS diodes have glowing reports of their usefulness, but…
Along comes the USB Killer, argh. It works fine to demonstrate the need for overvoltage protection on CMOS inputs and outputs. But the model used to define the worst-case static charge is far outdated, and it doesn't address the power dissipation of the diode when a constant overcurrent and overvoltage is applied.
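To make that concrete, assuming the model in question is the classic human body model (100 pF through 1.5 kΩ; the 8 kV level and the 100 W sustained figure below are just illustrative numbers I picked):

```python
# Energy in a single human-body-model (HBM) ESD event vs. a sustained attack.
# The 100 pF / 1.5 kohm values are the standard HBM constants; the 8 kV test
# level and 100 W sustained power are illustrative assumptions.
C_hbm = 100e-12    # farads, HBM capacitance
R_hbm = 1.5e3      # ohms, HBM series resistance
V_hbm = 8000.0     # volts, a severe but plausible test level (assumed)

energy = 0.5 * C_hbm * V_hbm**2   # joules dumped per ESD event
tau = R_hbm * C_hbm               # discharge time constant

P_sustained = 100.0               # watts, the continuous case discussed below
t_equiv = energy / P_sustained    # time for 100 W to deliver the same energy

print(f"HBM event energy  = {energy * 1e3:.1f} mJ")          # 3.2 mJ
print(f"HBM discharge tau = {tau * 1e9:.0f} ns")             # 150 ns
print(f"100 W delivers that every {t_equiv * 1e6:.0f} us")   # 32 us
```

A single ESD event is a few millijoules delivered in a fraction of a microsecond; a sustained 100 W source delivers that same energy every 32 µs, over and over, which is a completely different thermal problem.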
TVS stands for 'transient', and TVS diodes work well only as long as the overvoltage really is a transient; they are not designed for the case where multiple or continuous/malicious overvoltage attacks are perpetrated.
So... my question, finally:
How do these diodes dissipate power if the input voltage is high enough to make the diode clamp? In such an attack, don't these diodes quickly melt from the heat?
Given the very low input capacitance of a TVS diode, the junction must be quite small and can't be very robust, and these parts aren't even mounted on heat sinks!
How long can a microscopic 5 V rated diode junction dissipate 100 watts if the clamping current is a constant 20 A, for instance?
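My own back-of-the-envelope attempt, treating the junction as adiabatic (no heat escapes during the event) and with every number below an assumption rather than a datasheet value:

```python
# Back-of-the-envelope: how fast does 100 W heat a tiny TVS junction?
# Adiabatic approximation; die dimensions and failure temperature are assumed.
rho_si = 2330.0    # kg/m^3, density of silicon
c_si = 700.0       # J/(kg*K), specific heat of silicon

volume = 0.5e-3 * 0.5e-3 * 0.1e-3        # m^3, assumed 0.5 x 0.5 x 0.1 mm region
heat_capacity = rho_si * volume * c_si   # J/K, thermal mass of that region

P = 100.0          # watts: 5 V clamp x 20 A, as in the question
T_start = 25.0     # deg C, ambient
T_fail = 175.0     # deg C, a common silicon junction limit (assumed)

energy_to_fail = heat_capacity * (T_fail - T_start)  # joules to reach the limit
t_fail = energy_to_fail / P                          # seconds at constant 100 W

print(f"thermal mass = {heat_capacity * 1e3:.3f} mJ/K")                # ~0.041 mJ/K
print(f"time to {T_fail:.0f} C at {P:.0f} W = {t_fail * 1e6:.0f} us")  # ~61 us
```

Under those assumptions the junction hits its temperature limit in tens of microseconds, which would explain why TVS datasheets rate peak pulse power against short standard waveforms (8/20 µs, 10/1000 µs) rather than any continuous dissipation.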