When a diode is sold, there's typically a voltage rating not to be
exceeded, e.g. 10 kV. I presume that's the reverse-bias rating. Also say
that voltage rating applies at, say, 10 mA of current.
But what about forward bias? How much current/voltage can the diode
withstand? Would it be safe to convert the reverse-bias ratings into watts
and stay within that figure, even though voltage and current may vary
in forward-bias mode, i.e. as long as that wattage isn't exceeded?
The reason is that I need to run more current through a given diode than
it's rated for. The voltage will be much less than its rating in the forward
direction. Some heavy-duty (microwave) diodes have large resistive (R) drops,
which are unacceptable for my application, hence the need for a lighter-duty
diode that can handle more current at less voltage.
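To make the power-budget reasoning above concrete, here is the arithmetic as a short Python sketch. The 10 kV / 10 mA figures are the hypothetical ones from the question; the ~1 V forward drop is an assumption added for illustration, not a number from any datasheet:

```python
# Hypothetical figures from the question above.
v_reverse_rating = 10e3   # 10 kV reverse-bias voltage rating
i_reverse = 10e-3         # 10 mA, the current the rating is quoted at

# Express the reverse-bias rating as a power budget: P = V * I.
p_budget = v_reverse_rating * i_reverse   # 100 W

# Forward bias: the drop is far lower (assumed ~1 V here), so staying
# within the same wattage would nominally permit much more current.
v_forward = 1.0                           # assumed forward drop, volts
i_forward_max = p_budget / v_forward      # 100 A by this reasoning

print(f"power budget: {p_budget} W")
print(f"forward current within that budget: {i_forward_max} A")
```

Whether that budget transfer is actually safe is exactly what the question asks; the sketch only shows the calculation being proposed.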