How would you calculate the voltage drop of a diode in a diode-resistor series circuit? In real life, the voltage drop of a diode isn't always exactly its nominal forward voltage. The problem is, when I try to calculate the REAL voltage drop of the diode in the circuit, I run into a circular dependency and get confused...

Essentially, the voltage drop comes from a voltage divider between the diode's effective resistance and the resistor:
V_diode = Vin * (R_diode / (R_diode + R))
HOWEVER, the diode's resistance isn't constant: it depends on the voltage across the diode, which means its voltage drop depends on its own voltage drop. You need the variable you want to find in order to find it!
I know it's not usual to think of a diode's resistance, but it's still technically correct.
How would you solve this problem?
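To make the circular dependency concrete: with the Shockley model, KVL gives Vin = I*R + V_diode where I = Is*(exp(V_diode/(n*V_T)) - 1), and that has no elementary closed form for V_diode. Is numerically solving the implicit equation the right approach? Here's a minimal Python sketch of what I have in mind (the function name solve_vd and every component value -- Is, n, V_T, the 5 V supply, the 1 kOhm resistor -- are assumptions I picked for illustration, not from any datasheet):

```python
import math

# All device values below are assumptions for illustration
# (roughly a small-signal silicon diode), not measured data.
I_S = 1e-12    # saturation current (A), assumed
N = 1.0        # ideality factor, assumed
V_T = 0.02585  # thermal voltage at ~300 K (V)

def diode_current(vd):
    """Shockley diode equation: I = Is * (exp(Vd / (n*Vt)) - 1)."""
    return I_S * (math.exp(vd / (N * V_T)) - 1.0)

def solve_vd(vin, r, tol=1e-12):
    """Find Vd where the resistor current (Vin - Vd)/R equals the
    diode current, i.e. solve the implicit KVL equation by bisection."""
    lo, hi = 0.0, min(vin, 1.0)  # Vd can't exceed Vin, and stays well under ~1 V here
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # f(Vd) = resistor current - diode current; f decreases as Vd rises
        if (vin - mid) / r - diode_current(mid) > 0.0:
            lo = mid  # resistor current still exceeds diode current: true Vd is higher
        else:
            hi = mid
    return 0.5 * (lo + hi)

vin, r = 5.0, 1000.0  # assumed 5 V supply, 1 kOhm resistor
vd = solve_vd(vin, r)
print(f"V_diode = {vd:.4f} V, I = {(vin - vd) / r * 1000:.4f} mA")
```

I used bisection rather than plain fixed-point iteration because the bracket [0, Vin] is guaranteed to contain the answer, whereas the exponential can make a naive "plug the guess back in" loop overshoot for small Vin.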
Another question might be: what would the I-V curve look like for a REALISTIC diode-resistor series circuit, and how would you find the total current through the circuit realistically?
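To sketch what I mean by the circuit's I-V curve: sweeping Vin and re-solving at each point (this snippet continues the sketch above and reuses solve_vd with the same assumed values) should trace it out, nearly linear and resistor-dominated above the knee, exponentially tiny below it:

```python
# Sweep the source voltage and re-solve at each point to trace the
# whole circuit's I-V curve (total series current vs. applied Vin).
# Reuses solve_vd from the sketch above; same assumed component values.
r = 1000.0
for k in range(1, 26):
    vin = 0.2 * k                # 0.2 V .. 5.0 V
    vd = solve_vd(vin, r)
    i = (vin - vd) / r           # series current once Vd is known
    print(f"Vin = {vin:4.1f} V  V_diode = {vd:.3f} V  I = {i:.3e} A")
```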