Hey guys, my doubt is pretty straightforward. When you're building, say, a very simple source-resistor-LED circuit and you use Ohm's Law to calculate the resistor value for a given source and target current, how exactly do you determine the precise voltage drop across the LED? I ask because LED datasheets usually give you the typical forward voltage drop at a forward current of 20mA and, if you're lucky, throw in a graph of the I(Vd) curve. So to get 20mA you can simply apply Ohm's Law.

But now imagine you want, say, ~60mA to flow through your LED. Even though a very slight increase in Vd translates into a very steep increase in current, there is still some difference in the drop you have to subtract from the source voltage when calculating the resistor value, and ignoring it might leave you with an appreciably lower current than intended (see the sketch below for the kind of discrepancy I mean).

So my question is: how do you go about these calculations, especially if you have no chart/graph and only the typical Vd at If=20mA, but you want a significantly higher current (50mA and upward)? Is the difference in Vd negligible?
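To make the concern concrete, here is a rough sketch with made-up numbers (a 5V source, a "typical" Vd of 2.0V at 20mA, and a guessed 2.3V at ~60mA; none of these come from a real datasheet), just to show how the assumed drop shifts the resulting current:

```python
# Made-up numbers to illustrate the question, not values from any datasheet.
V_SOURCE = 5.0      # supply voltage (assumed)
I_TARGET = 0.060    # desired LED current: 60 mA
VF_ASSUMED = 2.0    # "typical" Vd quoted at If = 20 mA
VF_ACTUAL = 2.3     # guessed Vd at ~60 mA (would normally be read off the I(Vd) curve)

# Resistor chosen using the 20 mA forward-voltage figure:
R = (V_SOURCE - VF_ASSUMED) / I_TARGET      # -> 50 ohms

# Current you actually get if the drop is really 2.3 V at that operating point:
I_ACTUAL = (V_SOURCE - VF_ACTUAL) / R       # -> 0.054 A, i.e. ~54 mA

print(f"R = {R:.1f} ohm, actual current ~ {I_ACTUAL * 1000:.1f} mA")
```

So with these particular numbers the shortfall is about 6mA out of the 60mA I was aiming for; I'm asking whether that kind of error is normally just accepted, or whether people estimate the higher Vd somehow.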