Can you elaborate a little on the voltage drop across an LED? According to what you explained, its value is constant as long as the supply voltage is greater than Vf. So whether 15 mA is flowing in the circuit (using a 240 ohm resistor) or 20 mA is flowing (using a 180 ohm resistor), the voltage drop across the LED (and consequently across the resistor) will be the same in both cases. If in place of the LED you connected an identical resistor (240 or 180 ohm, as the case may be), the 6 V supply would split in half, leaving 3 V across the 1st resistor, regardless of the amount of current. How do you explain that to a 5th grader? If you used a non-identical resistor, the voltage drop across the 1st resistor would be proportional to its resistance relative to the 2nd resistor. This is not the case for LEDs.

For reference, this is the part of your answer I mean: "Consider the attached schematic. The supply voltage is 6 volts and the Vf is 2.4 volts. That means the LED is dropping 2.4 volts. Now most common LEDs like to run at 20 mA or less. Let's say we want this one to run at 15 mA. This means that we will have to drop (6 − 2.4) = 3.6 volts across the current limiting resistor."
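To make the comparison concrete, here is a minimal sketch of the arithmetic I'm describing, using the values from your example (6 V supply, Vf = 2.4 V) and treating the LED as an ideal constant drop (an assumption, since a real LED's Vf shifts slightly with current):

```python
# Compare the drop across R1 when the second series element is
# (a) an ideal LED with a fixed forward voltage, vs.
# (b) a second resistor (a plain voltage divider).

V_SUPPLY = 6.0   # supply voltage from the example (volts)
VF_LED = 2.4     # LED forward voltage from the example (volts)

def led_circuit(r1):
    """Resistor + ideal LED: the LED clamps its drop at Vf, so R1
    always sees V_SUPPLY - VF_LED; only the current changes with R1."""
    v_r1 = V_SUPPLY - VF_LED
    i = v_r1 / r1
    return v_r1, i

def divider_circuit(r1, r2):
    """Resistor + resistor: the supply splits in proportion to
    resistance (Ohm's law), so R1's drop depends on R2."""
    i = V_SUPPLY / (r1 + r2)
    v_r1 = i * r1
    return v_r1, i

for r1 in (240, 180):
    v, i = led_circuit(r1)
    print(f"LED case,  R1={r1} ohm:    V_R1={v:.1f} V, I={i*1000:.1f} mA")
    v, i = divider_circuit(r1, r1)
    print(f"Divider, R1=R2={r1} ohm: V_R1={v:.1f} V, I={i*1000:.1f} mA")
```

Running this gives 3.6 V across R1 in both LED cases (at 15 mA and 20 mA), but only 3.0 V when the LED is replaced by an identical resistor, which is exactly the difference I'm trying to explain.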
I guess what it comes down to is: how do I explain the effect an LED has on a simple circuit with just a power supply and a resistor, RATHER THAN the effect a resistor has on a simple circuit with just a power supply and an LED? The latter I can explain; the former is a bit trickier.
As always, thanks for your ongoing help.