My goal is to cheaply and simply use a 3.7V Li-Ion cell to drive a 3.0Vf LED at a constant 100mA across the cell's voltage range, 4.2V down to 2.8V.

Here are a couple of the articles I have been reading:

http://socrates.berkeley.edu/~phylab...Files/bsc4.pdf

http://www.vishay.com/docs/70596/70596.pdf

So I think I have the basic understanding down. The negative voltage developed across the source resistor is fed back to the gate, "closing" the JFET as current increases; as current decreases, the gate voltage rises back toward zero, which opens the JFET to more current flow.
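For what it's worth, here's a quick Python sketch of that feedback settling, assuming the usual square-law JFET model ID = IDSS(1 - VGS/VGS(off))^2 and the numbers I'm working with (VGS(off) = -4V, IDSS = 0.2A, RS = 11.7 ohms). The values are just my own figures, not from either app note:

```python
# Self-bias feedback loop, fixed-point style: the resistor sets
# VGS = -ID*RS, and the square law sets ID from VGS; iterating
# shows where the two settle.
VGS_off = -4.0   # V, gate-source cutoff (assumed)
IDSS    = 0.2    # A, saturation drain current (assumed)
RS      = 11.7   # ohm, source resistor (assumed)

id_ = 0.15  # A, deliberately wrong starting guess
for _ in range(100):
    vgs = -id_ * RS                         # feedback from the resistor
    id_ = IDSS * (1 - vgs / VGS_off) ** 2   # square-law drain current

print(f"settles at {id_ * 1000:.1f} mA")  # settles near 100 mA
```

It's just a numerical toy, but it does show the loop converging on roughly the target current regardless of where it starts.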

I have found some JFETs with an IDSS of 200mA:

http://www.fairchildsemi.com/ds/J1/J105.pdf

Using the formula provided for basic source biasing on the first page of the Vishay document, with:

VGS(off) = -4V

ID = 0.1A

IDSS = 0.2A

2.0K

I end up with 11.7 ohms for the source resistor.
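Here's the arithmetic as I ran it, assuming the Vishay formula is the inverted square law, VGS = VGS(off)(1 - sqrt(ID/IDSS)):

```python
# Source-bias resistor calculation, assuming the square-law model
# ID = IDSS * (1 - VGS/VGS_off)^2, inverted to solve for VGS.
import math

VGS_off = -4.0   # V, gate-source cutoff
IDSS    = 0.2    # A, saturation drain current
ID      = 0.1    # A, target LED current

VGS = VGS_off * (1 - math.sqrt(ID / IDSS))  # gate bias needed for ID
RS  = -VGS / ID                             # resistor that develops it: VGS = -ID*RS

print(f"VGS = {VGS:.2f} V")   # -1.17 V
print(f"RS  = {RS:.1f} ohm")  # 11.7 ohm
```

So the formula wants about -1.17V of gate bias, which at 100mA works out to the 11.7 ohms above.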

This, however, doesn't make sense to me. Since no current flows into the gate of the JFET, all of the LED current must go through that 11.7 ohm resistor. But if I calculate the resistor I'd need to run the LED with no JFET at all, I end up with:

3.7V supply, 3.0Vf drop, and 100mA = 7 ohms.
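That's just the plain series-resistor formula, for comparison:

```python
# Plain current-limiting resistor, no JFET: R = (Vsupply - Vf) / I
V_supply = 3.7   # V, nominal cell voltage
V_f      = 3.0   # V, LED forward drop
I        = 0.1   # A, target current

R = (V_supply - V_f) / I
print(f"R = {R:.0f} ohm")  # 7 ohm
```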

So I can't afford more resistance in the JFET circuit than that, which doesn't make sense to me. Where am I misunderstanding? Is my battery voltage just not high enough to make this work properly? If so, should I look into cascading JFETs, or running two with a lower VGS(off) in parallel?
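One way I've been trying to sanity-check this is a simple voltage budget: whatever the battery supplies has to cover the LED drop, the source-resistor drop, and whatever VDS the JFET itself needs. A quick sketch with my numbers (the 11.7 ohm value is from my calculation above, so this inherits any mistake in it):

```python
# Voltage-budget check: V_supply = V_f + ID*RS + VDS, so whatever is
# left after the LED and resistor drops is all the JFET gets for VDS.
V_supply = 3.7    # V, nominal cell voltage
V_f      = 3.0    # V, LED forward drop
RS       = 11.7   # ohm, source resistor from the bias calculation
ID       = 0.1    # A, target current

V_rs = ID * RS                  # drop across the source resistor
V_ds = V_supply - V_f - V_rs    # what's left for the JFET

print(f"resistor drop: {V_rs:.2f} V")   # 1.17 V
print(f"left for VDS:  {V_ds:.2f} V")   # -0.47 V
```

A negative leftover would mean the numbers simply don't fit at 3.7V, which matches my gut feeling that something is off, or that the supply is too low.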

Or maybe there's a much simpler way to get a 100mA current source from a 3.7V Li-Ion cell?

Thanks a bunch for any help offered. I'm ordering some JFETs to play around with but any insight is much appreciated.