Hi, I need to bias a transistor on a sensor chip (the MOSFET is my sensor). My application is very noise-sensitive, since I am trying to measure a drain-current signal change of only ~10 to 80 nA.
Would it be a good idea to bias my transistor using a microcontroller-controlled DAC/current source (for example the ADG1606, with 4.5 Ω on-resistance and leakage current < 5 nA, driving a programmable current output of 100 µA to 1000 µA)? Or should I stick with a benchtop high-resolution, stable current/voltage supply? Which method would introduce the least noise into the experimental system?
If I go with biasing my transistor sensor chip using an on-PCB ADG1606-based bias current, what other parameters of the part should I study to make sure the noise level of its current output is minimal? Do the bias conditions affect the noise of the test chip/transistor at all? Can I tolerate a noisy bias supply and still measure a low-level signal if my measurement instrument is high resolution?
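To make the last question concrete, here is the rough back-of-envelope budget I'm working from: gate/bias voltage noise couples into drain current through the transconductance gm, so a target signal of ~10 nA puts a ceiling on tolerable bias noise regardless of how good the measurement instrument is. The gm value below is an assumption for my device, not a measured number:

```python
# Noise-budget sketch: bias voltage noise -> drain current noise via gm.
# All values are assumptions for illustration, not measured device data.

gm = 1e-4           # transconductance in S (assumed ~100 uA/V for my MOSFET sensor)
signal_min = 10e-9  # smallest drain-current change I want to resolve, in A

# Drain-current noise produced by gate/bias voltage noise: dI_d = gm * dV_gs.
# If I want bias-induced noise at least 10x below the signal, the maximum
# tolerable rms voltage noise on the bias line is:
v_noise_max = (signal_min / 10) / gm
print(f"max tolerable bias-voltage noise: {v_noise_max * 1e6:.1f} uV rms")
```

With these assumed numbers the bias line would need to stay below about 10 µV rms, which is why I'm unsure a microcontroller-driven on-board source can compete with a benchtop supply.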
Thanks a lot.