Hello World.
OK, I'm generally a digital guy so let's get that out of the way.
For one project, the goal is to provide a "clean power supply" to the ADC. In this case the ADC supply pin is both the analog reference and the supply to the ADC core itself. The chip runs on 3.3V.
Problem: the ripple/noise spec for the 3.3V supply isn't given.
So, one implementation I've seen uses a series inductor (100uH) along with the typical bypass capacitors (0.1uF and 10uF).
I want to make sure my digital results are calibrated correctly to the actual voltage at the ADC supply.
Won't the inductor cause a small voltage drop from the main supply to the ADC supply/reference pin? Since the current through it is DC, the drop should come down to the inductor's winding resistance (DCR) rather than its inductance, so even a 100uH or 1000uH part should drop very little, right?
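For my own sanity check, here's a quick back-of-envelope calc. The DCR and ADC supply current below are guesses (pulled from a typical small 100uH inductor and a typical ADC, not from any actual datasheet), so plug in your real numbers:

```python
import math

# DC drop across the filter inductor comes from its winding resistance
# (DCR), not its inductance -- at DC an ideal inductor is a short.
# Assumed values (hypothetical, check your actual datasheets):
DCR_OHMS = 0.5        # assumed DCR for a small 100 uH inductor
I_ADC_A = 0.010       # assumed ADC supply current, 10 mA

v_drop = I_ADC_A * DCR_OHMS
print(f"DC drop across inductor: {v_drop * 1000:.1f} mV")  # 5.0 mV

# Corner frequency of the LC low-pass formed with the 10 uF bypass cap;
# noise well above this gets attenuated before reaching the ADC pin.
L = 100e-6   # 100 uH
C = 10e-6    # 10 uF
f_c = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"LC corner frequency: {f_c / 1000:.2f} kHz")  # ~5.03 kHz
```

So with those assumed numbers the static drop is only a few mV, though it does scale with current, and it's a fixed offset on the reference if the same pin is the ADC reference.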
TIA.