Hi there, I need some help understanding how to use a microcontroller's ADC to read voltages from a variable DC supply (0-12V) and display them on a screen. I know I need a voltage divider circuit since the microcontroller can only handle up to 5V. With a 10-bit ADC and a reference voltage of 5V, I'm calculating the voltage using:
Voltage = (ADC reading / 1024) * reference voltage
For instance, at a 6V supply I'm getting an ADC reading of 512, which computes to 2.5V. How can I adjust the formula to display the correct voltage value (6V) on the screen? This is a general question, not specific to any microcontroller, display, or compiler.