Configuring a 10-bit ADC for 0-12V DC Voltage Measurement

Thread Starter

MTech1

Joined Feb 15, 2023
161
Hello,

I'm currently reading the general description of ADC features in microcontrollers and have a few questions about their configuration. Specifically, I'd like to understand how to set up an ADC to read a 0-12V DC voltage using a 10-bit ADC. As I understand it, a 10-bit ADC provides 1024 steps, with 0 representing 0V DC, 512 representing 6V DC, and 1023 (full scale) representing 12V DC.

To achieve this, I'd like to know what configuration settings are necessary for the reference voltage, sampling time, and trigger mode. If you'd rather not answer in general terms, you can use a specific example of your choice, or you can consider the PIC16F877A microcontroller.

Thank you in advance.
 

BobTPH

Joined Jun 5, 2013
9,138
The reference voltage can be no more than the PIC's Vdd. Normally, I use Vdd itself as the reference, as long as it is from a regulated source.

You cannot measure 12V directly. You use a voltage divider to reduce the range down to the reference voltage.

The sampling frequency depends entirely on other things you have not specified, notably the bandwidth of the signal you are sampling.

Triggering mode depends on how you want to write the program.
 

Thread Starter

MTech1

Joined Feb 15, 2023
161
The reference voltage can be no more than the PIC's Vdd. Normally, I use Vdd itself as the reference, as long as it is from a regulated source.

You cannot measure 12V directly. You use a voltage divider to reduce the range down to the reference voltage.
Of course, I've taken the voltage divider circuit into consideration. I'm aware of the formula for voltage calculation:

Voltage = ADC Value * Reference Voltage / 1024.

However, if you use the PIC's VDD voltage as the reference, you won't be able to measure the full 12V DC range.

To achieve the maximum 12V DC measurement, I think we should use a 12-V reference voltage instead.
 

crutschow

Joined Mar 14, 2008
34,682
Of course, I've taken the voltage divider circuit into consideration. I'm aware of the formula for voltage calculation:
You do not seem to "of course" understand how a voltage divider works or you wouldn't have stated this--
To achieve the maximum 12V DC measurement, I think we should use a 12-V reference voltage instead.
As noted, the reference voltage can't be higher than the ADC's supply voltage.
So we use a voltage divider to reduce the 12V to the maximum the ADC can convert.
 

Thread Starter

MTech1

Joined Feb 15, 2023
161
Then 1023 = 12V after the adjustment by the voltage divider.
Yes, exactly.

you wouldn't have stated this--
I'm aware that directly supplying 12V DC to the ADC input pin of a microcontroller is not possible, as microcontrollers typically operate from a 5V DC supply. To scale down the 12V DC, we commonly use a voltage divider circuit consisting of two resistors. This setup allows us to present a voltage to the ADC input pin within the 0 to 5V range, which is compatible with the microcontroller's operating voltage.

Voltage = ADC Value * Max Voltage / 1024.

Max Voltage here is the full-scale input voltage, 12V DC.
 
Last edited:

Thread Starter

MTech1

Joined Feb 15, 2023
161
Hi MT
Define what you mean by sampling period?
Do you mean sampling interval?
E
I've noticed several terms related to timing in ADC, such as ADC timing, ADC interval, sampling period, sampling interval, and sampling rate. Could you please clarify which of these terms I should focus on when determining the time it takes to obtain each ADC value? I want to ensure I have a clear understanding of the timing aspects in ADC.
 

crutschow

Joined Mar 14, 2008
34,682
The sampling requirements depend upon the maximum frequency of the signal you are converting.
What is the nature of that signal?
 

Thread Starter

MTech1

Joined Feb 15, 2023
161
What is the nature of that signal?
I'm assuming I'm working with a 12V DC supply and using a potentiometer to obtain various DC voltage levels by adjusting it.

My primary goal is to understand the configuration settings required for the ADC. To start, I've taken one specific scenario as an example, measuring 0 to 12V DC with a 10-bit ADC, to better understand the concept.
 

ericgibbs

Joined Jan 29, 2010
18,992
Hi MT,
With due respect, you should post what you think the settings should be to meet the requirements of your ill-defined specification; we can then help you.
There is sufficient information in the data sheets and on the web, which if you read through, will help you to post a draft project.

E
 

Thread Starter

MTech1

Joined Feb 15, 2023
161
Hi MT,
your ill-defined specification, we can then help you.
I've done my best to define the requirements for a better understanding of the ADC configuration settings.

How would you define the requirements to measure the voltage between 0 to 12 volts DC using a 10-bit ADC in a microcontroller?
 

crutschow

Joined Mar 14, 2008
34,682
How would you define the requirements to measure the voltage between 0 to 12 volts DC using a 10-bit ADC in a microcontroller?
The sampling rate/interval need be no faster than how often the signal voltage changes.
If you want to improve the accuracy somewhat and/or minimize noise, you can take several samples per interval and average them in the micro.
Otherwise you just connect the voltage to the ADC input with the proper scaling.
 

sghioto

Joined Dec 31, 2017
5,420
Assuming a 5 volt supply for the micro, a 14k to 10k voltage divider will drop the 12 volts to 5 volts at the ADC input. How often you sample this voltage depends on your need. If, for instance, it's to monitor a battery, maybe once a minute.
 

Thread Starter

MTech1

Joined Feb 15, 2023
161
Assuming a 5 volt supply for the micro, a 14k to 10k voltage divider will drop the 12 volts to 5 volts at the ADC input. How often you sample this voltage depends on your need. If, for instance, it's to monitor a battery, maybe once a minute.
Thank you for sharing the information about the 14k to 10k voltage divider and the voltage drop to 5 volts at the ADC input. It's indeed important to determine the sampling frequency based on the specific application's needs, such as monitoring a battery, which you've suggested could be done once a minute. This means the ADC would take a new reading and convert it to a digital value once every minute.

1 sample per minute = 1/60 samples per second ≈ 0.0167 Hz
 

BobTPH

Joined Jun 5, 2013
9,138
For such a simple application, I suspect the default configuration will work.

Surely you can find sample code somewhere for the PIC16F ADC. Start with that.

Here, I googled it for you:

PIC ADC tutorial
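As a starting point, here is a minimal sketch of the kind of polled-ADC setup those tutorials typically show for the PIC16F877A under the XC8 compiler. It is a hardware configuration fragment, not tested code: the register names are from the 16F877A data sheet, while the 4 MHz clock, channel AN0, and 20 µs acquisition delay are assumptions to verify against your own setup.

```c
#include <xc.h>
#define _XTAL_FREQ 4000000   /* assumed 4 MHz oscillator, needed by __delay_us() */

void adc_init(void) {
    ADCON1 = 0x8E;   /* ADFM=1: right-justified result; PCFG=1110: AN0 analog,
                        remaining pins digital; Vref = Vdd */
    ADCON0 = 0x41;   /* ADCS=01: Fosc/8 conversion clock; channel AN0; ADON=1 */
}

unsigned int adc_read(void) {
    __delay_us(20);                  /* acquisition time before starting */
    GO_nDONE = 1;                    /* start conversion */
    while (GO_nDONE)                 /* hardware clears the bit when done */
        ;
    return ((unsigned int)ADRESH << 8) | ADRESL;   /* 10-bit result */
}
```

The result from `adc_read()` would then be scaled by the divider ratio in software, as discussed earlier in the thread.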
 

schmitt trigger

Joined Jul 12, 2010
918
Use a 3:1 resistor divider. It has a pair of advantages:

—Makes the conversion maths very easy.
—It provides a 0 to 15 volt range. This is important because a cold battery's voltage rises to around 14 volts while it is being charged. The 15 volt range prevents exceeding the input range at Vdd = 5 volts.
 