Can anyone give me a basic overview of Analog to Digital Converters?
I did this search
http://www.google.com/#q=analog-to-...OIG78gaB5cE0&start=0&sa=N&fp=d55d3d558932bb8c
and came up with a lot more info on DACs, but nothing that really explained how the most significant bit and least significant bit are given values, or how a microchip actually interprets and reads them.
The range I will be using will be 0-3.3 V, or maybe 0.3 to 3.3 V? That way 0.3 V could act as a live zero? I'm trying to wrap my brain around how the bits are assigned and how the microchip interprets the bits that represent the analog value.
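To make the question concrete, here's a small sketch of the arithmetic as I currently understand it: an ideal ADC divides its reference range into 2^N equal steps and reports which step the input falls in. The 12-bit resolution and 3.3 V reference below are just placeholder assumptions, not values from any specific part:

```c
#include <stdio.h>

/* Assumed example values, not from a specific chip. */
#define N_BITS 12
#define VREF   3.3

/* Map an input voltage to the code an ideal N-bit ADC would report. */
unsigned adc_code(double vin)
{
    unsigned max_code = (1u << N_BITS) - 1u;       /* 4095 for 12 bits  */
    if (vin <= 0.0)  return 0;                     /* clamp below range */
    if (vin >= VREF) return max_code;              /* clamp above range */
    return (unsigned)(vin / VREF * (max_code + 1u));
}

int main(void)
{
    /* The LSB is worth one step of the reference range: VREF / 2^N. */
    printf("1 LSB = %.4f V\n", VREF / (1u << N_BITS));
    printf("0.3 V  -> code %u\n", adc_code(0.3));  /* possible live zero */
    printf("1.65 V -> code %u\n", adc_code(1.65)); /* mid-scale          */
    printf("3.3 V  -> code %u\n", adc_code(3.3));  /* full scale         */
    return 0;
}
```

If that's roughly right, then the MSB is just the bit worth half the reference (1.65 V here) and the LSB is the bit worth one step (about 0.8 mV at 12 bits), and the micro just reads the resulting binary code out of the ADC's result register. Is that the correct picture?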