+/-10V to two 10V A/D converters

Thread Starter

nalexandrov

Joined Mar 9, 2013
6
I am designing an analytical system for a project. I am using a LOG100JP log differential amplifier. It will output a -10V to +10V signal.

I need to digitize this. One option would be to use a bipolar A/D, but that's very expensive.

What I would prefer to do is use two A/D converters that operate over a 0 to 10V range: one sampling -10V to 0 and the other 0 to +10V. This would give me a very accurate measurement, since using two A/D's increases the resolution compared to one.

This seems like it should be a very simple problem, but I am having trouble understanding how the outputs of the two A/D's, which sit at different potentials, would be interfaced to a microcontroller, etc.

Any help would be very much appreciated.
 

ErnieM

Joined Apr 24, 2011
8,377
This seems like it should be a very simple problem, but I am having trouble understanding how the outputs of the two A/D's, which sit at different potentials, would be interfaced to a microcontroller, etc.
Simon says don't do that. The A2D's must be at the "same potential" so they can talk to the micro. By "same potential" I mean they all share the same common ground.

How expensive is too expensive? The LTC2301 is a bipolar 12-bit part and about 4 bucks each in single quantity. Its input range is limited to +/-2.048V.
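
A rough back-of-envelope sketch of getting the +/-10V signal into that +/-2.048V range (purely illustrative; the resistor values are assumptions of mine, not from the datasheet, and ignore source impedance and ADC input loading):

Code:
# Scale the +/-10 V amplifier output down to a +/-2.048 V bipolar ADC range
# with a simple two-resistor divider. Hypothetical values only.
v_sig_max = 10.0       # V, peak of the amplifier output
v_adc_max = 2.048      # V, example bipolar full scale

ratio = v_adc_max / v_sig_max               # attenuation needed, ~0.205
r_top = 39_000.0                            # ohm, example series resistor
r_bot = r_top * ratio / (1.0 - ratio)       # ohm, from r_bot/(r_top+r_bot) = ratio

print(f"attenuation = {ratio:.3f}")
print(f"R_top = {r_top:.0f} ohm, R_bot ~ {r_bot:.0f} ohm")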
 

MrChips

Joined Oct 2, 2009
30,795
Use one ADC.
Your input signal is 20V peak-to-peak.
Attenuate the signal and add a DC offset to suit the input range of your chosen ADC, which could be 2.5V, 5V, etc. There is no loss in resolution by doing so. Determine how much resolution you need in terms of percentage, choose the number of bits accordingly, then add one or two extra bits.

For example, 0.1% resolution requires a 10-bit ADC; select a 12-bit ADC.

Or you could use a bipolar ADC as ErnieM says.
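
Putting that rule of thumb into a quick calculation (a minimal sketch, assuming resolution is expressed as a fraction of full scale):

Code:
# How many ADC bits are needed for a target resolution, plus spare bits.
import math

def bits_needed(resolution, margin_bits=2):
    # smallest N with 2**N >= 1/resolution, then add the margin
    return math.ceil(math.log2(1.0 / resolution)) + margin_bits

print(bits_needed(0.001))   # 0.1% -> 10 bits, +2 margin -> 12-bit ADC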
 

Thread Starter

nalexandrov

Joined Mar 9, 2013
6
Is this the kind of thing you guys are talking about to bring the +/-10V to 0 to 20V?
http://masteringelectronicsdesign.com/design-a-bipolar-to-unipolar-converter/


I have found a few ADCs that can sample +/-10V, but not many of them will operate off the +/-15V power supply I am using. Could I use a voltage divider to lower the voltage of each leg from +/-15V to the needed +/-5V supply voltage?

What if the part only needs a +5V supply and I have a +/-15V PSU? Do I just voltage-divide the upper leg?
 

Thread Starter

nalexandrov

Joined Mar 9, 2013
6
Can you explain how, if I shift a signal from -10V to +10V down to 0V to 5V, I will not lose resolution? Is it just by increasing the number of bits of the ADC?

I am torn between using an expensive and rare +/-10V bipolar ADC and shifting to unipolar operation with a cheaper ADC and more complex circuitry.

Another question:
Can I use potentiometers in the shifting circuit so that I can scale the output from my LOG100 amplifier to the ADC at the same time as I convert from +/-10V to 0 to 5V?

Example: if my LOG100 is always outputting between -7V and -8V, could I scale only that voltage range so the range I am interested in takes up the whole range of the ADC?

If someone can help me with this project I can send them a hundred or so through paypal or bitcoin.
 

MrChips

Joined Oct 2, 2009
30,795
You don't need a +/-10V ADC.
A +/-1V ADC will work just as well after you attenuate your signal to match the range of the ADC.

Measure the resolution as a percentage of full scale. A 10-bit ADC has a resolution of about 0.1%.
In relative terms it doesn't matter whether you take 0.1% of 20V or 0.1% of 20V/10; you are still digitizing to 0.1% of full scale.

Of course, without getting into the details, you cannot lower the ADC reference voltage forever. Eventually you will run up against the noise level of the ADC.
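
If it helps to see the arithmetic, here is a small illustrative sketch showing that the LSB stays the same fraction of full scale no matter how much you attenuate:

Code:
# The LSB is always full_scale / 2**bits, so shrinking the signal to fit a
# smaller full-scale range changes the LSB in volts but not the resolution
# relative to the signal.
bits = 12
for full_scale in (20.0, 5.0, 2.0):              # V span seen by the ADC
    lsb = full_scale / 2**bits                   # V per code
    print(f"FS = {full_scale:4.1f} V   LSB = {lsb * 1e6:8.1f} uV   "
          f"LSB/FS = {100.0 * lsb / full_scale:.4f} %")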
 

Thread Starter

nalexandrov

Joined Mar 9, 2013
6
OK, that's good news.
Now I am interested in being able to scale to the range I am investigating.

So if in my application the -10V to +10V range is more like -7V to -7.5V, can I just scale that specific segment of the output to 0 to 5V to send to a "normal" ADC? I would like to be able to do this dynamically with a potentiometer, etc.

Again if someone could help me in this area I can send you cash.
 

MrChips

Joined Oct 2, 2009
30,795
Decide on a unipolar or bipolar ADC. It doesn't have to go to 10V.
You can attenuate your signal and shift it to whatever you please.
Just try to match the full range of the ADC for maximum dynamic range and best resolution.
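
For the windowing idea (zooming the -8V to -7V slice onto a 0 to 5V ADC), the target transfer function is just a gain and an offset. A minimal sketch of the numbers follows; the op-amp stage that realizes them (summing amplifier, trimmers, etc.) still has to be designed around these values and is not shown here:

Code:
# Gain/offset that map a window of the LOG100 output onto a 0-5 V ADC input.
def gain_offset(v_low, v_high, adc_low=0.0, adc_high=5.0):
    gain = (adc_high - adc_low) / (v_high - v_low)
    offset = adc_low - gain * v_low
    return gain, offset

g, b = gain_offset(-8.0, -7.0)        # window of interest from the thread
print(f"Vadc = {g:.2f} * Vin + {b:.2f}")            # Vadc = 5.00*Vin + 40.00
for vin in (-8.0, -7.5, -7.0):
    print(f"{vin:+.1f} V  ->  {g * vin + b:.2f} V")  # 0.00, 2.50, 5.00 V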
 

Thread Starter

nalexandrov

Joined Mar 9, 2013
6
I have decided on a 0 to 5V unipolar ADC and just need help with the attenuation circuitry.

Would adjusting the range going into the ADC with control circuitry give me higher resolution in my range of interest, which may be narrower than the full output range of the LOG100?
 

Thread Starter

nalexandrov

Joined Mar 9, 2013
6
Again, if someone wants to email me and help with this problem, you can contact me at nalexandrov251 AT gmail. I have a $100 bounty for helping me find a solution.
 