DAC Characterization

Discussion in 'The Projects Forum' started by welkin87, Feb 24, 2012.

  1. welkin87

    Thread Starter Member

    Nov 18, 2010
I have just started an internship, and one of the projects they have me on is a DC characterization of a DAC that a past intern designed. All we are interested in is the DNL and INL of this 10-bit DAC. I know the theory behind digital-to-analog conversion and how DACs work; I just haven't had any hands-on experience with them yet.

    My question is: what test setups do some of you use? I have quite a bit of hardware available (oscilloscopes, network and spectrum analyzers, clock sources, logic analyzers, etc.) and LabVIEW as well.

    Any input would be greatly appreciated!

    Thanks!
     
  2. crutschow

    Expert

    Mar 14, 2008
    To measure the DNL and INL of the DAC you simply step through all 1024 digital input codes and record the output voltage at each step.

    Since taking 1024 measurements by hand is very tedious and error prone, you could automate it: generate the 10-bit digital input sequence using LabVIEW and a digital output card, then use an accurate digital voltmeter with a computer interface (typically USB or GPIB) to record the measured analog voltage at each DAC output step.

    You then transfer the recorded values to a spreadsheet and calculate the DNL and INL values.
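    The spreadsheet math is also only a few lines of code. Here is a minimal sketch in Python/NumPy, with a simulated 1 mV/LSB ramp standing in for the 1024 voltages you would actually log from the meter (the LSB size and error magnitudes are made-up numbers, not from your DAC):

```python
import numpy as np

N_BITS = 10
N_CODES = 1 << N_BITS  # 1024

# Simulated readings: ideal 1 mV/LSB ramp plus small random errors.
# In practice this array holds the 1024 voltages recorded from the DMM.
rng = np.random.default_rng(0)
v = 0.001 * np.arange(N_CODES) + rng.normal(0, 50e-6, N_CODES)

# Endpoint-referenced LSB: total measured span divided by number of steps.
lsb = (v[-1] - v[0]) / (N_CODES - 1)

# DNL: each measured step size relative to one ideal LSB, minus 1 (in LSBs).
dnl = np.diff(v) / lsb - 1.0

# INL (endpoint method): deviation from the straight line joining the
# first and last codes, expressed in LSBs.
ideal = v[0] + lsb * np.arange(N_CODES)
inl = (v - ideal) / lsb

print(f"max |DNL| = {np.max(np.abs(dnl)):.3f} LSB")
print(f"max |INL| = {np.max(np.abs(inl)):.3f} LSB")
```

    Note this uses the endpoint-fit convention, so INL is zero at the first and last codes by construction; a best-fit-line INL gives slightly different numbers.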
     
  3. welkin87

    Thread Starter Member

    Nov 18, 2010
    We do have LabVIEW, so that should work. Would the National Instruments USB-6501 work for the digital output?
    http://sine.ni.com/nips/cds/view/p/lang/en/nid/201630

    My thinking is to use the NI 6501 to send the 10-bit code to the DAC. The DAC output would then go to a digital multimeter connected to LabVIEW over GPIB (IEEE-488).

    Didn't think about Excel. Gotta love that program.


    Thanks for the help!
     
    Last edited: Feb 24, 2012
  4. crutschow

    Expert

    Mar 14, 2008
    That NI device should work to generate the digital signals.

    Make sure you allow plenty of time for the DAC output to settle at each level (say at least a 10 ms delay) before taking the multimeter reading. Multimeters can also be slow to take a reading, so be sure to allow time for that as well.
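    The set-settle-read loop you end up building (in LabVIEW or anywhere else) has this shape. A sketch in Python with stub functions in place of the real driver calls (write_dac_code and read_dmm_volts are placeholder names, not NI or Agilent API functions):

```python
import time

def write_dac_code(code):
    """Stub: in the real setup this drives the 10 digital lines."""
    write_dac_code.last = code

def read_dmm_volts():
    """Stub: in the real setup this triggers and reads the multimeter."""
    return 0.001 * write_dac_code.last  # pretend ideal 1 mV/LSB DAC

SETTLE_S = 0.0001  # shortened for this demo; use >= 0.010 (10 ms) on real hardware

readings = []
for code in range(1024):
    write_dac_code(code)                # present the next input code
    time.sleep(SETTLE_S)                # let the DAC output settle
    readings.append(read_dmm_volts())   # the DMM read itself blocks until done

print(len(readings), "points captured")
```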
     
  5. welkin87

    Thread Starter Member

    Nov 18, 2010
    I have both the NI 6363 DAQ (USB) and Agilent 34401 (GPIB) multimeter hooked up and working properly with LabVIEW. My problem now is that I know what needs to be done, but I'm having trouble implementing it in LabVIEW due to a lack of experience with the software.

    I've spent hours so far looking for a VI, or combination of VIs, that would let me send 10 parallel channels of digital data to the 10 specified ports of the 6363 (I'll worry about timing later).

    I found the Digital Pattern Generator VI and set it up for a 10-channel ramp (which is what I want), but I can't seem to split it into 10 separate channels (I tried the Digital Subset VI but couldn't get it to work).
     
  6. MrChips

    Moderator

    Oct 2, 2009
    A simple way is to use a 10-bit binary counter: advance the counter one step at a time while reading the output voltage with LabVIEW.
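    Whichever way the ramp is generated, each 10-bit code has to be split into ten individual line states at some point. A one-liner sketch of that unpacking in Python (taking line 0 as the LSB by assumption):

```python
def code_to_bits(code, n_bits=10):
    """Split an integer DAC code into per-line bit states, LSB first."""
    return [(code >> i) & 1 for i in range(n_bits)]

print(code_to_bits(0b1000000001))  # bits 0 and 9 set
```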
     