Signal Delay IC

Thread Starter

mmostell

Hi,

I am attempting to find an IC that can apply a simple delay of about 1 ms to 1 s to an analog input signal. That is, the signal is sent to the chip, and after the delay time (1 ms - 1 s) the same signal is output by the device.

I have seen several threads on creating simple RC circuits with op-amps to perform this operation but cannot for the life of me find an IC that can do it for me reliably (i.e. without having to tune a pot every time I use it).

Does anyone know of a chip that can do this? My power requirements are relatively broad: the IC must be able to operate on a -25 V to +25 V supply and be compatible with input/output signals from 0 V to 5 V.

Any help is greatly appreciated!

M
 

Thread Starter

mmostell

The clock frequencies I am using for this application are between 100kHz and 250kHz and the output of my system is an analog voltage from 0V to 5V.

The application is not for audio; I am conducting a research project and need to acquire the analog voltage output of a charge-coupled device (CCD) at this frequency. Due to the limitations of my data acquisition software and hardware (LabVIEW and a National Instruments USB-6221 DAQ), I need to incorporate a delay of at least 1 ms between a write task and the analog voltage read task. Thus, I want a simple IC that delays the CCD output signal by at least this minimum time so that I can read the data.

I have looked into bucket-brigade devices but the few I could find (i.e. the SAD1024) tend to be either expensive ($40+) or no longer in production, and thus are not carried by major distributors.

Again, any help would be greatly appreciated.
 

Thread Starter

mmostell

Also, I looked into other BBD devices, such as the Panasonic MN3000 series. Most of these chips, unlike the SAD1024, are P-channel devices and thus have a maximum clock frequency of around 100kHz. That is fine for acoustic signals, since those stay in the audible range, but they would not perform well at my higher clock frequencies.
 

praondevou

The clock frequencies I am using for this application are between 100kHz and 250kHz and the output of my system is an analog voltage from 0V to 5V.

I need to incorporate a delay of at least 1 ms between a write task and the analog voltage read task. Thus, I want a simple IC that delays the CCD output signal by at least this minimum time so that I can read the data.
Can you explain this further?

If I understand right you need to get the data out of the CCD device with a speed of 100 to 250kHz. But in order to read it correctly you need each signal to be present for at least 1 ms. Is that correct?

Why can't you use a slower clock?
How will you avoid a memory overflow if you enter the delay line with 100kHz and exit it with 1kHz?

Or am I missing something here?
 

Thread Starter

mmostell

Can you explain this further?

If I understand right you need to get the data out of the CCD device with a speed of 100 to 250kHz. But in order to read it correctly you need each signal to be present for at least 1 ms. Is that correct?

Why can't you use a slower clock?
How will you avoid a memory overflow if you enter the delay line with 100kHz and exit it with 1kHz?

Or am I missing something here?
Sorry I did a bad job of explaining.

In order to activate the device, which is a linear 128x1 pixel array CCD, I need to supply an impulse to the CCD that is "high" (+5V) during a rising edge of the clock. I am doing this using a DAQ device and LabVIEW for timing purposes. This impulse causes the shift register in the CCD to immediately (with that first clock rising edge) begin dumping its data sequentially at the rate of the supplied clock.

To read this data, I need to use the same DAQ. Unfortunately, the minimum time for the transition from writing the impulse waveform to reading the data from the CCD is 1ms due to my hardware and software (LabVIEW) limitations.

The clock speed needs to be at minimum 100kHz to avoid saturating the CCD pixels, which use capacitors to store a voltage ("data") proportional to the amount of light incident on each pixel during a specified integration time. For example, if 1 ms of light will saturate the pixels, then one period of my clock needs to be Tclk < (1 ms / 128 pixels) = 7.81 us, i.e. a frequency of at least 128 kHz. Unfortunately, this means that all of the data held in the shift register of the CCD will be output after only (7.81 us x 128 pixels) = 1 ms, so everything will have already been shifted out before my DAQ has even started reading it!
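Just as a sanity check, here is the same arithmetic worked through in a few lines of Python (the 128-pixel line length and the ~1 ms saturation time are the numbers quoted above, not measured values):

Code:
# Timing check for a 128-pixel linear CCD that saturates after ~1 ms of light.
n_pixels = 128
t_saturate = 1e-3                        # seconds of light that saturates a pixel (assumed)

t_clk_max = t_saturate / n_pixels        # longest usable clock period
f_clk_min = 1 / t_clk_max                # minimum pixel clock frequency
t_readout = n_pixels * t_clk_max         # time for the whole line to shift out

print(f"max clock period : {t_clk_max * 1e6:.2f} us")   # ~7.81 us
print(f"min clock freq   : {f_clk_min / 1e3:.0f} kHz")  # ~128 kHz
print(f"full readout time: {t_readout * 1e3:.2f} ms")   # ~1.00 ms, i.e. the whole
                                                        # line is out before the 1 ms
                                                        # write-to-read turnaround ends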

To rectify the situation, I want to place a time delay of at least 1 ms on the output signal of the CCD before it is input to my DAQ device, so that the DAQ will have enough time (again, 1 ms) to switch to "read mode" before it starts receiving the data that was output from the CCD. This will allow me to successfully read the data I need for this application.

Hope this helps clear things up and sorry to anyone I confused. Again I can't thank you all enough for the help!

M
 

joeyd999

ummm... wouldn't it be easier to digitally delay your start impulse to the CCD so that, by the time the data starts pumping, your DAQ is ready to accept data?
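For what it's worth, here is a minimal sketch of that ordering using the Python nidaqmx package instead of LabVIEW (so it is only an illustration of the idea, not the OP's actual setup): arm the hardware-timed read task first, then fire the start impulse, so the ~1 ms software turnaround happens before the CCD begins shifting data out. The channel names "Dev1/ai0"/"Dev1/ao0", the sample rate, and the sample count are assumptions.

Code:
# Sketch only: arm the analog read BEFORE sending the CCD start impulse.
import nidaqmx
from nidaqmx.constants import AcquisitionType

N_PIXELS = 128
SAMPLE_RATE = 250_000  # Hz, assumed; check against the DAQ's spec

with nidaqmx.Task() as ai_task, nidaqmx.Task() as ao_task:
    # Hardware-timed, finite analog read of the CCD video output.
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai_task.timing.cfg_samp_clk_timing(
        rate=SAMPLE_RATE,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=2 * N_PIXELS,      # some margin around the 128 pixels
    )

    # Software-timed analog output used as the CCD start impulse.
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")

    # Arm the read first: the slow software step now happens here,
    # before the CCD has been told to start dumping data.
    ai_task.start()

    # Fire the start impulse; the already-armed AI task catches the data.
    ao_task.write(5.0)
    ao_task.write(0.0)

    data = ai_task.read(number_of_samples_per_channel=2 * N_PIXELS, timeout=5.0)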
 

NLightNMe

I think what mmostell is saying is that he is clocking the CCD at a 100 - 250 kHz rate, but because of the software and DAQ interactions, he can only reliably trigger the ADC with a 1 ms delay. The CCD trigger and the ADC trigger don't line up, so the data points don't match.

Seems to me that the simpler solution would be to set up a delay circuit for the CCD trigger rather than a delay circuit for the analog output. Then you could just trigger them both and adjust the CCD trigger delay so that it matches the ADC start.
 

laasworld

mmostell: Did you ever come up with a solution? If so, care to divulge this information? I've been trying to google a usable solution for a similar problem without having to invest in an expensive digital delay generator. It seems like Linear Technology's "TimerBlox" series may be of some use (specifically, the LTC6994-2). My own problem is that these ICs only come as surface-mount parts if I choose to go this route!
 

Austin Clark

I'm just gonna jump in with a possible suggestion. It's likely not the best solution, but the first thing that comes to my mind would be an ADC being read by a uC, with the samples stored in memory and then re-read and sent back out through the uC. This might not be fast enough for your needs, but it's worth considering.
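Conceptually that uC approach is just a circular (ring) buffer: each sample comes back out N samples after it went in, so the delay is N divided by the sample rate. Here is a rough plain-Python model of what the firmware loop would do (not actual microcontroller code; the sample rate and delay below are assumptions):

Code:
# Ring-buffer delay line: each input sample reappears n_delay samples later.
sample_rate = 250_000                    # samples per second (assumed ADC rate)
delay_s = 1e-3                           # desired delay, >= 1 ms
n_delay = int(sample_rate * delay_s)     # 250 samples of delay

buffer = [0.0] * n_delay                 # pre-filled with 0 V; output is zero until full
write_idx = 0

def delay_sample(adc_value):
    """Push one ADC reading in, get the reading from n_delay samples ago out."""
    global write_idx
    delayed = buffer[write_idx]          # oldest sample in the ring
    buffer[write_idx] = adc_value        # overwrite it with the newest sample
    write_idx = (write_idx + 1) % n_delay
    return delayed                       # this value would go to the DAC

# Example: a 5 V step at the input appears at the output n_delay samples later.
outputs = [delay_sample(5.0) for _ in range(n_delay + 3)]
assert outputs[:n_delay] == [0.0] * n_delay and outputs[n_delay] == 5.0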
 

bertus

Hello,

You might have noticed that this thread is over a year old.
The OP has not posted anything in the past months.

Bertus
 