Just a quick thought experiment. If you're sampling a signal containing frequency x, you need a sample rate greater than 2*x to prevent aliasing errors. My question is this:

If your sampling interval is random, but each interval is short enough that you're always sampling at more than twice the signal frequency, do you get any sampling errors, or would this be an acceptable method of sampling an analogue signal?
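As a quick numerical check of the idea, here's a small numpy sketch: it samples a sine wave at random intervals (every gap kept shorter than the Nyquist interval) and then recovers the amplitude with a least-squares fit at the known frequency. The specific frequency, interval bounds, and fitting approach are just illustrative choices, not a claim about how this would be done in practice.

```python
import numpy as np

rng = np.random.default_rng(0)
f = 5.0  # test signal frequency in Hz (arbitrary choice)

# Random sampling intervals, each strictly shorter than the
# Nyquist interval 1/(2*f) = 0.1 s, so the instantaneous rate
# always exceeds twice the signal frequency.
dt = rng.uniform(0.2, 0.8, size=400) / (2 * f)
t = np.cumsum(dt)

x = np.sin(2 * np.pi * f * t + 0.3)  # noiseless signal, phase 0.3 rad

# Least-squares fit of A*sin + B*cos at the known frequency;
# the recovered amplitude sqrt(A^2 + B^2) should be ~1.0 if the
# irregular samples still pin down the signal.
M = np.column_stack([np.sin(2 * np.pi * f * t),
                     np.cos(2 * np.pi * f * t)])
(A, B), *_ = np.linalg.lstsq(M, x, rcond=None)
amp = np.hypot(A, B)
print(round(amp, 3))  # recovered amplitude
```

With noiseless data the fit recovers the amplitude essentially exactly, which at least shows the irregular samples don't destroy the information; whether that generalises to arbitrary signals is exactly the question.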

Dan