The sampling theorem says that if a signal contains no frequencies above B Hz, then samples taken no more than 1/(2B) seconds apart capture it with no loss of information.
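To make the claim concrete, here is a small sketch (my own illustration; the signal, band limit B, and window length are arbitrary choices, not from the post) of Whittaker–Shannon sinc interpolation: a band-limited tone sampled at exactly 2B samples/second is rebuilt at off-grid times from the samples alone.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the post)
B = 4.0          # band limit in Hz
fs = 2 * B       # Nyquist-rate sampling frequency
T = 1 / fs       # sample spacing = 1/(2B) seconds

n = np.arange(-200, 201)                 # finite window of sample indices
f0 = 1.0                                 # a test tone well inside the band
x_samples = np.cos(2 * np.pi * f0 * n * T)

# Whittaker-Shannon reconstruction: x(t) = sum_n x[n] * sinc((t - nT)/T)
t = np.linspace(-1.0, 1.0, 50)           # off-grid evaluation times
x_rec = np.array([np.sum(x_samples * np.sinc((ti - n * T) / T)) for ti in t])
x_true = np.cos(2 * np.pi * f0 * t)

max_err = np.max(np.abs(x_rec - x_true))
print(max_err)   # small residual, due only to truncating the infinite sum
```

The residual error here comes purely from using a finite window of samples; with the full (infinite) sum the reconstruction is exact.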
My questions:
1) Is the sampling done with (Dirac) deltas?
2) If the sampling is done with deltas, then in real life we don't have deltas, so we use a zero-order hold, where we lose information. But if we hold each value constant for less than 1/(2B) seconds, wouldn't that mean we lose no information at all, even in real life?
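One way to probe question 2 is a quick numerical sketch (my own illustration; the grid size, period M, and hold length H are made-up parameters). Model a zero-order hold on a fine time grid, holding each sample for H grid points out of an M-point sampling period (H < M, i.e. the hold is shorter than the sampling interval). The exact sample values can be read back from the held waveform, so the hold itself discards nothing; the sampling theorem then says those values determine the band-limited signal.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the post)
N, M, H = 1024, 16, 8                  # grid size, sampling period, hold length
rng = np.random.default_rng(0)
samples = rng.standard_normal(N // M)  # arbitrary sample values

zoh = np.zeros(N)
for k, s in enumerate(samples):        # hold each value for H grid points
    zoh[k * M : k * M + H] = s

recovered = zoh[::M]                   # read back at the sample instants
max_err = np.max(np.abs(recovered - samples))
print(max_err)                         # exactly 0: the hold loses nothing
```

The distortion usually attributed to zero-order hold is the sinc-shaped spectral droop of the held waveform when it is used directly as the reconstructed output; that droop is a fixed, known filter and can be equalized in-band, which is consistent with no information being lost.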