I'm a bit confused about how to interpret a timing diagram. Say you have A = B = C = 0 and D = 1, each inverter has a delay of 1 ns, and each gate has a delay of 2 ns. C then changes from 0 to 1 at some point, say t = 2 ns. Since the other variables are held constant, do they add extra delay to the circuit because they pass through inverters, even though C is the only thing that changes? I have an example drawn out in an attached picture, along with my attempt to solve it:
When C changes from 0 to 1 at 2 ns and propagates through the gate, and D also passes through an inverter (so the inverted D is 0 while C = 1, making F = 1), does the inverter add extra delay? In other words, instead of 2 ns through the gate for the change to reach F, would it take 3 ns because of the inverter on D in addition to the gate?
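To make the two possibilities concrete, here is a rough sketch of the timing arithmetic I'm unsure between. I'm assuming the output is F = (NOT D) OR C with the stated delays just for this sketch; the actual circuit is in the attached drawing:

```python
# Quick sketch of the two timings I'm unsure between
# (assumed circuit: F = (NOT D) OR C, inverter = 1 ns, gate = 2 ns).

INV_DELAY = 1    # ns, inverter delay
GATE_DELAY = 2   # ns, gate delay
t_c_change = 2   # ns, when C goes 0 -> 1

# Interpretation 1: only the changing signal's path counts,
# so F responds one gate delay after C's edge.
t_f_gate_only = t_c_change + GATE_DELAY                    # 4 ns

# Interpretation 2: the inverter on the constant D also adds to the delay,
# so F responds a gate delay plus an inverter delay after C's edge.
t_f_with_inverter = t_c_change + GATE_DELAY + INV_DELAY    # 5 ns

print(f"F changes at {t_f_gate_only} ns if only the gate delay counts")
print(f"F changes at {t_f_with_inverter} ns if D's inverter also adds delay")
```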
Attachments: two images (the circuit example and my attempted timing diagram).