Hey everyone,
I've been stumped on getting a certain digital circuit working as it needs to for my digital design class. The goal is this:
"Design a 2-bit gray code counter with a CLR input and an INC input (using D Flip-flops). In cases where CLR = INC = '1', have the counter also increment. Draw a schematic of the resulting counter. Use the sequence 00 -> 01 -> 11 -> 10 -> 00 ...
Here is the associated truth table, Karnaugh Maps, and logic equations I have derived so far...
Every attempt I've made at building this circuit has failed to operate correctly. I have a sneaking hunch it's because the CLR input got completely dropped from my equations, but I've triple-checked my K-maps and come up empty.
Can anyone help me get pointed in the right direction? Does it look like I did something obviously wrong?
Here is the schematic I have built thus far (sorry it's horribly ugly; I put it together quickly).