I've been having trouble with one particular exercise in Digital Logic Design. I can't quite put my finger on it, and I can't figure out why. I have to design an up/down counter that follows a given sequence. Example:
a=(1,1,1,0,0,0,0,0)
b=(0,0,1,1,1,0,0,0)
c=(0,0,0,0,1,1,1,0)
d=(1,0,0,0,0,0,1,1)
where a, b, c, d are the bits of the number (d is the most significant bit, a the least significant).
For x = 1 it counts up, and for x = 0 it counts down. If someone can help me, it would make my day. Anyway, thanks in advance. Happy New Year.
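To show how I'm reading the table, here is a small Python sketch (my own, not part of the assignment) that decodes the bit columns into the decimal state sequence, assuming d is the MSB and a is the LSB:

```python
# Bit columns of the counter, one list per bit, one entry per state.
a = [1, 1, 1, 0, 0, 0, 0, 0]  # least significant bit
b = [0, 0, 1, 1, 1, 0, 0, 0]
c = [0, 0, 0, 0, 1, 1, 1, 0]
d = [1, 0, 0, 0, 0, 0, 1, 1]  # most significant bit

# Combine the bits of each state into its decimal value (d*8 + c*4 + b*2 + a).
states = [ai + 2 * bi + 4 * ci + 8 * di for ai, bi, ci, di in zip(a, b, c, d)]
print(states)  # counting-up order; for x = 0 the counter should walk it in reverse
```

If my reading is right, the counter cycles through 9, 1, 3, 2, 6, 4, 12, 8 and wraps around.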