Help needed on a digital logic problem.

Discussion in 'The Projects Forum' started by sky333, Jan 17, 2010.

  1. sky333

    Thread Starter New Member

    Jan 17, 2010
    3
    0
    Hi, all. I am trying to implement the following using gates exclusively:
    in a binary sequence, the 0s divide it into several subsequences of consecutive 1s. For a subsequence of even length, the corresponding output should be 0101... (same length). For a subsequence of odd length, the corresponding output should be 10101... (same length).
    e.g. if we have 0 1 1 1 1 0 1 1 1 1 1 0 1 1 0 0
    then the output is 0 0 1 0 1 0 1 0 1 0 1 0 0 1 0 0
    Requirement: the time taken by the logic circuit must not depend
    on the word length.

    Any help will be greatly appreciated!!!
     
  2. StayatHomeElectronics

    Well-Known Member

    Sep 25, 2008
    864
    40
    Can you try to explain this in more detail? I do not understand the process of going from input to output...
     
  3. sky333

    Thread Starter New Member

    Jan 17, 2010
    3
    0
    If we have 0 1 1 1 1 0 1 1 1 1 1 0 1 1 0 0,
    the segments are 1 1 1 1; 1 1 1 1 1; and 1 1. So the corresponding outputs should be
    0 1 0 1; 1 0 1 0 1; and 0 1.
    That is 0 0 1 0 1 0 1 0 1 0 1 0 0 1 0 0. Requirement: the time taken by the logic circuit must not depend on the word length. Thanks anyway.
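    To pin down the intended mapping, here is a short software sanity model (not the gate-level circuit; the function and variable names are my own, not from the thread). It relies on the pattern visible in the example: the last bit of every run of 1s maps to 1, alternating leftward, which yields 0101... for even runs and 1010... for odd runs.

    ```python
    def parity_pattern(bits):
        """Map each run of consecutive 1s to an alternating pattern:
        even-length runs become 0101..., odd-length runs become 1010...
        (equivalently, the last bit of every run is 1, alternating
        leftward). 0s in the input stay 0 in the output."""
        out = [0] * len(bits)
        i = 0
        while i < len(bits):
            if bits[i] == 1:
                j = i
                while j < len(bits) and bits[j] == 1:
                    j += 1  # j is now one past the end of the run
                for k in range(i, j):
                    # 1 at the run's last bit, alternating backwards
                    out[k] = 1 if (j - 1 - k) % 2 == 0 else 0
                i = j
            else:
                i += 1
        return out
    ```

    Running it on the example input 0 1 1 1 1 0 1 1 1 1 1 0 1 1 0 0 reproduces the stated output 0 0 1 0 1 0 1 0 1 0 1 0 0 1 0 0.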
     
  4. zgozvrm

    Member

    Oct 24, 2009
    115
    2
    I understand how the output is to be determined from the input, but my question is this:
    Is there a maximum input length (number of bits)?

    Also, is there a maximum length for a string of 1's?
     
    Last edited: Jan 20, 2010
  5. sky333

    Thread Starter New Member

    Jan 17, 2010
    3
    0
    There is no fixed maximum, but we can assume it to be 20, for example (I mean 20 for both the maximum input length and the maximum length of a string of 1s).
     
  6. frankv

    New Member

    Jan 23, 2010
    5
    0
    I assume the input is a stream of bits arriving sequentially on a single channel, e.g. a 300 bps stream? So one bit of the input would be about 3.3 ms long?

    And the output also needs to be a single stream of bits, at the same rate as the input stream?

    If you want it in real time, then you can't do it.

    You can't tell what the first bit of an output subsequence is until you've received the last bit of the input subsequence.

    If there is an absolute maximum subsequence length (e.g. 20 bits), and you're prepared to delay by that much, then I guess it is possible.
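    One way to see why the cap helps (my own illustration, not something stated in the thread): with runs limited to 20 bits, each output bit is a function of a fixed-size window of input bits, namely the current bit and the 20 bits to its right, since the end of the current run is guaranteed to fall inside that window. Each output bit can then be computed by its own constant-depth combinational block, so the total delay does not grow with word length. A sketch of that per-bit function:

    ```python
    def out_bit(window):
        """Compute one output bit from a fixed-size window.

        window[0] is the current input bit; window[1:] are the bits to
        its right, padded with 0s past the end of the word. With runs
        capped at 20 bits, a 21-bit window always contains the end of
        the current run."""
        if window[0] == 0:
            return 0
        # Count the 1s from the current position to the end of its run.
        run_remaining = 0
        for b in window:
            if b == 0:
                break
            run_remaining += 1
        # The last bit of every run maps to 1, alternating leftward,
        # so a bit is 1 exactly when an odd number of 1s remain.
        return run_remaining % 2

    def transform(bits, max_run=20):
        """Apply out_bit at every position, padding with trailing 0s."""
        padded = bits + [0] * (max_run + 1)
        return [out_bit(padded[k:k + max_run + 1]) for k in range(len(bits))]
    ```

    In hardware terms, each `out_bit` would be one 21-input combinational function replicated per bit position, which is why the delay stays constant regardless of how long the word is.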

    Can you tell us *why* you want to do this? Or is it just a thought experiment?
     