Delay in Combinational Logic Circuit

Thread Starter

mcardoso

Joined May 19, 2020
226
Hi All,

This question is again related to some of the others I have posted. I have a combinational logic circuit where 9 signals enter, pass through some number of logic gates (5-6 on any given path), and 7 new signals are output. I am not currently using a clock; however, there is an enable signal provided to the clock input of the first chips in this circuit to indicate when to sample the inputs.

I wish to avoid any glitches (static-0 or dynamic) in the output caused by race hazards through the various logic paths (a static-1 glitch is acceptable and will not harm the output of the circuit). The best way I could think to do this is a final AND gate on each output which takes the output signal and the enable signal as inputs. This way, all outputs are LOW until enough time has passed for glitches to settle.

To do this, I want to use the same enable signal that I used to trigger the input sampling, but I want to delay it. This delay must be such that the best-case delay in the enable path is longer than the worst-case delay of the logic through any path in the gates. I saw some designs using a chain of buffers, however I was wondering if there was a better way to handle this condition. Buffers in the same logic family that I am using seem to have extremely fast (5-10 ns) propagation delays, so a large number of them would be needed to sufficiently delay the circuit.
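To put rough numbers on it (assuming the gates in my logic paths have worst-case delays in the same 5-10 ns range as the buffers): the slowest logic path would be around 6 gates × 10 ns = 60 ns, while the fastest the delay chain could be is about 5 ns per buffer, so I would need at least 60 ns / 5 ns = 12 buffers plus margin. That feels like a lot of chips just to make a delay.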

Is this how people normally handle these conditions? If not, what is a good alternative idea?

EDIT: Added picture. Latch on the left signifies the input logic sampling (in my case a CD74AC238 3-to-8 decoder). The ENA signal is HIGH when DAT is valid and static. The top path is a combinational logic circuit with some number of gates. The bottom path is my proposed delay with some number of buffers. The AND gate at the end holds the output LOW until the logic has settled on its final value and the ENA delay has expired.

delay.jpg

Thanks
 

WBahn

Joined Mar 31, 2012
30,045
This method can work, but getting the magic number of delays that will work over all process corners is not going to be easy.

The normal way would be to use another register instead of the AND gate on the output and capture the value that is output from your logic at the beginning of each clock cycle (or you could clock that register with an inverted version of ENA).
 

cmartinez

Joined Jan 17, 2007
8,252
How fast does the complete circuit have to be? Why not use an MCU instead and do away with those glitches once and for all?
 

Thread Starter

mcardoso

Joined May 19, 2020
226
This method can work, but getting the magic number of delays that will work over all process corners is not going to be easy.

The normal way would be to use another register instead of the AND gate on the output and capture the value that is output from your logic at the beginning of each clock cycle (or you could clock that register with an inverted version of ENA).
OK, I think I might be starting to understand clocked logic. Let me explain how I think it works and correct me if I am wrong.

Combinational logic is never clocked (there isn't an AND gate with a clock input), but you encapsulate the combinational logic between latches/flip-flops. The clock is sufficiently slow to ensure that the time between rising edges gives the logic time to settle on a steady value. The same clock is shared, so data shuffles down the circuit one clock pulse at a time. The clock must also be sufficiently fast that the input is sampled frequently enough. I'd have to guess that the 2x max input frequency rule would apply here. These conditions give a lower and upper bound on the clock frequency.

Asynchronous input data is double buffered to prevent metastability in the output from screwing up your circuit. This is because you don't know when the data from the outside world will change...

I think this picture shows what I mean. The data can only advance through one of the clouds of combinational logic per clock pulse. The clock must be faster than f_in but slower than the worst case of 1/tp_1 and 1/tp_2.
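Written as an inequality (ignoring flip-flop setup time and clock-to-Q delay for now, which I assume also eat into the budget), that would be:

    2 · f_in ≤ f_clk ≤ 1 / max(tp_1, tp_2)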

Does that all sound correct?

clock.jpg
 

Thread Starter

mcardoso

Joined May 19, 2020
226
How fast does the complete circuit have to be? Why not use an MCU instead and do away with those glitches once and for all?
The circuit needs to evaluate the input signals at a frequency up to 700kHz, read/write memory, and set outputs. I figured this is beyond the capability of most micros. The signal period is 1.4us worst case.
 

dl324

Joined Mar 30, 2015
16,910
I figured this is beyond the capability of most micros. The signal period is 1.4us worst case.
You figure wrong. Things like Arduino operate at 16MHz. Using interrupts, you can easily capture events that happen more than 10 times faster.
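Something along these lines would do it on an ATmega328-class part. Register names are from the AVR datasheet; the pin mapping and the straight pass-through in the ISR are placeholders, not your actual logic:

    #include <avr/io.h>
    #include <avr/interrupt.h>

    /* Illustrative only: catch an edge on INT0 (Arduino pin 2), sample the
       input port, and drive the output port with some precomputed result. */
    ISR(INT0_vect)
    {
        uint8_t in = PIND;   /* sample inputs right after the edge fires */
        PORTB = in;          /* placeholder: real code maps inputs to outputs */
    }

    int main(void)
    {
        DDRB  = 0xFF;             /* PORTB pins as outputs */
        EICRA = (1 << ISC01);     /* trigger INT0 on a falling edge */
        EIMSK = (1 << INT0);      /* enable the INT0 interrupt */
        sei();                    /* enable interrupts globally */
        for (;;) { }              /* all the work happens in the ISR */
    }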
 

cmartinez

Joined Jan 17, 2007
8,252
You figure wrong. Things like Arduino operate at 16MHz. Using interrupts, you can easily capture events that happen more than 10 times faster.
Even with fast MCU's (I work with single cycle 22MHz MCU's) attaining 700KHz after all of those logic gates is not an easy task.
 

MrChips

Joined Oct 2, 2009
30,795
How fast does the complete circuit has to be? Why not use an MCU instead and do away with those glitches once and for all?
I already suggested this as a solution but TS did not take it.
TS needs to define the application and situation.

Here is the scenario as I see it.
Signals A and B arrive at random times with random transitions.
Combinational logic will always be subject to propagation delays and race hazards.
At some point you have to define when the data is acceptable. That is the purpose of a timing signal (CLOCK).
This still does not prevent latching the data at a "bad" moment. There is no fail-safe solution as I see it.

This is why we go to a synchronous design, i.e. clocked.
The CLOCK defines a window when the inputs must settle. On the other edge of the clock we define when the output is presented. The CLOCK frequency can be as fast as the circuit or MCU operates.

With an MCU solution, you read the inputs, perform the logic, present the output. All of this can be accomplished in under 1μs.
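As a rough sketch of what the firmware amounts to (plain C; the read_inputs()/write_outputs() helpers are hypothetical stand-ins for whatever port access the chosen MCU uses, and this assumes the mapping can be precomputed as a truth table):

    #include <stdint.h>

    /* Hypothetical port-access helpers: stand-ins for the actual register
       reads/writes of whatever MCU is chosen. */
    static uint16_t read_inputs(void)        { return 0; /* placeholder */ }
    static void     write_outputs(uint8_t v) { (void)v;  /* placeholder */ }

    /* 9 input bits -> 7 output bits: precompute the whole truth table
       (2^9 = 512 entries) offline from the logic equations, then the
       run-time work is one read, one lookup, one write. */
    static const uint8_t out_table[512] = { 0 /* ...filled in offline */ };

    void service_once(void)
    {
        uint16_t in = read_inputs() & 0x1FF;   /* keep the 9 input bits */
        write_outputs(out_table[in]);          /* present the 7 output bits */
    }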
 

nsaspook

Joined Aug 27, 2009
13,261
With modern 8-bit controllers like the PIC Q43 running at 64 MHz with a three instruction cycle latency to a vectored interrupt and CLCs, an MCU solution is easily possible.
Interrupt Latency
When MVECEN = 1, there is a fixed latency of three instruction cycles between the completion of the instruction active when the interrupt occurred, and the first instruction of the Interrupt Service Routine. Figure 11-7, Figure 11-8, and Figure 11-9 illustrate the sequence of events when a peripheral interrupt is asserted, when the last executed instruction is one-cycle, two-cycle and three-cycle, respectively
Configurable Logic Cell (CLC)
http://ww1.microchip.com/downloads/en/DeviceDoc/40002188A.pdf
 

Thread Starter

mcardoso

Joined May 19, 2020
226
I am totally open to alternative ideas. Please have some patience with me. I have a lot of experience in mechanical engineering, machining, control system software design, etc. I have some limited experience in PCB design, Arduino/STM32 programming, and circuits (slow stuff like you see in most Arduino projects). I have little to no experience with digital logic, high efficiency/high speed programming, FPGAs, other microprocessors, or PCB design with micros or FPGAs.

I first wrote about my project on another post on this forum looking for the best avenue to accomplish my task (signal conversion). I did not get much feedback on the best method so I started down what I thought to be the path of least resistance. Thanks to the help of many here, I have learned a ton in the past few weeks - that alone makes me feel that my efforts have been worthwhile. I do feel confident that I can make the circuit function out of discrete digital logic chips. Also, the logic design that I have done would be necessary for any technology, so I don't have much sunk cost to get over if I choose to move to another method.

All that being said, I am likely not doing this in an efficient fashion using modern techniques - I just don't know what those are! I am totally open to throwing away the idea of discrete digital logic for a different method, but I will have a lot of questions and might need some help. Everyone here has been extremely helpful so far - thank you.

You figure wrong. Things like Arduino operate at 16MHz. Using interrupts, you can easily capture events that happen more than 10 times faster.
I could totally be wrong, but that gives 22 ish clock pulses between input transitions. 22 clock pulses doesn't seem to be enough. I think that most operations take one clock cycle, memory read/write taking two, and some taking much longer. With hardware interrupt detection, 3 hardware input reads, 3 memory writes and 3 memory reads, then some semi-complex combinational logic, and finally 4 hardware output writes, this doesn't seem to be nearly enough clock cycles to get it done.
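Rough budget as I understand it: 16 MHz × 1.4 µs ≈ 22 clock cycles per input period, and I believe just getting into an interrupt handler on an AVR costs a few cycles of hardware latency plus however many registers the compiler saves and restores, before any of my reads and writes even start. (Those latency figures are my rough understanding, not something I have measured.)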

But again, hardware guy here so I could be missing something.

What about using an FPGA?
Absolutely the method of choice for this kind of stuff - so I have heard. Know nothing about them other than what they do. Felt that the path to selecting an FPGA, designing a PCB with all the required peripherals, learning VHDL/Verilog, etc. seemed to be a difficult thing to do. If you'd be willing to point me in the right direction, I'd definitely start learning more about them.

Even with fast MCU's (I work with single cycle 22MHz MCU's) attaining 700KHz after all of those logic gates is not an easy task.
That was my concern. I think that a MCU that runs at several hundred MHz with efficient assembly level code would be able to do the job, but my knowledge with micros has been limited to Arduino IDE and mBed. I'd also prefer to package the finished circuit into a PCB that plugs into my servo drives. Because of this, I don't want to use a finished programming board like an Arduino or STM32 Nucleo. I'd be laying the parts out myself on a custom PCB which adds some challenge.

I already suggested this as a solution but TS did not take it.
TS needs to define the application and situation.

Here is the scenario as I see it.
Signals A and B arrive at random times with random transitions.
Combinational logic will always be subject to propagation delays and race hazards.
At some point you have to define when the data is acceptable. That is the purpose of a timing signal (CLOCK).
This still does not prevent latching the data at a "bad" moment. There is no fail-safe solution as I see it.

This is why we go to a synchronous design, i.e. clocked.
The CLOCK defines a window when the inputs must settle. On the other edge of the clock we define when the output is presented. The CLOCK frequency can be as fast as the circuit or MCU operates.

With an MCU solution, you read the inputs, perform the logic, present the output. All of this can be accomplished in under 1μs.
I think, and I could be wrong here, that I have a fairly complete problem description (original project link here). I know what the incoming signals look like, I know how fast they arrive, I know the input/output voltage levels, I know the power supply voltages, I know how the input patterns need to map to output patterns, and I know how quickly the circuit must be able to pass data through. If there is missing information, I'd definitely want to know so I can figure it out.

Totally not against the microcontroller idea. I don't yet understand how to actually accomplish adding a micro to a custom circuit. There seem to be a lot of peripherals like clocks, power supplies, programming interfaces, memory, etc. that need to be considered, and I am not sure where these are described. I'm not sure how to figure out what microcontroller is needed to do the task. Some are going to be too slow, I think.

With modern 8-bit controllers like the PIC Q43 running at 64 MHz with a three instruction cycle latency to a vectored interrupt and CLCs, an MCU solution is easily possible.
Lots of words I don't understand, but I did read your link and that looks awesome! I am going to do some research on it and I might have some questions. I'm happy to do the research and learn what I need, but I had no idea that stuff like this even existed.


Thank you all for your feedback and patience.
 

Thread Starter

mcardoso

Joined May 19, 2020
226
Got kind of off topic from the original question - sorry. Let me summarize the possible options I have to build this circuit and maybe you guys can let me know which is easiest to accomplish and has the highest likelihood of success.

  1. Discrete digital logic (Asynchronous, what I have been building)
  2. Discrete digital logic (synchronous, modify my circuit to include a clock)
  3. Micro-controller with interrupt driven logic
  4. Micro-controller with CLCs
  5. FPGA

PS: All of this is to let me run the motors on the industrial robot I bought a month ago. I have servo drives but the encoders on the motors are a proprietary design of the original manufacturer. I'm trying to hack this system to be able to run them. Replacing the encoders or motors would be very expensive and difficult.
 

cmartinez

Joined Jan 17, 2007
8,252
Micro-controller with interrupt driven logic
I don't think that would work. Interrupts have latency times, and most MCU's out there claiming to work at X frequency actually perform X/4 or less because a single instruction takes several clock cycles.
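For example, a PIC18 clocked at 64 MHz executes at most 16 million instruction cycles per second, because one instruction cycle takes four oscillator cycles.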

What NSA suggested is an excellent alternative, provided the MCU itself has enough logic gates in it that can be adapted to your needs.
 

MrChips

Joined Oct 2, 2009
30,795
I think what we are witnessing here is a typical scenario where someone is proposing a solution and has not told us the problem.
Step back for a moment and ignore possible solutions, discrete logic, FPGA, MCU, etc.
What is the problem, i.e. the application? Not the signals arriving at different times.
What is the big picture here? What are you doing?
 

Thread Starter

mcardoso

Joined May 19, 2020
226
I don't think that would work. Interrupts have latency times, and most MCU's out there claiming to work at X frequency actually perform X/4 or less because a single instruction takes several clock cycles.

What NSA suggested is an excellent alternative, provided the MCU itself has enough logic gates in it that can be adapted to your needs.
I too am really intrigued with this idea.

Right now my circuit is:
(3) 3-bit shift registers
(2) 2-input XOR gates
(8) 2-input OR gates
(1) inverter
(3) 3-to-8 decoders (each equivalent to 8 inverters and 8 3-input AND gates)
(44) 3-input AND gates
(15) 3-input OR gates

In addition, I estimate the final stage of the circuit will add (4) 3-input OR gates.

There is a mix of combinational and sequential logic.
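Adding that up using my own equivalences above, it's roughly 9 flip-flops (the three 3-bit shift registers) plus about 122 gates (2 XOR + 27 OR + 1 inverter + 44 AND + 48 for the three decoders), so on the order of 130 logic elements total - just a rough tally to compare against whatever the CLCs can hold.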

Does this seem reasonable to accomplish in those CLCs? I see specs on the number of input/outputs to each CLC, but not on the number of gates that can be built. I'm sure it is out there, but I haven't found it yet.

If you want to see the circuit, go here: https://www.falstad.com/circuit/

You can open the circuit in the web browser, but I like to download the offline one (link below the applet window). I have attached a circuit file below which can be opened in this applet. Alt-Click and drag allows you to pan.
 

Attachments

cmartinez

Joined Jan 17, 2007
8,252
By the way, are you familiar with Digital Works? It's a free software that's very easy to learn and lets you play around with digital gates and all of its related logic. It won't let you know about the real world circuit delay stuff that you're trying to solve. But it's an excellent learning tool.
 

Thread Starter

mcardoso

Joined May 19, 2020
226
By the way, are you familiar with Digital Works? It's a free software that's very easy to learn and lets you play around with digital gates and all of its related logic. It won't let you know about the real world circuit delay stuff that you're trying to solve. But it's an excellent learning tool.
I was not until @dl324 showed it to me a few days ago. It looks to do the same kind of things as the circuit applet that I have been using. I used that applet in college quite a lot to understand my assignments. Do you think Digital Works is superior?

I should try to download LTSpice and start using it. It looks to be a much more capable piece of software than what I use and would predict timing delays.
 

cmartinez

Joined Jan 17, 2007
8,252
I should try to download LTSpice and start using it. It looks to be a much more capable piece of software than what I use and would predict timing delays.
You're right about that. It's just that LTSpice has a steeper learning curve. But it's very much worth the effort.
Only thing is I'm not sure if LTSpice has your particular chips in its library. I guess it's a matter of looking around.
 

Thread Starter

mcardoso

Joined May 19, 2020
226
Unless I am misunderstanding it, the CLC capabilities of the PIC micro controllers fall quite short of what I need to create this project. If I understand it right, there are 8 CLC blocks per processor and each one can only perform 2 gates of logic on 4 inputs with one output, or 1 bit of memory. That would require a LOT of PIC controllers side by side to accomplish the task.

Alternatively it looks like a Cypress PSoC has a much wider capability, but I really didn't understand what it did after paging through the datasheet.
 