Hazard (logic)
In digital logic, a hazard in a system is an undesirable effect caused by either a deficiency in the system or external influences. Logic hazards are manifestations of a problem in which changes in the input variables do not change the output correctly, due to some form of delay caused by logic elements (NOT, AND, OR gates, etc.). This results in the logic not performing its function properly. The three most common kinds of hazards are usually referred to as static, dynamic and function hazards.
Hazards are a temporary problem, as the logic circuit will eventually settle to the desired function. However, despite the logic arriving at the correct output, it is imperative that hazards be eliminated as they can have an effect on other connected systems.
Static hazards
A static hazard is the situation where, when one input variable changes, the output changes momentarily before stabilizing to the correct value. There are two types of static hazards:
- Static-1 Hazard: the output is currently 1 and, after the inputs change, the output momentarily changes to 0 before settling on 1.
- Static-0 Hazard: the output is currently 0 and, after the inputs change, the output momentarily changes to 1 before settling on 0.
In properly formed two-level AND-OR logic based on a Sum Of Products expression, there will be no static-0 hazards. Conversely, there will be no static-1 hazards in an OR-AND implementation of a Product Of Sums expression.
The most commonly used method to eliminate static hazards is to add redundant logic (consensus terms in the logic expression).
Example of a static hazard
Let us consider an imperfect circuit that suffers from delays in its physical logic elements (AND gates, etc.). The simple circuit implements the function:
f = X1 * X2 + X1' * X3
If we first look at the starting diagram, it is clear that if no delays were to occur, the circuit would function normally. However, since this is not a perfect circuit, an error occurs when the input changes from 111 to 011, i.e. when X1 changes state.
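To make this concrete, below is a minimal sketch in Python, assuming a simple transport-delay model in which every gate output lags its inputs by one time step; the unit delays and the moment at which X1 falls are assumptions chosen only for illustration.

# Minimal sketch of the circuit under a transport-delay model: every gate
# output reflects the value its inputs had one time step earlier.  The unit
# delays and the time at which X1 falls are assumptions for illustration.

def simulate(steps=8):
    # Input waveform: X1 drops from 1 to 0 at t = 2; X2 and X3 stay at 1.
    x1 = lambda t: 1 if t < 2 else 0
    x2 = lambda t: 1
    x3 = lambda t: 1

    # Gate outputs, each lagging their inputs by one step.
    not_x1 = lambda t: 1 - x1(t - 1)                 # X1'
    and_a  = lambda t: x1(t - 1) & x2(t - 1)         # X1 * X2
    and_b  = lambda t: not_x1(t - 1) & x3(t - 1)     # X1' * X3
    f      = lambda t: and_a(t - 1) | and_b(t - 1)   # X1*X2 + X1'*X3

    for t in range(steps):
        print(f"t={t}  X1={x1(t)}  f={f(t)}")

simulate()

Running this prints a momentary f = 0 a couple of steps after X1 falls, even though both 111 and 011 should give f = 1; that dip is the static-1 hazard.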
Now that we know roughly how the hazard occurs, for a clearer picture and a way to solve the problem, we look at the Karnaugh map.
The two gates are shown by the solid rings, and the hazard can be seen under the dashed ring. Huffman's theory tells us that adding the redundant loop 'X2X3' (the consensus term) will eliminate the hazard.
So our original function is now: f = X1 * X2 + X1' * X3 + X2 * X3
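A quick check of the fix, under the same assumed delay model: at the worst-case instant X1 has already fallen but the delayed inverter still reports X1' = 0, so both original product terms are 0, while the added consensus term X2 * X3 holds the output at 1.

# Worst-case instant during the 111 -> 011 transition: X1 has fallen to 0
# but the delayed inverter output (standing in for X1') is still 0.
x1, x1_not_delayed, x2, x3 = 0, 0, 1, 1

f_original  = (x1 & x2) | (x1_not_delayed & x3)               # 0 -> the glitch
f_consensus = (x1 & x2) | (x1_not_delayed & x3) | (x2 & x3)   # 1 -> no glitch

print(f_original, f_consensus)   # prints: 0 1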
Now we can see that even with imperfect logic elements, our example will not show signs of hazards when X1 changes state. This theory can be applied to any logic system. Computer programs handle most of this work now, but for simple examples it is quicker to do the debugging by hand. When there are many input variables (say six or more) it becomes quite difficult to 'see' the errors on a Karnaugh map.
Dynamic hazards
A dynamic hazard is the possibility of an output changing more than once as a result of a single input change. Dynamic hazards often occur in larger logic circuits where there are different routes from the input to the output. If each route has a different delay, then it quickly becomes clear that there is the potential for output values that differ from the required or expected output. For example, a logic circuit is meant to change output state from 1 to 0, but instead changes from 1 to 0, then back to 1, and finally rests at the correct value 0. This is a dynamic hazard.
As a rule, dynamic hazards are more complex to resolve, but note that if all static hazards have been eliminated from a circuit, then dynamic hazards cannot occur.
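To illustrate both points, here is a sketch of a hypothetical circuit (not one taken from this article): a sub-network g = A*B + A'*C that still contains a static-1 hazard feeds an AND gate together with a faster inverted copy of A, and a single change of A then makes the final output toggle more than once before settling. The gate names and delays are assumptions chosen only to make the effect visible.

# Hypothetical circuit with a dynamic hazard (transport-delay model again):
#   g = A*B + A'*C   still has a static-1 hazard because its inverter is slow
#   f = g * A'       where this outer A' comes from a faster inverter

def simulate(steps=10):
    a = lambda t: 1 if t < 0 else 0          # A falls from 1 to 0 at t = 0
    b = lambda t: 1                          # B and C are held at 1
    c = lambda t: 1

    slow_not_a = lambda t: 1 - a(t - 2)      # slow inverter inside g
    fast_not_a = lambda t: 1 - a(t - 1)      # faster inverter feeding the output AND
    and1 = lambda t: a(t - 1) & b(t - 1)               # A * B
    and2 = lambda t: slow_not_a(t - 1) & c(t - 1)      # A' * C
    g    = lambda t: and1(t - 1) | and2(t - 1)
    f    = lambda t: g(t - 1) & fast_not_a(t - 1)

    for t in range(steps):
        print(f"t={t}  A={a(t)}  g={g(t)}  f={f(t)}")

simulate()

The single fall of A makes f go to 0, then briefly to 1, back to 0 and finally settle at 1: more than one output change for one input change. The glitch originates as a static-1 hazard on the internal signal g; eliminating that static hazard, as in the previous section, would also remove the dynamic hazard on f.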
Function hazards
Function hazards are non-solvable hazards which occur when more than one input variable changes at the same time. Hazards such as these cannot be logically eliminated, as the problem lies with the actual specification of the circuit. The only real way to avoid such problems is to restrict the changing of input variables so that only one input changes at any given time. Restrictions are not always possible, however. For instance, imagine a logic circuit that has two inputs: one is used for a clock signal, and the other is connected to a random noise source that we wish to measure. It should be clear that restrictions in this case would not be an effective solution.
The simplest example of this is the exclusive-or function.
In this scenario it is quite difficult to see how a hazard could occur if the circuit is built on the same couple of chips. However, let us imagine that a circuit designer has split this function across different chips, i.e. one NOT gate on one chip and the other NOT gate implemented on another chip somewhere across the PCB.
Let us set up the initial state of our circuit: A = 1, B = 0. Now let us say there is a delay in the NOT gate marked (X). The inputs then change simultaneously so that A = 0 and B = 1 (remember that in an equally delayed or perfect circuit, the output would match the specification).
If we trace what the circuit does while the output of NOT gate X is held unchanged (this simulates the delay in gate X), it should be clear that the output of the circuit changes. Once the output of NOT gate X finally updates, the circuit returns to the proper state.
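A simplified way to replay this thought experiment in code, assuming the exclusive-or is built in the two-level form A*B' + A'*B and that the delayed gate X is the inverter on A (the original figure is not reproduced here, so that choice is an assumption):

def xor_two_level(a, b, not_a, not_b):
    # XOR written in two-level form: A*B' + A'*B
    return (a & not_b) | (not_a & b)

# Before the change: A = 1, B = 0, both inverter outputs are up to date.
print(xor_two_level(1, 0, not_a=0, not_b=1))   # 1  (correct)

# Just after the change: A = 0, B = 1; NOT B has updated but gate X (NOT A)
# still holds its stale output of 0.
print(xor_two_level(0, 1, not_a=0, not_b=0))   # 0  (momentary wrong output)

# Once gate X catches up, NOT A = 1 and the output settles back to 1.
print(xor_two_level(0, 1, not_a=1, not_b=0))   # 1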
The most effective way to solve this hazard would be to design the PCB carefully so that the delays are all equal, or at least so that the delays on each path match, i.e. the delay of A's path equals the delay of B's path. Adding more gates to the circuit by the same methods described for static and dynamic hazards will not work, as Huffman's method cannot be applied.