# THE TRUTH, THE WHOLE TRUTH AND NOTHING BUT THE TRUTH Part 3 of 3

---

This article has three subsections: A, B and C.

In the previous articles, we consolidated our knowledge of how connectives behave by looking at some complex examples of expressions involving multiple connectives. We also used truth tables to examine the truth and falsehood of compound statements by listing all possible combinations of their constituent statements. Since article 1 of this series, we have been using variables to represent our propositions. This makes compound statements more compact and manageable for programmers to work on, and easier for machines to process. Using variables makes the process more scalable and mathematical.

Now we are going to go further by looking at logic gates and the fundamental laws of logic. This will complete your introduction to the way a machine interprets our world!

# Section A: Logic Gates

In this section, we will explain how logic applies to the electronic circuits inside machines. We begin our journey with the fundamental gates: the AND, OR and NOT gates. After that, we will explore combinations of fundamental gates, namely the NAND and NOR gates, and we will look at their truth tables as well.

What’s a logic circuit and what are gates?

Logic gates are built using different combinations of transistors, which are basically electrically controlled switches. The circuit that contains the logic gates can be a theoretical one (as in many computing or mathematical problems) or a real one with actual components (as in electronic engineering). Either way, you have input signals, gates and output signals. Your outputs will depend on the gate and the inputs.

The states 0 and 1 now represent off and on, with the output (off or on) being the resultant behaviour or state of the relevant component or machine.

Your goal is to design a circuit that will produce the required outputs for the given inputs, so you choose an arrangement of logic gates that accomplishes this.

What are the basic or standard gates?

There are a few basic logic gates in standard use. These mimic our logical operators in their action on the input signals, so each gate's table of inputs and outputs will look very similar to the corresponding truth table.

**NOT**

This gate has one input signal and one output signal. The NOT gate is also called an inverter because it flips the state of the signal to the opposite. We saw this in the reversal of the truth state of a proposition when looking at truth tables.

| Input | Output |
|---|---|
| 0 | 1 |
| 1 | 0 |

**AND**

This gate has two input signals and one output signal. As with the AND operator, the output signal is 1 only when both inputs are 1. In all other cases (one or both being 0) the output is 0. Again, this resembles the truth table for the AND operator.

| Input1 | Input2 | Output |
|---|---|---|
| 1 | 1 | 1 |
| 1 | 0 | 0 |
| 0 | 1 | 0 |
| 0 | 0 | 0 |

**OR**

This gate has two input signals and one output signal. As with the OR operator, the output signal is 0 only when both inputs are 0. In all other cases (one or both being 1) the output is 1. Again, the table of inputs and outputs resembles the truth table for the OR operator.

| Input1 | Input2 | Output |
|---|---|---|
| 1 | 1 | 1 |
| 1 | 0 | 1 |
| 0 | 1 | 1 |
| 0 | 0 | 0 |

**Implication.** There are no standard gate equivalents for the implication operators, but you can mimic their effect by combining gates. We'll see this when we look at the logic laws.

**NAND** gates: the equivalent of an AND followed by a NOT

This gate has two input signals and one output signal. The output signal is 0 only when both inputs are 1. In all other cases (one or both being 0) the output is 1.

| Input1 | Input2 | Output |
|---|---|---|
| 1 | 1 | 0 |
| 1 | 0 | 1 |
| 0 | 1 | 1 |
| 0 | 0 | 1 |

**NOR** gates: the equivalent of an OR followed by a NOT

This gate has two input signals and one output signal. The output signal is 1 only when both inputs are 0. In all other cases (one or both being 1) the output is 0.

| Input1 | Input2 | Output |
|---|---|---|
| 1 | 1 | 0 |
| 1 | 0 | 0 |
| 0 | 1 | 0 |
| 0 | 0 | 1 |
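To make the behaviour of these gates concrete, here is a minimal sketch in Python (my own illustration, not how real hardware is designed): each gate is modelled as a function from input bits to an output bit, and the loop prints each two-input gate's table of inputs and outputs.

```python
# Each gate modelled as a function on bits (0 or 1).

def NOT(a):
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def NAND(a, b):
    return NOT(AND(a, b))

def NOR(a, b):
    return NOT(OR(a, b))

# Print the truth table for each two-input gate.
for gate in (AND, OR, NAND, NOR):
    print(gate.__name__)
    for a in (1, 0):
        for b in (1, 0):
            print(a, b, gate(a, b))
```

Note that NAND and NOR are defined here exactly as described above: an AND or OR followed by a NOT.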

**NAND as a universal gate**

A NAND gate is a universal gate, meaning that any other gate can be represented as a combination of NAND gates.

**NAND**

Remember that a NAND gate outputs 0 when both inputs are 1, and outputs 1 for any other input combination.

| Input1 | Input2 | Output (Input1 NAND Input2) |
|---|---|---|
| 1 | 1 | 0 |
| 1 | 0 | 1 |
| 0 | 1 | 1 |
| 0 | 0 | 1 |

**NOT**

A NOT gate is made by making the inputs of a NAND gate equal (i.e. both 1 or both 0). This can be done by splitting one signal into two and feeding both copies into the NAND gate.

Truth Table

| Input A | Output Q |
|---|---|
| 0 | 1 |
| 1 | 0 |

| Original signal A | NAND input 1 | NAND input 2 | NAND output (= NOT A) |
|---|---|---|---|
| 1 | 1 | 1 | 0 |
| 0 | 0 | 0 | 1 |

**AND**

An AND gate is made by inverting the output of a NAND gate, as shown below. We use the outputs of two NAND gates as the inputs of a third. The two original NAND gates both take the signals A and B as inputs (A and B are each split to go through both gates at once), so their outputs are equal. These equal outputs feed the third NAND, which then acts as a NOT, inverting the NAND effect.

Truth Table

| Input A | Input B | Output Q |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 0 |
| 1 | 0 | 0 |
| 1 | 1 | 1 |

| A | B | 1 = A NAND B | 2 = A NAND B | 1 NAND 2 (= A AND B) |
|---|---|---|---|---|
| 1 | 1 | 0 | 0 | 1 |
| 1 | 0 | 1 | 1 | 0 |
| 0 | 1 | 1 | 1 | 0 |
| 0 | 0 | 1 | 1 | 0 |

**OR**

If the truth table for a NAND gate is examined, it can be seen that if either input is 0, then the output will be 1. To be an OR gate, however, the output must be 1 if either input is 1. This can be achieved by inverting the inputs. To invert them, we apply the idea behind the NOT gate as a NAND with equal inputs: we construct NOT (inverter) gates for the two separate signals A and B, then use their outputs as the inputs of a third NAND gate.

Truth Table

| Input A | Input B | Output Q |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 1 |

| A | B | A1 | A2 | B1 | B2 | 1 = A1 NAND A2 | 2 = B1 NAND B2 | 1 NAND 2 (= A OR B) |
|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 |
| 1 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 1 |
| 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 1 |
| 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 |

**NOR**

A NOR gate is an OR gate with an inverted output: the output is 1 only when both input A and input B are 0. Basically, we build the OR construction above, split its output into two, and make these the inputs of a final NAND gate, which inverts them (acting as a NOT). The original signals A and B are split to go through all the initial gates.

Truth Table

| Input A | Input B | Output Q |
|---|---|---|
| 0 | 0 | 1 |
| 0 | 1 | 0 |
| 1 | 0 | 0 |
| 1 | 1 | 0 |

| A | B | A OR B | NAND input 1 | NAND input 2 | 1 NAND 2 (= A NOR B) |
|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 0 |
| 1 | 0 | 1 | 1 | 1 | 0 |
| 0 | 1 | 1 | 1 | 1 | 0 |
| 0 | 0 | 0 | 0 | 0 | 1 |
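All of the NAND-only constructions above can be checked mechanically. In this sketch (a hedged illustration; the function names are my own), NAND is the only primitive, and NOT, AND, OR and NOR are wired exactly as described, by splitting signals into NAND gates:

```python
# NAND is the only primitive; everything else is wired from it.

def NAND(a, b):
    return 0 if (a == 1 and b == 1) else 1

def NOT_(a):
    # Split one signal into both inputs of a single NAND gate.
    return NAND(a, a)

def AND_(a, b):
    # Two NAND gates each take A and B, so their outputs are equal;
    # a third NAND then acts as a NOT, inverting the NAND effect.
    out1 = NAND(a, b)
    out2 = NAND(a, b)
    return NAND(out1, out2)

def OR_(a, b):
    # Invert each input (NAND as NOT), then NAND the inverted signals.
    return NAND(NAND(a, a), NAND(b, b))

def NOR_(a, b):
    # The OR construction, with its output split into a final NAND (a NOT).
    out = OR_(a, b)
    return NAND(out, out)

# Check every construction against the expected truth tables.
for a in (0, 1):
    assert NOT_(a) == 1 - a
    for b in (0, 1):
        assert AND_(a, b) == (a & b)
        assert OR_(a, b) == (a | b)
        assert NOR_(a, b) == 1 - (a | b)
```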

As you can see, there are often several options when it comes to constructing a logic circuit. So how do we choose between them?

Like many engineering fields, electronics favours the most efficient and cost-effective construction possible. In this case, you'd want the option that uses the fewest gates and connections.

While it can be useful to be able to make any gate from NAND gates, it is certainly not the most efficient or cost-effective way to construct a circuit. You need three NAND gates to do the job of one OR gate.

While this is one example of equivalent constructions, there are many more. Collectively referred to as the logic laws, these work for output signals from circuits as well as for the truth of composite propositions.

The operator with no standard gate equivalent is IMPLICATION. The logic laws show us a way of modelling this from NOT and OR. See section B for all this and more!

# Section B: Logic Laws

We’ve covered how to represent a proposition as variables and symbols, then how to construct the corresponding truth table. For some operations, we’ve even looked at the representative circuit with logic gates. There are many equivalent ways to combine propositions for the same truth state or signal output. We’ve met some of them informally already, but in this section we’ll cover the actual laws that underpin Boolean logic. We can use truth tables and gates to help us visualise, prove and understand the laws.

This will help us solve problems such as

“Prove that ~(q → p) ∨ (p ∧ q) is equivalent to just q” (done in Section C)

and help us show formally that some expressions and circuits are equivalent (enabling us to choose the simplest or most efficient one).

As with truth tables, before we get to applying anything we must first cover the fundamentals. Here, these are the laws themselves.

Note that any law can be used in either direction, depending on which will simplify the expression or lead to a form that can be simplified.

**Double negation**

~(~P) ↔ P

This means that negating a proposition twice gets you back the original.

If you invert a signal twice you get the original signal back

| P | ~P | ~~P |
|---|---|---|
| 1 | 0 | 1 |
| 0 | 1 | 0 |

**Domination laws**

The domination laws show you where one signal can overpower another, making it more efficient to have just one (as the other is redundant).

The first domination law is P v TRUE ↔ TRUE

The true proposition is always true, so whatever the truth state of P you always have at least one true proposition so the output is true/1.

| P | TRUE | P∨TRUE |
|---|---|---|
| 1 | 1 | 1 |
| 0 | 1 | 1 |

Another domination law is P ∧ FALSE ↔ FALSE

The false proposition is always false, so you’ll never have both propositions being true. Thus the output is always false/0.

| P | FALSE | P∧FALSE |
|---|---|---|
| 1 | 0 | 0 |
| 0 | 0 | 0 |

**Identity laws**

The identity laws also show you where one signal can overpower another, making it more efficient to have just one (as the other is redundant).

The first identity law is P ∧ TRUE ↔ P

The true proposition is always true, so the truth of the composite depends only on the truth of P. This is because P needs to be true for them both to be true.

| P | TRUE | P∧TRUE |
|---|---|---|
| 1 | 1 | 1 |
| 0 | 1 | 0 |

Another identity law is P v FALSE ↔ P

The false proposition is always false, so the truth of the composite depends only on the truth of P. This is because P is the only one that can be true and so provide at least one true.

| P | FALSE | P∨FALSE |
|---|---|---|
| 1 | 0 | 1 |
| 0 | 0 | 0 |

In both cases the composite column is identical to the P column.

**Idempotent laws**

These allow you to simplify an expression containing copies of the same proposition.

These are P ∧ P ↔ P and P v P ↔ P.

Applying AND or OR to identical propositions or signals will not alter anything. The output will be the same signal, so the gate does nothing. You may think that this law is trivial, but think of how a computer would waste time treating copies of a signal as different without this law to simplify things. This law is analogous to cancelling down or removing common factors in algebra: a powerful law that we can take for granted.

Here is the truth table that shows this: the P∨P and P∧P columns are identical to the P column.

| P | P | P∨P | P∧P |
|---|---|---|---|
| 1 | 1 | 1 | 1 |
| 0 | 0 | 0 | 0 |

**Commutative laws**

Again these may seem trivial, but it is very important that we and machines know that, for AND and OR, the order of the propositions doesn’t matter. Multiplication and addition are commutative operations that we learn early on. When we learn algebra we appreciate that commutativity allows us to simplify an expression by combining like terms.

The idea is the same here. The commutative laws allow us to consider expressions with the propositions the other way round as the same, then other laws such as the idempotent law can be used to simplify things.

If you’re not convinced of the importance of this, then consider how odd it would be if they were not equivalent (as in non-commutative operations like subtraction, division, or matrix multiplication).

Luckily, commutativity is a property we can take for granted. It also means that the order that the signals are considered in doesn’t affect the output, which is handy.

Multiplication and addition

2 x 3 = 3 x 2

1 + 2 = 2 + 1

Logic commutativity

P v Q ↔ Q v P

P ∧ Q ↔ Q ∧ P

**Complement laws**

Complement is the same as negation, so these laws concern a proposition and its negation. In terms of signals, we can consider this as opposing signals. If you’ve studied A Level (Higher, High school) Physics, you may see the similarities with wave superposition or opposing current flows.

It should be clear that a proposition and its negation can’t both be true, but one of them will always be true (as one is true where the other is false).

Thus we have the complement laws

1. P ∧ ~P ↔ FALSE (always 0)

| P | ~P | P∧~P |
|---|---|---|
| 1 | 0 | 0 |
| 0 | 1 | 0 |

2. P v ~P ↔ TRUE (always 1)

| P | ~P | P∨~P |
|---|---|---|
| 1 | 0 | 1 |
| 0 | 1 | 1 |

**Associative laws**

When doing AND and OR operations with 3 propositions or more, it is very useful to know that it doesn’t matter which pairs you consider first. As with the commutative laws, it would be odd if this did matter, and these too are laws that you met with multiplication and addition. This is the power of associativity. Because the order you consider the pairs in doesn’t matter, you don’t necessarily need the brackets. This law is what allowed us to extend the OR and AND operator behaviour to any number of propositions.

Multiplication and addition

2 x (3 x 5) = (2 x 3) x 5 = 2 x 3 x 5

1 + (2 + 3) = (1 + 2) + 3 = 1 + 2 + 3

Logic associativity

P v (Q v R) ↔ (P v Q) v R ↔ P v Q v R

P **∧** (Q **∧** R) ↔ (P **∧** Q) **∧** R ↔ P **∧** Q **∧** R

Here are the truth tables as proof (in each table, columns 5, 7 and 8 are the same).

1. For P ∨ (Q ∨ R) ↔ (P ∨ Q) ∨ R ↔ P ∨ Q ∨ R

| P | Q | R | P∨Q | (P∨Q)∨R | Q∨R | P∨(Q∨R) | P∨Q∨R |
|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 |
| 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 |
| 1 | 0 | 0 | 1 | 1 | 0 | 1 | 1 |
| 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 |
| 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 |
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |

2. For P ∧ (Q ∧ R) ↔ (P ∧ Q) ∧ R ↔ P ∧ Q ∧ R

| P | Q | R | P∧Q | (P∧Q)∧R | Q∧R | P∧(Q∧R) | P∧Q∧R |
|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 |
| 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
| 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 |
| 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |

**Distributive laws**

In algebra, this property was disguised as expanding brackets: multiplying a sum is equivalent to adding the individual products.

a (b + c) = ab + ac

2 (x + y)= 2x + 2y

This law can be used either way. Sometimes it can be easier to work with the factorised expression and other times the expanded out one is more helpful.

Similarly, the logical form can be used either way. It is very useful when you have both AND and OR operators in your expression. Which way you use it depends on what the rest of the expression looks like, as the aim is to be able to simplify it further using other laws. We’ll see some examples in video 7.

Logical form

P ∨ (Q ∧ R) ↔ (P ∨ Q) ∧ (P ∨ R)

P ∧ (Q ∨ R) ↔ (P ∧ Q) ∨ (P ∧ R)

Proving these is best with truth tables

*1. P ∨ (Q ∧ R) ↔ (P ∨ Q) ∧ (P ∨ R)*

Firstly we look at P v (Q ∧ R)

| P | Q | R | Q∧R | P∨(Q∧R) |
|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 |
| 1 | 1 | 0 | 0 | 1 |
| 1 | 0 | 1 | 0 | 1 |
| 1 | 0 | 0 | 0 | 1 |
| 0 | 1 | 1 | 1 | 1 |
| 0 | 1 | 0 | 0 | 0 |
| 0 | 0 | 1 | 0 | 0 |
| 0 | 0 | 0 | 0 | 0 |

Now we look at (P V Q) ∧ (P V R)

| P | Q | R | P∨Q | P∨R | (P∨Q)∧(P∨R) |
|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 |
| 1 | 1 | 0 | 1 | 1 | 1 |
| 1 | 0 | 1 | 1 | 1 | 1 |
| 1 | 0 | 0 | 1 | 1 | 1 |
| 0 | 1 | 1 | 1 | 1 | 1 |
| 0 | 1 | 0 | 1 | 0 | 0 |
| 0 | 0 | 1 | 0 | 1 | 0 |
| 0 | 0 | 0 | 0 | 0 | 0 |

Thus we have the same output and these are equivalent.

We can then view the following setups as equivalent gate diagrams

- Take the output from an OR gate with inputs Q and R, and put that through an AND gate with P
- Take the outputs from 2 OR gates (one with inputs P and Q and the other with inputs P and R) and make them the inputs of an AND gate

The first setup contains just 2 gates so it’s more efficient than the second.

*2. P ∧ (Q ∨ R) ↔ (P ∧ Q) ∨ (P ∧ R)*

Firstly, we look at P ∧ (Q v R)

| P | Q | R | Q∨R | P∧(Q∨R) |
|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 |
| 1 | 1 | 0 | 1 | 1 |
| 1 | 0 | 1 | 1 | 1 |
| 1 | 0 | 0 | 0 | 0 |
| 0 | 1 | 1 | 1 | 0 |
| 0 | 1 | 0 | 1 | 0 |
| 0 | 0 | 1 | 1 | 0 |
| 0 | 0 | 0 | 0 | 0 |

Now we look at (P ∧ Q) v (P ∧ R)

| P | Q | R | P∧Q | P∧R | (P∧Q)∨(P∧R) |
|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 |
| 1 | 1 | 0 | 1 | 0 | 1 |
| 1 | 0 | 1 | 0 | 1 | 1 |
| 1 | 0 | 0 | 0 | 0 | 0 |
| 0 | 1 | 1 | 0 | 0 | 0 |
| 0 | 1 | 0 | 0 | 0 | 0 |
| 0 | 0 | 1 | 0 | 0 | 0 |
| 0 | 0 | 0 | 0 | 0 | 0 |

Thus we have the same output and these are equivalent.

We can then view the following setups as equivalent gate diagrams

- Take the output from an AND gate with inputs Q and R, and put that through an OR gate with P
- Take the outputs from 2 AND gates (one with inputs P and Q and the other with inputs P and R) and make them the inputs of an OR gate

The first setup contains just 2 gates so it’s more efficient than the second.
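Both distributive laws can also be confirmed by brute force over all eight input combinations. A short Python sketch (my own check, using Python's built-in `and`/`or`):

```python
from itertools import product

# Check both distributive laws for every combination of truth values.
ok = True
for p, q, r in product((False, True), repeat=3):
    # P v (Q ^ R)  <->  (P v Q) ^ (P v R)
    ok &= (p or (q and r)) == ((p or q) and (p or r))
    # P ^ (Q v R)  <->  (P ^ Q) v (P ^ R)
    ok &= (p and (q or r)) == ((p and q) or (p and r))
print("Both distributive laws hold:", ok)
```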

**Absorption laws**

As an extension of the Distributive laws, we have that

P ∧ (P ∨ Q) ↔ P

and

P ∨ (P ∧ Q) ↔ P

*Proof of P v (P ∧ Q) ↔ P*

P v (P ∧ Q)

Step 1- Use the Identity law in reverse, replacing the first P with P ∧ TRUE

(P ∧ TRUE) v ( P ∧ Q)

Step 2- Use the Distributive law to factor out the P, as both brackets contain P and there are both OR and AND operators

P ∧ (TRUE v Q)

Step 3- Use the Domination law to replace TRUE v Q with just TRUE

P ∧ TRUE

Step 4- Use the Identity law to give that P ∧ TRUE is equivalent to P

P

*Proof of P ∧ (P v Q) ↔ P*

P ∧ (P v Q)

Step 1- Use the Distributive law to expand this out

(P ∧ P) v (P ∧ Q)

Step 2- Use the Idempotent law to replace P ∧ P with just P

P v (P ∧ Q)

You’ll notice we proved that this is equivalent to P above

P

A truth table acts as further proof of these laws: the P, P∨(P∧Q) and P∧(P∨Q) columns (columns 1, 4 and 6) are all identical.

| P | Q | P∧Q | P∨(P∧Q) | P∨Q | P∧(P∨Q) |
|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 |
| 1 | 0 | 0 | 1 | 1 | 1 |
| 0 | 1 | 0 | 0 | 1 | 0 |
| 0 | 0 | 0 | 0 | 0 | 0 |

Hopefully this establishes the equivalence of these.

The P v Q and P ∧ Q parts then add no value to the expression, which would make them redundant parts of a circuit.

**Conditional Identity**

This law formalises the behaviour of an IMPLICATION operator.

We have already seen that an IMPLICATION P → Q is true where Q is true or where both P and Q are false.

| P | Q | P → Q |
|---|---|---|
| 1 | 1 | 1 |
| 1 | 0 | 0 |
| 0 | 1 | 1 |
| 0 | 0 | 1 |

The law below summarises this. It says that the expression P → Q is equivalent to the requirement that either P is false or Q is true (or both).

(P → Q) ↔ ~P ∨ Q

As proof, here is the table for the ~P ∨ Q part.

| P | Q | ~P | ~P ∨ Q |
|---|---|---|---|
| 1 | 1 | 0 | 1 |
| 1 | 0 | 0 | 0 |
| 0 | 1 | 1 | 1 |
| 0 | 0 | 1 | 1 |

This is the same as the implication table, so they’re equivalent.

If you want a circuit where a signal P implies a signal Q, you’ll need to pass P through a NOT gate, then pass the output of this through an OR gate with Q.
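The conditional identity translates directly into that NOT-then-OR wiring. A small sketch (the function names are my own):

```python
def NOT(a):
    return 1 - a

def OR(a, b):
    return a | b

def IMPLIES(p, q):
    # P -> Q modelled as (NOT P) OR Q, per the conditional identity.
    return OR(NOT(p), q)

# Print the implication truth table.
for p in (1, 0):
    for q in (1, 0):
        print(p, q, IMPLIES(p, q))
```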

**Extension of the Conditional Identity to an equivalent for the biconditional**

Of course, we have that the biconditional P ↔ Q is true when both one-way implications are simultaneously true.

Thus, (P ↔ Q) ↔ (P→ Q) ∧ (Q → P) is one law concerning the biconditional.

This happens when P and Q have the same truth value (both 1 or both 0), as shown in the table.

| P | Q | P ↔ Q |
|---|---|---|
| 1 | 1 | 1 |
| 1 | 0 | 0 |
| 0 | 1 | 0 |
| 0 | 0 | 1 |

Then we need either P and Q both true, or P and Q both false (that is, ~P and ~Q both true).

Another useful law concerning the biconditional is then

(P ↔ Q) ↔ (P ∧ Q) V (~P ∧ ~Q)

This can be proven by constructing the table for (P ∧ Q) V (~P ∧ ~Q), as below

| P | Q | ~P | ~Q | P∧Q | ~P∧~Q | (P∧Q)∨(~P∧~Q) |
|---|---|---|---|---|---|---|
| 1 | 1 | 0 | 0 | 1 | 0 | 1 |
| 1 | 0 | 0 | 1 | 0 | 0 | 0 |
| 0 | 1 | 1 | 0 | 0 | 0 | 0 |
| 0 | 0 | 1 | 1 | 0 | 1 | 1 |

This is the same as the biconditional table, so they’re equivalent.

If you want a circuit where signals P and Q have a two-way implication (making P and Q equivalent), you'll need to either

- Construct an IMPLICATION circuit for P → Q and one for Q → P (each will need a NOT and an OR), then pass the outputs through an AND gate.
- Pass P and Q through an AND gate; pass P and Q through NOT gates and their outputs through an AND gate; then make the outputs from both AND gates the inputs of an OR gate.

Both setups involve 5 gates, but there could be differences in the number of connections and so the possibility of interference.
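Both setups can be checked against each other with a short sketch (gate functions and names are my own stand-ins for the hardware):

```python
from itertools import product

def NOT(a):
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def biconditional_v1(p, q):
    # (P -> Q) AND (Q -> P), each implication built from a NOT and an OR.
    return AND(OR(NOT(p), q), OR(NOT(q), p))

def biconditional_v2(p, q):
    # (P AND Q) OR (NOT P AND NOT Q).
    return OR(AND(p, q), AND(NOT(p), NOT(q)))

# Both constructions agree, and output 1 exactly when P equals Q.
for p, q in product((0, 1), repeat=2):
    assert biconditional_v1(p, q) == biconditional_v2(p, q) == (1 if p == q else 0)
```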

**De Morgan’s laws**

These laws encapsulate the difference between ‘neither’ and ‘not both’ in English. Have you spent time thinking about what these mean logically? Bear with me and try it now…

‘Neither’ means that both are false; ‘not both’ means that at least one is false (which means they aren’t both true).

‘Neither’ is viewed as the negation of ‘either’ and ‘not both’ as the negation of ‘both’.

This may initially seem strange, as ‘both’ involves AND while ‘not both’ involves OR and ‘either’ involves OR while ‘neither’ involves AND.

Again, bear with me, as this isn’t the problem it appears to be. De Morgan’s laws straighten it all out. I hope that doing the worded treatment first helps convince you that the following logic works.

In symbols,

either = P ∨ Q

both = P ∧ Q

neither = ~P ∧ ~Q (both false)

not both = ~P ∨ ~Q (at least one false)

So, if ‘neither’ is the negation of ‘either’ and ‘not both’ is the negation of ‘both’, we have

~ (P v Q) ↔ ~P ∧ ~Q (for neither)

~ (P ∧ Q) ↔ ~P v ~Q (for not both)

These are De Morgan’s laws.

While the worded explanation helps, it wouldn’t hurt to prove this formally with a truth table

1. ~(P ∨ Q) ↔ ~P ∧ ~Q (columns 4 and 7 of the table are equivalent)

| P | Q | P∨Q | ~(P∨Q) | ~P | ~Q | ~P∧~Q |
|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 0 | 0 | 0 | 0 |
| 1 | 0 | 1 | 0 | 0 | 1 | 0 |
| 0 | 1 | 1 | 0 | 1 | 0 | 0 |
| 0 | 0 | 0 | 1 | 1 | 1 | 1 |

Hopefully this establishes the two as equivalent.

2. ~(P ∧ Q) ↔ ~P ∨ ~Q (columns 4 and 7 of the table are equivalent)

| P | Q | P∧Q | ~(P∧Q) | ~P | ~Q | ~P∨~Q |
|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 0 | 0 | 0 | 0 |
| 1 | 0 | 0 | 1 | 0 | 1 | 1 |
| 0 | 1 | 0 | 1 | 1 | 0 | 1 |
| 0 | 0 | 0 | 1 | 1 | 1 | 1 |

Again, this hopefully establishes the equivalence.

From these laws, we can say that the following are true about the corresponding gate diagrams

- Putting P and Q into an AND gate, then the output through a NOT gate is the same as putting P and Q through separate NOT gates, then putting the outputs through an OR gate. The first is more efficient due to only having 2 gates.
- Putting P and Q through an OR gate, then the output through a NOT gate is the same as putting P and Q through separate NOT gates, then putting the outputs through an AND gate. The first is again more efficient due to only having 2 gates.
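Both gate-diagram equivalences can be confirmed with a short sketch (the gate functions are my own stand-ins for the hardware):

```python
def NOT(a):
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

for p in (0, 1):
    for q in (0, 1):
        # NOT after AND  ==  OR of separate NOTs
        assert NOT(AND(p, q)) == OR(NOT(p), NOT(q))
        # NOT after OR   ==  AND of separate NOTs
        assert NOT(OR(p, q)) == AND(NOT(p), NOT(q))
```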

In this section, we have built upon our understanding of compound statements with multiple variables and connectives, and upon our more recent coverage of gate diagrams, by looking at the laws that can be used to simplify expressions and gate diagrams.

Now that we know the basic laws of logic, you might wonder what we can do with them. Section C goes over this.

# Section C: Working with the Logic Laws

We have talked about the identities that can be used in the world of propositional logic. Now it’s time to use them and interact with them! Let’s beware and not break the laws!

Remember we have the following laws from section B

- P ↔ (P v P) — — — Idempotence law
- P ↔ (P ∧ P) — — — Idempotence law
- P v (P ∧ Q) ↔ P — — Absorption law
- P ∧ (P v Q) ↔ P — — Absorption law
- (P v Q) ↔ (Q v P) — — — Commutativity law
- (P ∧ Q) ↔ (Q ∧ P) — — — Commutativity law
- [(P v Q) v R] ↔ [P v (Q v R)] — — — Associativity law
- [(P ∧ Q) ∧ R] ↔ [P ∧ (Q ∧ R)] — — — Associativity law
- ~ (P v Q) ↔ (~ P ∧ ~ Q) — — — De Morgan’s law
- ~ (P ∧ Q) ↔ (~ P v ~ Q) — — — De Morgan’s law
- [P ∧ (Q v R)] ↔ [(P ∧ Q) v (P ∧ R)] — — — Distributivity law
- [P v (Q ∧ R)] ↔ [(P v Q) ∧ (P v R)] — — — Distributivity law
- (P v True) ↔ True — — Domination law
- (P ∧ False) ↔ False — — Domination law
- (P v False) ↔ P — — Identity law
- (P ∧ True) ↔ P — — Identity law
- (P v ~ P) ↔ True — — — Complement law
- (P ∧ ~ P) ↔ False — — — Complement law
- P ↔ ~ (~ P) — — — Double Negation law
- (P →Q) ↔ (~P v Q) — — Conditional Identity
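Any of these laws can also be checked by brute force: two expressions are equivalent exactly when they agree on every combination of truth values. A small Python helper (the name `equivalent` is my own) illustrates the idea:

```python
from itertools import product

def equivalent(f, g, n):
    """True when f and g agree on every assignment of n Boolean variables."""
    return all(f(*vals) == g(*vals)
               for vals in product((False, True), repeat=n))

# A few of the laws above, checked exhaustively:
assert equivalent(lambda p: p or p, lambda p: p, 1)                   # Idempotence
assert equivalent(lambda p, q: p or (p and q), lambda p, q: p, 2)     # Absorption
assert equivalent(lambda p, q: not (p or q),
                  lambda p, q: (not p) and (not q), 2)                # De Morgan
assert equivalent(lambda p: p or (not p), lambda p: True, 1)          # Complement
```

This doesn't replace the step-by-step proofs below (which show *why* the laws give an equivalence), but it is a handy sanity check.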

Here are some examples of using the logic laws to prove equivalence or to simplify expressions.

*Example 1*

“Prove that ~(q → p) ∨ (p ∧ q) is equivalent to just q”

~(q→p)∨(p ∧ q)

Step 1: Use the Conditional Identity to replace the implication, so that q→p becomes ~q ∨ p

~(~q ∨ p) ∨ (p ∧ q)

Step 2: Use De Morgan’s Law to replace ~(~q ∨ p) with ~(~q) ∧ ~p

(~(~q) ∧ ~p) ∨ (p ∧ q)

Step 3: Use Double Negation to replace ~(~q) with just q

(q ∧ ~p) ∨ (p ∧ q)

Step 4: Use the Commutative law to replace p ∧ q with q ∧ p

(q ∧ ~p) v (q ∧ p)

Step 5: Notice both contain q and the expression has both AND and OR, so a distributive law would be useful

Use the Distributive law to factor out the q

q ∧ (~p v p)

Step 6: Note we have both p and it’s complement ~p, so use a complement law

Use the Complement law to replace ~p v p with TRUE

q ∧ TRUE

Step 7: We can now finish off with an Identity law

q

We’ve proven that ~(q → p) ∨ (p ∧ q) is equivalent to just q.

*Example 2*

The contrapositive of p → q is ~q → ~p. Prove that these are equivalent.

~q→ ~p

Step 1- Use the Conditional Identity to replace ~q→~p with ~(~q) ∨ ~p

~(~q) ∨ ~p

Step 2- Use the Double Negation law to replace ~(~q) with just q

q v ~p

Step 3- Use the Commutative law to swap the terms

~p v q

Step 4- Use the Conditional Identity in reverse to replace this with p → q

p → q

We have proven that they are equivalent

*Example 3*

Prove that p ∨ q → r is equivalent to (p → r) ∧ (q → r).

p ∨ q → r

Step 1- By operator precedence, the OR is applied before the implication, so illustrate this with brackets

(p ∨ q) → r

Step 2- Use the Conditional Identity to make this ~(p ∨ q) v r

~(p ∨ q) v r

Step 3- Use De Morgan’s law to replace ~(p ∨ q) with ~p ∧ ~q

(~p ∧ ~q) v r

Step 4- Use the Commutative law to reverse the terms either side of the OR operator

r v (~p ∧ ~q)

Step 5- Use the Distributive law to expand this out

(r v ~p) ∧ (r v ~q)

Step 6- Use the Commutative law again to reverse the terms in the brackets

(~p v r) ∧ (~q v r)

Step 7- Use the Conditional Identity in reverse to put back the implications p → r and q → r

(p → r) ∧ (q → r)

We have proven that they are equivalent.

*Example 4*

Prove that ~(p ∨ ~q) ∨ (~p ∧ ~q) is equivalent to ~p

~(p ∨ ~q) ∨ (~p ∧ ~q)

Step 1- Use De Morgan’s law to convert ~(p ∨ ~q) to ~p ∧ ~(~q)

(~p ∧ ~(~q)) ∨ (~p ∧ ~q)

Step 2- Use Double Negation to replace ~(~q) with just q

(~p ∧ q) v (~p ∧ ~q)

Step 3- ~p is in both brackets and there’s both AND and OR operators, so we’ll use the Distributive law to factor out ~p

~p ∧ (q v ~q)

Step 4- Use the Complement law to replace q v ~q with TRUE

~p ∧ TRUE

Step 5- Use the Identity law to replace this with just ~p

~p

We have proven that they are equivalent

*Example 5*

Prove that (p ∧ q) → r is equivalent to p → (q → r)

(p ∧ q) → r

Step 1- Use the Conditional Identity to replace this with ~(p ∧ q) v r

~(p ∧ q) v r

Step 2- Use De Morgan’s law to replace ~(p ∧ q) with ~p v ~q

(~p v ~q) v r

Step 3- Use the Associative law to move the brackets, putting them round ~q v r

~p v (~q v r)

Step 4- Use the Conditional Identity to put the implication back, changing ~q v r to q → r

~p v (q → r)

Step 5- Use the Conditional Identity again to put in the other implication, changing ~p v (q → r) to p → (q → r)

p → (q → r)

We have proven they are equivalent
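As a sanity check on Examples 1 to 5, each claimed equivalence can also be verified by brute force over all truth assignments, modelling the implication via the conditional identity (the helper name `implies` is my own):

```python
from itertools import product

def implies(a, b):
    # a -> b, via the conditional identity: (not a) or b
    return (not a) or b

for p, q, r in product((False, True), repeat=3):
    assert ((not implies(q, p)) or (p and q)) == q                      # Example 1
    assert implies(p, q) == implies(not q, not p)                       # Example 2
    assert implies(p or q, r) == (implies(p, r) and implies(q, r))      # Example 3
    assert ((not (p or (not q))) or ((not p) and (not q))) == (not p)   # Example 4
    assert implies(p and q, r) == implies(p, implies(q, r))             # Example 5
```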

We’ve now seen the laws in action on some logical expressions.

We’ve seen how the laws work intuitively through the equivalence of truth tables, and by looking at the associated logic gate diagrams we’ve seen how powerful the laws can be.

That’s the end of this article series.