Propositional logic, also known as sentential logic ("sentential" means "relating to sentences") and statement logic, is the branch of deductive logic that involves drawing conclusions from premises that are in the form of propositions. As discussed on the logic page, propositions are statements that are either true or false, but not both.
In contrast to branches of logic such as term logic (also called syllogistic logic) or predicate logic, propositional logic looks at propositions as a whole and does not study logical properties and relationships that depend on parts of a statement, such as its subject or predicate. For example, taking the proposition "Socrates is a man," propositional logic treats the entire sentence as an indivisible unit; we cannot use propositional logic to draw conclusions about, say, the subject of the sentence ("Socrates") or its predicate ("a man"). Rather, propositional logic is used to combine individual propositions into compound propositions using various operators, and to examine the truth values of these compound propositions. Individual propositions are combined with operators called sentence connectives. The four most common operators are:
Name | English Word |
---|---|
Negation | "not" or "it is not the case that" |
Conjunction | "and" |
Disjunction | "or" |
Material Conditional | "if ... then" |
Note that the negation operator operates on a single proposition. For example, taking the negation of "The sky is blue" would result in "It is not the case that the sky is blue". The other three operate on a pair of propositions. Examples of such compound propositions are: "Roses are red and violets are blue," "You are in Sweden or you are in New York," and "If you ate the last piece of cake, then there is no cake left."
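As a concrete illustration, here is a minimal sketch in Python (the `implies` helper is defined here for illustration, not a built-in). Python's `not`, `and`, and `or` correspond to negation, conjunction, and disjunction; the material conditional can be defined from them, since "if p then q" is false only when p is true and q is false:

```python
def implies(p, q):
    """Material conditional: false only when p is true and q is false."""
    return (not p) or q

p, q = True, False
print(not p)          # negation: False
print(p and q)        # conjunction: False
print(p or q)         # disjunction: True
print(implies(p, q))  # material conditional: False
```

Note that on values other than plain Booleans, Python's `and` and `or` short-circuit and return one of their operands, but applied to `True` and `False` they behave exactly like the logical connectives.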
Before discussing the truth values of compound propositions, it can be helpful to introduce a suitable notation. While there are several different symbols that can be used, the following symbols are in common use:
Individual propositions | p, q, r, etc., or any other variable name. |
Negation | ¬ |
Conjunction | ∧ |
Disjunction | ∨ |
Material Conditional | → |
True | T |
False | F |
We can represent the values that these operators return in truth tables. A truth table lists every possible combination of truth values of the individual propositions, along with the result of the operation. The truth tables for the four operators listed above are as follows:
p | ¬p |
---|---|
F | T |
T | F |
p | q | p ∧ q |
---|---|---|
F | F | F |
F | T | F |
T | F | F |
T | T | T |
p | q | p ∨ q |
---|---|---|
F | F | F |
F | T | T |
T | F | T |
T | T | T |
p | q | p → q |
---|---|---|
F | F | T |
F | T | T |
T | F | F |
T | T | T |
It is important to keep in mind that these logical symbols do not function in the exact same way in which the words "not", "and", "or", or "if ... then" are used in English. For example, the English "or" is often exclusive ("soup or salad" usually rules out both), whereas the disjunction p ∨ q is true when both p and q are true. Similarly, p → q is true whenever p is false, even if p and q have no causal connection at all.
Truth tables like the ones above can be used to systematically consider all possible combinations of truth values. They can still be used when evaluating the truth value of a compound proposition built from several individual propositions; however, the number of rows required grows exponentially (a compound of n individual propositions needs 2^n rows), so rules of inference come in handy. Here is a list of rules of inference and mutual inference that can be used. The symbol "::" indicates mutual inferability, or logical equivalence; for example, given "p :: p∧p", we can conclude p∧p from p, or p from p∧p. The symbol ∴ means "therefore": given whatever is on its left-hand side, we can conclude whatever is on its right-hand side.
Symbolic Representation | Name |
---|---|
p→q, p ∴ q | Modus Ponens |
p→q, ¬q ∴ ¬p | Modus Tollens |
p∨q, ¬p ∴ q | Disjunctive Syllogism |
p→q, q→r ∴ p→r | Hypothetical Syllogism |
p→q ∴ (q→r)→(p→r) | Hypothetical Syllogism |
p→(q→r) ∴ (p∧q)→r | Importation |
p→q ∴ (p∧r)→(q∧r) | Addition of a Factor |
p, q ∴ p∧q | Conjunction |
p∧q ∴ p | Simplification |
¬p ∴ ¬(p∧q) | Negative Conjunction |
¬(p∧q) :: ¬p∨¬q | De Morgan's Theorem |
¬(p∨q) :: ¬p∧¬q | De Morgan's Theorem |
p :: p∧p | Tautology |
p→q :: ¬p∨q | Conditional Disjunction |
p→q :: ¬q→¬p | Transposition |
p :: ¬¬p | Double Negation |
p∧q :: q∧p | Commutation |
p∨q :: q∨p | Commutation |
p∧(q∧r) :: (p∧q)∧r | Association |
p∨(q∨r) :: (p∨q)∨r | Association |
p∨(q∧r) :: (p∨q)∧(p∨r) | Distribution |
p∧(q∨r) :: (p∧q)∨(p∧r) | Distribution |
As an aside, note that the last six are similar in form to the laws of arithmetic. If you have any doubts about these rules, you may want to try to create the corresponding truth tables and verify that they are correct.
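That verification can also be done mechanically: two formulas are logically equivalent exactly when they agree on every truth assignment. Here is a sketch in Python (the `equivalent` helper is an illustrative name introduced here) that brute-forces this check:

```python
from itertools import product

def equivalent(lhs, rhs, n):
    """Return True if two n-place formulas agree on every truth assignment."""
    return all(lhs(*values) == rhs(*values)
               for values in product([False, True], repeat=n))

# De Morgan's theorem: ¬(p∧q) :: ¬p∨¬q
print(equivalent(lambda p, q: not (p and q),
                 lambda p, q: (not p) or (not q), 2))  # True

# Distribution: p∨(q∧r) :: (p∨q)∧(p∨r)
print(equivalent(lambda p, q, r: p or (q and r),
                 lambda p, q, r: (p or q) and (p or r), 3))  # True
```

A one-way rule such as Modus Ponens would instead require checking only that the right-hand side is true whenever the left-hand side is, not full equivalence.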
Using all of these rules, we can apply them in proofs. Take the following example:
Given | If you did not mow the lawn, you did not receive $10 from me. |
If you did not receive $10 from me, you did not go to the movies. | |
You went to the movies. | |
Prove | You mowed the lawn. |
We could let p represent "You mowed the lawn," q represent "You received $10 from me," and r represent "You went to the movies." Symbolically, we would have:
Given | ¬p → ¬q |
¬q → ¬r | |
r | |
Prove | p |
We could represent a proof as follows:
Number | Formula | Reason |
---|---|---|
1 | ¬p → ¬q | Premise
2 | ¬q → ¬r | Premise
3 | r | Premise
4 | ¬p → ¬r | 1, 2, Hypothetical Syllogism
5 | ¬¬r | 3, Double Negation
6 | ¬¬p | 4, 5, Modus Tollens
7 | p | 6, Double Negation
Q.E.D.
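As a complementary semantic check, an argument is valid exactly when no truth assignment makes every premise true while the conclusion is false. The following Python sketch (with illustrative helper names `implies` and `valid`) confirms this for the argument above by enumerating all eight assignments:

```python
from itertools import product

def implies(a, b):
    """Material conditional: false only when a is true and b is false."""
    return (not a) or b

def valid(premises, conclusion, n):
    """An argument is valid iff no truth assignment over its n variables
    makes every premise true while the conclusion is false."""
    return all(implies(all(premise(*v) for premise in premises), conclusion(*v))
               for v in product([False, True], repeat=n))

# p = "You mowed the lawn", q = "You received $10", r = "You went to the movies"
premises = [lambda p, q, r: implies(not p, not q),  # ¬p → ¬q
            lambda p, q, r: implies(not q, not r),  # ¬q → ¬r
            lambda p, q, r: r]                      # r
print(valid(premises, lambda p, q, r: p, 3))  # True: the argument is valid
```

Unlike the proof, this check does not show *how* the conclusion follows, only *that* it follows; for arguments with many propositions the 2^n enumeration is exactly the blow-up that rules of inference avoid.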
See also Formal Fallacies and Converse, Inverse, and Contrapositive.