PHIL 12A Syntax and Semantics: Problem Sets 8 and 9

Description

Please complete all questions in Problem Set 9. You may also need Problem Set 8, the lecture slides, and the textbook Logic in Action; all of them are provided as attachments.

Unformatted Attachment Preview

PHIL 12A – Spring 2019
Problem Set 9
100 points

1 Syntax and Semantics of Monadic Predicate Logic

1.1 Identity and Substitution

1.1.1 The Identity Predicate

1. (4 points) Exercise 4.36 of Logic in Action. Write out your formula with no abbreviations.

2. (4 points) Exercise 4.38(2) of Logic in Action. Demonstrate the difference in meaning by providing a model in which the two formulas have different truth values.

1.1.2 Substitution

3. (10 points total; 2.5 points each part) Give a recursive definition of the set of terms t that are substitutable for a variable x in a formula ϕ. That is, fill in the question marks in the following template (note that we do not assume that the metavariables 'x' and 'y' refer to distinct variables, so you need to analyze the case where x = y and the case where x ≠ y in part (d)):
(a) t is always substitutable for x in an atomic formula P(u) or s ≐ t (because there are no quantifiers in an atomic formula to capture a variable in t);
(b) t is substitutable for x in ¬ϕ iff ?
(c) for # ∈ {∧, ∨, →, ↔}, t is substitutable for x in (ϕ # ψ) iff ?
(d) for Q ∈ {∀, ∃}, t is substitutable for x in Qyϕ iff ?

2 Syntax and Semantics of Predicate Logic

4. These problems concern translating English sentences into the language of predicate logic:
(a) (3 points) Assume the domain of discourse is all dogs. Translate the following sentences into predicate logic (use f for Fido, r for Rover, and L for loves):
• Fido is loved by everyone.
• Rover loves everyone who loves Fido.
• Fido and Rover love each other.
(b) (5 points) For each of the following, specify an appropriate domain of discourse, specify a translation key, and translate into predicate logic. (Note: you do not have to understand what a sentence means before you can attempt to translate it.)
• Cats that love dogs don't taunt dogs.
• There is a greatest natural number.
• Friends of Fido's friends are Rover's friends.
• There is no largest even number.
• Every apple is tastier than every orange.
(c) (4 points) Translate the following sentences into predicate logical formulas. Assume the domain of discourse is cats and dogs.
• Some cat doesn't love all dogs.
• Every dog who loves all cats doesn't love every dog.
• Every cat who loves a dog is purred at by some cat.
• No cat loves a cat who loves a dog, except for the cats who don't love dogs.

5. These problems concern describing situations using the language of predicate logic:
(a) (4 points) Exercise 4.18 of Logic in Action.
(b) (5 points) Exercise 4.22 of Logic in Action.

6. These problems concern describing binary relations:
(a) (3 points) The formula ∀x∀y∀z((R(x, y) ∧ R(y, z)) → R(x, z)) expresses the transitivity of the relation R. Which of the following relations are transitive?
• Being a cousin of ... on the set of human beings.
• Being a sibling of ... on the set of human beings.
• The divides relation on the set of natural numbers. (Recall that a natural number n divides a natural number m if there is some natural number k such that n × k = m.)
(b) (5 points) Exercise 4.24 of Logic in Action.
(c) (5 points) Exercise 4.25 of Logic in Action.
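To get a concrete feel for what the transitivity formula in Problem 6(a) says semantically, here is a minimal Python sketch (not part of the assignment; the relations below are invented examples) that checks the condition ∀x∀y∀z((R(x, y) ∧ R(y, z)) → R(x, z)) by brute force over a small finite domain:

```python
from itertools import product

def is_transitive(domain, R):
    """Brute-force check of ∀x∀y∀z((R(x,y) ∧ R(y,z)) → R(x,z)) on a finite domain."""
    return all(
        ((x, y) not in R) or ((y, z) not in R) or ((x, z) in R)
        for x, y, z in product(domain, repeat=3)
    )

domain = {1, 2, 3}
print(is_transitive(domain, {(1, 2), (2, 3), (1, 3)}))  # True
print(is_transitive(domain, {(1, 2), (2, 3)}))          # False: (1, 3) is missing
```

Encoding any small relation as a set of pairs and running the same check is a quick way to test a conjecture before committing to a proof or a counterexample.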
7. (12 points) The following natural language sentences are ambiguous: they can be interpreted in at least two different ways. For each one of them, provide two predicate logic formulas that correspond to two different readings (i.e., four formulas in total).
(a) There is a dog who likes only cats who like only dogs.
(b) Fido and Rover are liked by some cat.
Decide for each of the four predicate logic formulas that you have given whether they are true or false in the model in Figure 1. Explain your answers.

[Figure 1: Model for Problem 7 — □: dog, •: cat, arrows: likes; nodes f (Fido), r (Rover), t (Tibbles).]
[Figure 2: Model for Problem 8.]

8. (10 points) Consider the model in Figure 2. Let the set of solid dots be the interpretation of the unary predicate symbol P. Let the edge relation of the graph be the interpretation of the binary predicate symbol R. What does ∀x(P(x) → ∃y(¬P(y) ∧ ∀z(R(y, z) → P(z)))) mean? Is this true in the model?

9. (8 points) Consider the four models in Figure 3. Let R be the binary predicate interpreted by the arrow in the diagram. Give for each model a predicate logic formula that is true in that model but not in the other three.

[Figure 3: Models for Problem 9.]

10. These problems concern validity and consequence in predicate logic:
(a) (8 points) Which of the following statements are true?
(i) ⊨ ∃xF(x) ∨ ∀x¬F(x)
(ii) ⊨ ∀xF(x) ∨ ∀x¬F(x)
(iii) ⊨ ∀xF(x) ∨ ∃x¬¬F(x)
(iv) ⊨ ∀x∀y∀z((R(x, y) ∧ R(y, z)) → R(x, z))
(v) ⊨ ∀x∀yR(x, y) → ∀xR(x, x)
(vi) ⊨ ∀x∀yR(x, y) → ∀zR(z, z)
(vii) ⊨ ∀xR(x, x) → ∀x∃yR(y, x)
(viii) ⊨ ∃xR(x, x) → ∀x∃yR(y, y)
(b) (6 points) Which of the following statements are true? If the statement is false, provide a counterexample.
(i) ∀x∀yR(y, x) ⊨ ∀y∀xR(y, x)
(ii) ∀xR(x, x) ⊨ ∀x∃yR(x, y)
(iii) ∃x∃yR(x, y) ⊨ ∃xR(x, x)
(iv) ∀x∃yR(x, y) ⊨ ∀xR(x, x)
(v) ∃x∀yR(y, x) ⊨ ∃x∀yR(x, y)
(vi) ∃xR(x, x) ⊨ ∀x∃yR(y, y)
(c) (4 points) Which of the following statements are true? If the statement is false, provide a counterexample.
(i) {∀x∀y(R(x, y) → R(y, x)), R(a, b)} ⊨ R(a, a)
(ii) {∀x∀y(R(x, y) → R(y, x)), R(a, b)} ⊨ R(b, b)
(iii) {∀x∀y(R(x, y) → R(y, x)), ¬R(b, a)} ⊨ ¬R(a, b)
(iv) {∀x∀y(R(x, y) → R(y, x)), R(b, c), ¬R(a, c)} ⊨ ¬R(a, b)

11. (10 points, Extra Credit) In an extra credit problem for Problem Set 8, we showed that if a formula of pure monadic predicate logic is satisfiable, then it is satisfiable in a model of size at most 2^k, where k is the number of predicates occurring in the formula. (We stated this in terms of 'falsifiable', but ϕ is satisfiable iff ¬ϕ is falsifiable, so we can state it either way.) Let us now consider a sentence with a binary predicate symbol:
∀x∃yR(x, y) ∧ ∀x¬R(x, x) ∧ ∀x∀y∀z((R(x, y) ∧ R(y, z)) → R(x, z)).
Is this sentence satisfiable? If so, are there any finite models that make the sentence true? If so, give an example. If not, explain why not.

12. (8 points, Extra Credit) A formula of predicate logic is in prenex normal form iff it is of the form Q₁x₁ . . . Qₙxₙ ψ, where each Qᵢ ∈ {∀, ∃} and ψ does not contain quantifiers. The following is an important fact about predicate logic.
Proposition 1. Every formula ϕ of predicate logic is equivalent to a formula in prenex normal form.
To get a feel for this, find prenex equivalents of the following formulas:
(a) ∀x(P(x) → ∀yR(x, y));
(b) ∃x(∀yR(y, x) → P(x));
(c) ∃x(P(x) → ∃yR(x, y));
(d) ∀x(∃yR(y, x) → P(x)).
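For orientation before attempting (a)–(d) of Problem 12, here is one worked conversion on a different, hypothetical warm-up formula (assuming the usual classical semantics with non-empty domains):

```latex
% Warm-up example, not one of the assigned formulas:
% pull the quantifiers out of an implication step by step.
\forall x\,P(x) \to \exists y\,Q(y)
  \;\equiv\; \exists x\,\bigl(P(x) \to \exists y\,Q(y)\bigr)
  \;\equiv\; \exists x\,\exists y\,\bigl(P(x) \to Q(y)\bigr)
```

The first step uses the equivalence (∀xϕ → ψ) ≡ ∃x(ϕ → ψ) and the second uses (ϕ → ∃yψ) ≡ ∃y(ϕ → ψ); both require that the variable being pulled out does not occur free in the other side of the implication.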
PHIL 12A – Spring 2019
Problem Set 8
60 points

1 Syntax and Semantics of Monadic Predicate Logic

1.1 Pure Monadic Predicate Logic

1.1.1 Pure Monadic Predicate Logic I

1. (2 points) Exercise 4.6 in Logic in Action.

2. (2 points) Exercise 4.7 in Logic in Action.

3. (10 points total; 2.5 points each part) Give a recursive definition of the set of free variables of a formula ϕ, i.e., those variables with at least one free occurrence in ϕ, as described in the slides. To do so, fill in the question marks in the following template (note that we do not assume that the metavariables 'x' and 'y' refer to distinct variables, so you need to analyze the case where x = y and the case where x ≠ y):
(a) x is a free variable of P(y) iff ?
(b) x is a free variable of ¬ϕ iff ?
(c) for # ∈ {∧, ∨, →, ↔}, x is a free variable of (ϕ # ψ) iff ?
(d) for Q ∈ {∀, ∃}, x is a free variable of Qyϕ iff ?

4. (10 points) Given a model M = (D, I), a subset A ⊆ D is said to be definable in the language of pure monadic predicate logic iff there is a pure monadic formula ϕ with exactly one free variable x such that for some variable assignment g, A = {d ∈ D | M ⊨_{g[x:=d]} ϕ}, i.e., A is exactly the set of objects d such that ϕ is true under the variable assignment that maps x to d. For example, in the model in Figure 1, the set {2, 4} is defined by the formula ¬Sophomore(x). For every subset of the domain of that model, say whether it is definable by a formula or not. If it is definable, give a formula that defines it.

1.1.2 Pure Monadic Predicate Logic II

5. (3 points) Exercise 4.14 of Logic in Action.

6. (3 points) Exercise 4.15 of Logic in Action.

[Figure 1: Model for Problem 4 — language: Sophomore, Student, Faculty, Invitee; domain {1, 2, 3, 4}.]

7. (12 points) Determine whether each of the following formulas is true in the model in Figure 2, where R stands for red, G stands for green, B stands for blue, P stands for purple, S stands for square, and C stands for circle:
(a) (practice) ∃x(R(x) ∧ C(x));
(b) (practice) ∀x(C(x) ∨ S(x));
(c) ∃xG(x) ∨ ∃xC(x);
(d) ∃xR(x) ∧ ∃xC(x);
(e) ∀xC(x) ∨ ∀xS(x);
(f) ∃x(G(x) ∨ C(x));
(g) ∀x(¬P(x) → S(x));
(h) ∀x(S(x) → (P(x) ∨ R(x))).

[Figure 2: Model for Problem 7.]

8. (practice) Exercise 4.17 of Logic in Action.

9. (8 points) For each of the following sentences, say whether or not it is valid. If it is not valid, present a model in which it is false.
(a) (practice) ∀x(P(x) ∧ Q(x)) → (∀xP(x) ∧ ∀xQ(x));
(b) ∀x(P(x) ∨ Q(x)) → (∀xP(x) ∨ ∀xQ(x));
(c) ∀xP(x) → ∃xP(x);
(d) ∃x∃y(P(x) ∧ Q(y)) → ∃z(P(z) ∧ Q(z));
(e) ∃x(P(x) → ∀xP(x)).

10. (5 points) An important fact about pure monadic predicate logic is the following.
Proposition 1. Each sentence ϕ of pure monadic predicate logic is equivalent to a sentence ϕ* containing the same predicate symbols as ϕ and only one variable.
To get a feel for this, explain the following equivalences by appealing to the semantics of monadic predicate logic:
(a) ∀x∃y(P(x) ∧ Q(y)) is equivalent to ∀xP(x) ∧ ∃yQ(y);
(b) ∀xP(x) ∧ ∃yQ(y) is equivalent to ∀zP(z) ∧ ∃zQ(z).

11. (5 points, Extra Credit) In the slides, we mentioned the important lemma that if a formula ϕ of pure monadic predicate logic is not valid, then it is falsified in a model on the domain D = {1, . . . , 2^k}, where k is the number of predicate symbols appearing in ϕ. In this extra credit problem, we will show that if ϕ is not valid, then it is falsified in a model where D has at most 2^k elements (from which the lemma easily follows).
Suppose M = (D, I) is a model and g an assignment such that M ⊭_g ϕ. We will shrink M to a model with no more than 2^k elements that also falsifies ϕ. Let Pred(ϕ) be the set of all unary predicates that occur in ϕ. For example, if ϕ is ∀x(P1(x) → P3(x)), then Pred(ϕ) = {P1, P3}. We assume that Pred(ϕ) has k elements. It follows that Pred(ϕ) has 2^k subsets. For each d ∈ D, let
d̂ = {d′ ∈ D | for all P ∈ Pred(ϕ): d ∈ I(P) iff d′ ∈ I(P)}.
That is, d̂ is the set of all objects that are "indistinguishable" from d in M using predicates that appear in ϕ. The idea behind the proof is that such indistinguishable objects should be collapsed into a single object d̂. (For example, in the model in Figure 4, the objects 1 and 3 cannot be distinguished from each other by any of the four predicates, so they can be collapsed into a single object 1̂ = 3̂.) There can be at most 2^k distinct sets d̂, because each distinct d̂ defines a distinct subset of Pred(ϕ), namely the subset {P ∈ Pred(ϕ) | d ∈ I(P)}, and there are only 2^k subsets of Pred(ϕ). This leads us to the definition of our collapsed model M̂ = (D̂, Î):
• D̂ = {d̂ | d ∈ D};
• for each predicate P ∈ Pred(ϕ), d̂ ∈ Î(P) iff d ∈ I(P);
• for each predicate P ∉ Pred(ϕ), Î(P) = ∅.
Given any variable assignment g for M, define the variable assignment ĝ for M̂ by ĝ(xᵢ) = \widehat{g(xᵢ)}. It follows that for any variable x and d ∈ D, \widehat{g[x := d]} = ĝ[x := d̂]. It also follows that for any subformula ψ of ϕ and variable assignment g,
M̂ ⊨_ĝ ψ iff M ⊨_g ψ.
Prove this 'iff' by induction on ψ. Give only the base case where ψ is an atomic formula of the form P(x) and the inductive step where ψ is of the form ∃xα. Use the equations given above (plus the inductive hypothesis for the ∃ case). From what you proved and our initial assumption that M ⊭_g ϕ, it follows that M̂ ⊭_ĝ ϕ, which shows that ϕ is indeed falsified in a model with at most 2^k elements!
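As an informal aid for picturing the collapse in Problem 11 (this code is not required, and the model below is invented for illustration), the map d ↦ d̂ amounts to grouping elements by which predicates in Pred(ϕ) they satisfy, so there can be at most 2^k groups:

```python
def collapse(domain, interpretation, preds):
    """Group the elements of `domain` by which predicates in `preds` they satisfy.

    `interpretation` maps each predicate symbol to the set of elements in its
    extension. Elements with the same 'signature' are indistinguishable by pure
    monadic formulas built only from `preds`, so they play the role of d-hat.
    """
    classes = {}
    for d in domain:
        signature = frozenset(P for P in preds if d in interpretation[P])
        classes.setdefault(signature, set()).add(d)
    return classes  # at most 2 ** len(preds) distinct classes

# Invented example with Pred(phi) = {P1, P3} and domain {1, ..., 5}:
I = {"P1": {1, 2, 3}, "P3": {3, 4}}
for sig, members in collapse({1, 2, 3, 4, 5}, I, ["P1", "P3"]).items():
    print(sorted(sig), members)
# ['P1'] {1, 2}   ['P1', 'P3'] {3}   ['P3'] {4}   [] {5}   -> 4 classes <= 2**2
```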
1.2 Constants and Functions

1.2.1 Constants

12. (5 points) Let ϕ be a formula in which the constant c appears but the variable x does not appear. Prove that if ϕ is valid, then ∀xϕ^c_x is valid, where ϕ^c_x is the formula obtained from ϕ by replacing all occurrences of c with x. Intuitively: "if ϕ holds of an arbitrary element, then it holds of everything." In your argument you may assume (without proof) the fact that for any model M = (D, I) and variable assignment g for M:
M ⊨_g ϕ iff M ⊨_{g[x := ⟦c⟧^g_I]} ϕ^c_x.
You may also assume that if two models M and N differ only on the interpretation of a constant that does not appear in a formula ψ, then M ⊨_g ψ iff N ⊨_g ψ.

1.2.2 Function Symbols

13. (5 points, Extra Credit) In the slides, we mentioned the fact that there is an algorithm that converts any formula ϕ of monadic predicate logic with unary function symbols into a formula ϕ′ of monadic predicate logic without function symbols such that ϕ is valid iff ϕ′ is valid. To give you a feel for how this could be true, let ψ be a formula of monadic predicate logic, P a unary predicate, and f a unary function symbol not occurring in ψ. Prove that the formula
∀z(ψ(z) → P(f(z)))
is valid if and only if the formula
∀x∃y(P(y) ↔ Q(x)) → ∀z(ψ(z) → Q(z))
is valid, where Q is a new predicate symbol that does not appear in ψ. Suggestion: suppose ∀z(ψ(z) → P(f(z))) is not valid, so it is falsified in some model M. Then define a model M′ that differs from M at most in the interpretation of the predicate Q (i.e., just define I^{M′}(Q) for us), such that M′ falsifies ∀x∃y(P(y) ↔ Q(x)) → ∀z(ψ(z) → Q(z)) (prove it). In the other direction, suppose ∀x∃y(P(y) ↔ Q(x)) → ∀z(ψ(z) → Q(z)) is not valid, so it is falsified in some model N. Then define a model N′ that differs from N at most in the interpretation of the function symbol f (i.e., just define I^{N′}(f) for us) such that N′ falsifies ∀z(ψ(z) → P(f(z))) (prove it).
Logic in Action
New Edition, November 23, 2016
Johan van Benthem, Hans van Ditmarsch, Jan van Eijck, Jan Jaspars

Contents

1 General Introduction
  1.1 Inference, Observation, Communication
  1.2 The Origins of Logic
  1.3 Uses of Inference
  1.4 Logic and Other Disciplines
  1.5 Overview of the Course

Classical Systems

2 Propositional Logic
  2.1 Reasoning in Daily Life
  2.2 Inference Patterns, Validity, and Invalidity
  2.3 Classification, Consequence, and Update
  2.4 The Language of Propositional Logic
  2.5 Semantic Situations, Truth Tables, Binary Arithmetic
  2.6 Valid Consequence and Consistency
  2.7 Proof
  2.8 Information Update
  2.9 Expressiveness
  2.10 Outlook — Logic, Mathematics, Computation
  2.11 Outlook — Logic and Practice
  2.12 Outlook — Logic and Cognition

3 Syllogistic Reasoning
  3.1 Reasoning About Predicates and Classes
  3.2 The Language of Syllogistics
  3.3 Sets and Operations on Sets
  3.4 Syllogistic Situations
  3.5 Validity Checking for Syllogistic Forms
  3.6 Outlook — Satisfiability and Complexity
  3.7 Outlook — The Syllogistic and Actual Reasoning

4 The World According to Predicate Logic
  4.1 Learning the Language by Doing
  4.2 Practising Translations
  4.3 Reasoning Patterns with Quantifiers
  4.4 Formulas, Situations and Pictures
  4.5 Syntax of Predicate Logic
  4.6 Semantics of Predicate Logic
  4.7 Valid Laws and Valid Consequence
  4.8 Proof
  4.9 Identity, Function Symbols, Algebraic Reasoning
  4.10 Outlook — Mathematical Background
  4.11 Outlook — Computational Connection
  4.12 Outlook — Predicate Logic and Philosophy

Knowledge, Action, Interaction

5 Logic, Information and Knowledge
  5.1 Logic and Information Flow
  5.2 Information versus Uncertainty
  5.3 Modeling Information Change
  5.4 The Language of Epistemic Logic
  5.5 Models and Semantics for Epistemic Logic
  5.6 Valid Consequence
  5.7 Proof
  5.8 Information Update
  5.9 The Logic of Public Announcement
  5.10 Outlook — Information, Knowledge, and Belief
  5.11 Outlook — Social Knowledge
  5.12 Outlook — Secrecy and Security

6 Logic and Action
  6.1 Actions in General
  6.2 Sequence, Choice, Repetition, Test
  6.3 Viewing Actions as Relations
  6.4 Operations on Relations
  6.5 Combining Propositional Logic and Actions: PDL
  6.6 Transition Systems
  6.7 Semantics of PDL
  6.8 Axiomatisation
  6.9 Expressive power: defining programming constructs
  6.10 Outlook — Programs and Computation
  6.11 Outlook — Equivalence of Programs and Bisimulation

7 Logic, Games and Interaction
  7.1 Logic meets Games
  7.2 Evaluation of Assertions as a Logical Game
  7.3 Zermelo's Theorem and Winning Strategies
  7.4 Sabotage Games: From Simple Actions to Games
  7.5 Model Comparison as a Logic Game
  7.6 Different Formulas in Model Comparison Games
  7.7 Bisimulation Games
  7.8 Preference, Equilibrium, and Backward Induction
  7.9 Game logics
  7.10 Games with imperfect information
  7.11 Logic and Game Theory
  7.12 Outlook — Iterated Game Playing
  7.13 Outlook — Knowledge Games
  7.14 Outlook — Games and Foundations
  7.15 Outlook — Games, Logic and Cognition

Methods

8 Validity Testing
  8.1 Tableaus for propositional logic
    8.1.1 Reduction rules
  8.2 Tableaus for predicate logic
    8.2.1 Rules for quantifiers
    8.2.2 Alternative rules for finding finite counter-models
    8.2.3 Invalid inferences without finite counter-examples
    8.2.4 Tableaus versus natural reasoning
  8.3 Tableaus for epistemic logic
9 Proofs
  9.1 Natural deduction for propositional logic
    9.1.1 Proof by refutation
    9.1.2 Introduction and elimination rules
    9.1.3 Rules for conjunction and disjunction
  9.2 Natural deduction for predicate logic
    9.2.1 Rules for identity
  9.3 Natural deduction for natural numbers
    9.3.1 The rule of induction
  9.4 Outlook
    9.4.1 Completeness and incompleteness
    9.4.2 Natural deduction, tableaus and sequents
    9.4.3 Intuitionistic logic
    9.4.4 Automated deduction

10 Computation
  10.1 A Bit of History
  10.2 Processing Propositional Formulas
  10.3 Resolution
  10.4 Automating Predicate Logic
  10.5 Conjunctive Normal Form for Predicate Logic
  10.6 Substitutions
  10.7 Unification
  10.8 Resolution with Unification
  10.9 Prolog

Appendices

A Sets, Relations and Functions
  A.1 Sets and Set Notation
  A.2 Relations
  A.3 Back and Forth Between Sets and Pictures
  A.4 Relational Properties
  A.5 Functions
  A.6 Recursion and Induction

B Solutions to the Exercises

Chapter 1
General Introduction

1.1 Inference, Observation, Communication

Much of our interaction with each other in daily life has to do with information processing and reasoning about knowledge and ignorance of the people around us. If I ask a simple question, like "Can you tell me where to find the Opera House?", then I convey the information that I do not know the answer, and also, that I think that you may know. Indeed, in order to pick out the right person for asking such informative questions, we need to reason about knowledge of others. It is our ability to reason in the presence of other reasoning agents that has made us historically so successful in debate, organization, and in planning collective activities. And it is reasoning in this broad sense that this course is about.
We will study informational processes of inference and information update – and while we can start dealing with these for single agents, our theories must also work interactively when many agents exchange information, say, in a conversation or a debate. As we proceed, you will see many further aspects of this program, and you will learn about mathematical models for it, some quite recent, some already very old. Reasoning and Proof While reasoning in daily life and solving practical tasks is important, many logical phenomena become more pronounced when we look at specialized areas, where our skills have been honed to a greater degree. To see the power of pure inference unleashed, think of mathematical proofs. Already in Greek Antiquity (and in parallel, in other cultures), logical inference provided a searchlight toward surprising new mathematical facts. In our later √ chapter on Proof, we will give examples, including the famous Pythagorean proof that 2 is not a rational number.The Holy Writ of this tradition are Euclid’s Elements from around 300 BC with its formal set-up of axioms, definitions, and theorems for geometry. 1-1 1-2 CHAPTER 1. GENERAL INTRODUCTION (1.1) Indeed, mathematical methods have deeply influenced the development of logic. They did so in two ways. First, mathematical proof is about the purest form of inference that exists, so it is an excellent ‘laboratory’ for studying inference. But also, mathematics is about the clearest way that we have for modeling phenomena and studying their properties, and logical systems of any kind, even when dealing with daily life, use mathematical techniques. Reasoning and Observation Combinations of inference with other information sources drive the natural sciences, where experiments provide information that is just as crucial as mathematical proof. Observations about Nature made by scientists involves the same sort of information update as in simple question answering. Seeing new facts removes uncertainty. And the art is to ask the right questions, to find the right mixtures of new evidence and deduction from what we have seen already. The same skill actually occurs in other specialized practices. Conan Doyle’s famous detective Sherlock Holmes is constantly thinking about what follows from what he has seen already, but he also uses his powers of deduction to pinpoint occasions where he needs new evidence. In a famous story, the dog did not bark at night-time (and so, the intruder must have been known to the dog), but this conclusion also directs attention toward making further observations, needed to see which of the various familiar persons committed the crime. 1.2. THE ORIGINS OF LOGIC 1-3 (1.2) Reasoning and Argumentation From crime it is only one step to lawyers and courts. Legal reasoning is another major tradition where logic is much in evidence, and we will return to this later. 1.2 The Origins of Logic Logic as a systematic discipline dates back two and a half millennia: younger than Mathematics or the Law, but much older than most current academic disciplines, social institutions, or for that matter, religions. Aristotle and the Stoic philosophers formulated explicit systems of reasoning in Greek Antiquity around 300 BC. (1.3) Aristotle appearing on two Greek postal stamps The early Stoic Zeno of Citium Independent traditions arose around that time in China and in India, which produced famous figures like the Buddhist logician Dignaga, or Gangesa, and this long tradition lives on in some philosophical schools today. 
Through translations of Aristotle, logic also reached the Islamic world. The work of the Persian logician Avicenna around 1000 AD was still taught in madrassa’s by 1900. All these traditions have their special concerns and features, and there is a growing interest these days in bringing them closer together. 1-4 CHAPTER 1. GENERAL INTRODUCTION We mention this point because the cross-cultural nature of logic is a social asset beyond its scientific agenda. (1.4) Mo Zi, founder of Mohism Dignaga, Indian Buddhist Logician Avicenna, Persian Logician Still, with all due respect for this historical past that is slowly coming to light, it seems fair to say that logic made a truly major leap in the nineteenth century, and the modern logic that you will see in this course derives its basic mind-set largely from the resulting golden age of Boole, Frege, Gödel, and others: a bunch of European university professors, some quite colourful, some much less so. (1.5) George Boole on the cover of the ‘Laws of Thought’ (1847), the book that created propositional logic, the theme of the next chapter. Gottlob Frege with on the right the first page of his ‘Begriffsschrift’ (1879), with the system of first-order predicate logic that can analyze much of mathematics. Even so, it remains an intriguing and unsolved historical question just how and why logic arose — and we will have more to say on this below. The standard story is that great thinkers like Aristotle suddenly realized that there is structure to the human reasoning that we see all around us. Some patterns are valid and reliable, while others are not. But it has also been suggested that an interest in logic arose out of philosophical, mathematical, juridical, or even political practice. Some ‘moves’ worked, others did not – and people became curious to see the general reasons why. 1.3. USES OF INFERENCE 1.3 1-5 Uses of Inference The TV has gone dark. If it goes dark, this is due to the apparatus or the remote (or both). But the remote is working, so it must be the apparatus, and we must start repairs there. This pattern involves a logical key-word, the disjunction ‘or’: A or R, not R. So: A. (1.6) In pure form, we can also see this pattern at work in solving Sudoku puzzles. Logic also helps create new Sudoku puzzles. Start with any complete nine-digit diagram. Now pick a random slot and remove the digit in that slot. The remaining digits in the diagram still completely determine what should be in the open slot, for the digit in that slot follows by logical inference (or: by valid inference) from the other digits and the general sudoku constraints. In this way, one can go on picking filled positions at random, and checking if the digit in that position still follows from others by a valid inference. Keep doing this until no longer possible. You have now generated a minimal puzzle, and since your steps are hidden, it may take readers quite a while to figure out the unique solution. Cognitive scientists have suggested that the primary use of logic may have been in planning. Clearly, thinking about constraints and consequences of tasks beforehand is an immense evolutionary advantage. Here is a simple illustration. Planning a party How can we send invitations given the following constraints? (i) John comes if Mary or Ann comes. (ii) Ann comes if Mary does not come. (1.7) (iii) If Ann comes, John does not. In the chapter on propositional logic, you will learn simple techniques for solving this: for now, just try! 
(Here is a hint: start out with a ‘maximal’ invitation list John, Ann, Mary, and check what you have to drop to satisfy the constraints. Bear in mind that there may be several solutions to this.) Legal reasoning We also said that daily skills can be optimized for special purposes. As we said already, inference is crucial to legal reasoning, and so is the earlier-mentioned multi-agent feature that different actors are involved: defendant, lawyer, prosecutor, judge. The prosecutor has to prove that the defendant is guilty (G) on the basis of the available admissible evidence (E), i.e., she has to prove the conclusion G from evidence E. But the usual ‘presumption of innocence’ means that the lawyer has another logical task: viz. making it plausible that G does not follow from E. This does not require her to demonstrate that her client is innocent: she just needs to paint one scenario consistent with the evidence E where G fails, whether it is the actual one or not. 1-6 CHAPTER 1. GENERAL INTRODUCTION Logical key-words There are certain logical key-words driving patterns of inference. Expressions like ‘not’, ‘and’, ‘or’, ‘if then’ are sentence forming constructions that classify situations as a whole. What we mean by this is that these expressions can be used to construct new sentences from existing sentences. From “it is raining” to “it is not raining”. From “it is raining” and “it is wet” to “if it is raining then it is wet”, and so on. But there are other expressions that tell us more about the internal structure of these situations, in terms of objects and their properties and relations. “Hans is friendly” ascribes a property to a person. “Hans and Jan are colleagues” describes a relation between two persons. Historically, the most important example are quantifiers, expressions of quantity such as ‘all’, ‘every’, ‘some’ or ‘no’. “All logicians are friendly” describes how the properties of being a logician and being friendly are related, using the quantifier ‘all’. The view of inference as the result of replacing some parts in expressions by variable parts, so that only logical key-words and variables remain, can already be found in the work of the Bohemian philosopher and priest Bernhard Bolzano (1781 – 1848). (1.8) Bernard Bolzano Aristotle’s syllogisms listed the basic inference patterns with quantifiers, such as All humans are animals, no animals are mortal. So, no humans are mortal. (1.9) This is a valid inference. But the following is not valid: Not all humans are animals, no animals are mortal. So, some humans are mortal. (1.10) Syllogistic forms were long considered the essence of logical reasoning, and their format has been very influential until the 19th century. Today, they are still popular test cases for psychological experiments about human reasoning. Quantifiers are essential to understanding both ordinary and scientific discourse. If you unpack standard mathematical assertions, you will find any amount of stacked quantifiers. For instance, think of saying that 7 is a prime number. This involves: All of 7’s divisors are either equal to 1 or to 7, where x divides y if for some z: x · z = y. (1.11) 1.3. USES OF INFERENCE 1-7 Here ‘all of’ and ‘for some’ are the quantifiers that provide the logical glue of the explanation of what it means to be prime, or to be a divisor. Other examples with many quantifiers occur in Euclid’s geometry and spatial reasoning in general. We will devote two entire chapters to the logic of the quantifiers ‘all’, ‘some’, given its central importance. 
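As a sketch of how the quantifier structure in the primality explanation above can be written out in one formula (assuming the domain of discourse is the positive natural numbers, with · for multiplication), one possible rendering is:

```latex
% "7 is prime": every divisor of 7 is equal to 1 or to 7,
% where "x divides 7" is unfolded as "for some z, x * z = 7".
\forall x\,\bigl(\exists z\,(x \cdot z = 7) \;\to\; (x = 1 \lor x = 7)\bigr)
```

Unfolding 'x divides y' as ∃z(x · z = y) is exactly what produces the stacked quantifiers mentioned in the text.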
Actually, natural language has many further quantifier expressions, such as ‘three’, ‘most’, ‘few’, ‘almost all’, or ‘enough’. This broad repertoire raises many issues of its own about the expressive and communicative function of logic, but we sidestep these here. Many further logical key-words will emerge further on in this course, including expressions for reasoning about knowledge and action. Another crucial feature of logic, that makes it a true scientific endeavour in a systematic sense, is the turning of human reasoning to itself as a subject of investigation. But things go even one step further. Logicians study reasoning practices by developing mathematical models for them – but then, they also make these systems themselves into a new object of investigation. Logical systems Indeed, Aristotle already formulated explicit logical systems of inference in his Syllogistics, giving all valid rules for syllogistic quantifier patterns. Interestingly, Aristotle also started the study of grammar, language looking at language — and earlier than him, the famous Sanskrit grammarian Panini had used mathematical systems there, creating a system that is still highly sophisticated by modern standards: (1.12) This mathematical system building tradition has flourished over time, largely (but not exclusively) in the West. In the nineteenth century, George Boole gave a complete analysis of propositional logic for reasoning with sentential operators like ‘not’, ‘and’, ‘or’, that has become famous as the ‘Boolean algebra’ that underlies the switching circuits of your computer. Boole showed that all valid principles of propositional reasoning can be derived from a simple calculus, by purely algebraic manipulations. We will explain how this works later on in this course. Subsequently, Frege gave formal systems for reasoning with quantifiers in ways that go far beyond Aristotle’s Syllogistic. Over time, systems in this line have proved strong enough to formalize most of mathematics, including its foundational set theory. 1-8 CHAPTER 1. GENERAL INTRODUCTION Foundations of mathematics Through this process of scrutiny, mathematical and logical theories themselves become objects of investigation. And then, some startling discoveries were made. For instance, here is the so-called Russell Paradox from the foundations of set theory. Set theory is a general way of talking about collections of entities What the Russell paradox tells us is that we have to be very careful in how to express ourselves in talking about collections of entities. For suppose anything goes in defining sets, so that, if we have a description we can construct the set of all things satisfying the description. Then the following can happen. Some sets contain themselves as a member (e.g., the set of all non-teaspoons is not a teaspoon, so the set of non-teaspoon has itself as a member). Others do not (for instance, the set of all teaspoons is not itself a teaspoon.) Now consider the set R of all sets that do not have themselves as members. It is easy to see that R is a member of R if and only if R is not a member of R: and that is a contradiction. The sort of reasoning that leads to this paradox will be taken up in several later chapters. The formal definition of the Russell set R is: R = {x | x ∈ / x}. The paradoxical statement is: R ∈ R if and only if R ∈ / R. If you have never seen the symbol ∈ or the bracket notation {x | . . .} then you should at some point consult Appendix A to catch up with the rest of us. 
The foundational problems in the development of logic illustrated by Russell’s paradox led to the so-called foundational study of mathematics, which investigates formal properties of mathematical theories, and power and limits of proofs. A famous name here is Kurt Gödel, probably the greatest figure in the history of logic. His incompleteness theorems are fundamental insights into the scope and reliability of mathematics, that got him on the TIME 2001 list of most influential intellectuals of the twentieth century. But in Amsterdam, we also cite our own L.E.J. Brouwer, the father of ‘intuitionistic logic’, an important program in the foundations of mathematics and computation. These mathematical theoretical aspects of logic belong more properly to an advanced course, but we will give you some feeling for this theme further on in this book. 1.4. LOGIC AND OTHER DISCIPLINES 1-9 (1.13) Kurt Gödel 1.4 Brouwer on a Dutch post stamp Logic and Other Disciplines Looking at the list of topics discussed above, you have seen switches from language and conversation to mathematics and computation. Indeed, in a modern university, logic lies at a cross-roads of many academic disciplines. This course will make you acqainted with a number of important systems for doing logic, but it will also draw many connections between logic and related disciplines. We have already given you a taste of what logic has to do with mathematics. Mathematics supplies logic with its techniques, but conversely, logic can also be used to analyze the foundations of mathematics. Now we look at a few more important alliances. Logic, language and philosophy Perhaps the oldest connection of logic is with philosophy. Logic has to do with the nature of assertions, meaning, and knowledge, and philosophers have been interested in these topics from the birth of philosophy. Logic can serve as a tool for analyzing philosophical arguments, but it is also used to create philosophical systems. Logical forms and calculating with these is a role model for conceptual abstraction. It has even been claimed that logical patterns of the sort sketched here are close to being a ‘universal language of thought’. But it will also be clear that logic has much to do with linguistics, since logical patterns arise from abstraction out of the grammar of ordinary language, and indeed, logic and linguistics share a long history from Antiquity through the Middle Ages. Logic and computation Another long-standing historical theme interleaves logic and computation. Since the Middle Ages, people have been fascinated by machines that would 1-10 CHAPTER 1. GENERAL INTRODUCTION automate reasoning, and around 1700, Leibniz (1.14) Gottfried Wilhelm von Leibniz The first binary addition mechanism as described by Leibniz in a paper called ‘Mechanica Dyadica’ (around 1700) realized that logical inference may be viewed as a sort of computation, though not with ordinary but with binary numbers. A straight line runs from here to modern computers and computer science, and the seminal work of Turing and others. (1.15) Alan Turing A ‘Turing Machine’ Logic and games While mathematics, philosophy, linguistics, and computer science are old neighbours of logic, new interfaces keep emerging. We end with one directed toward the social and behavioural sciences. As we have said before, logic had its origins in a tradition of conversation, debate, and perhaps legal procedure. 
This brings us back to our earlier theme that much logical behaviour is interactive, crucially involving other persons. Argumentation itself is a key example. There are different parties playing different roles, and reacting to each other over time. This clearly has the structure of a game. In such a game logical operations like ‘or’, ‘and’ and ‘not’ function as a sort of ‘switches’, not just in a Boolean computer, but also in discussion. When I defend that ‘A or B’, then you can hold me to this, and I have to choose eventually which of the two I will defend. Thus, a disjunction offers a choice to its defender — and likewise, a conjunction ‘A and B’ 1.5. OVERVIEW OF THE COURSE 1-11 offers a choice to the attacker: since the defender is committed to both parts. Interesting interactions also arise by means of the third item of Boolean algebra: logical negation. This triggers a role switch: defending ‘not A’ is attacking ‘A’, and vice versa. Indeed, being able to ‘put yourself in another person’s place’ has been called the quintessential human cognitive achievement. In this way, logic comes to describe the structure of rational interaction between conversation partners. Traditions of vigorous regimented logical debating games flourished in the Middle Ages, and they still do in some parts of the world: (1.16) Karma Guncho, ten monasteries battle each other on Buddhist philosophy using logical analysis. In this game setting, we may call an inference valid if the defender of the conclusion has a ‘winning strategy’: that is, a rule for playing which will always lead her to win the game against any defender of the premises, whatever that person brings up over time. But if logic has much to do with games, then it also has links with economic game theory, and not surprisingly, this is another flourishing interface today. We will develop this topic in greater depth in a separate chapter, but now you know why. 1.5 Overview of the Course In this course, logic will be presented as a key element in the general study of reasoning, information flow and communication: topics with a wide theoretical and practical reach. The course starts with introductions to three important systems of reasoning: propositional logic (Chapter 2), syllogistics (Chapter 3), and predicate logic (Chapter 4). Together, these describe situations consisting of objects with a great variety of structure, and in doing so, they cover many basic patterns that are used from natural language to the depths of mathematics. Next, we move on to the newer challenges on a general agenda of studying information flow. The first is agents having information and interacting through questions, answers, and other forms of communication. This social aspect is crucial if you think about how we use language, or how we behave in scientific investigation. We will model observation and reasoning in a multi-agent setting, introducing the logic of knowledge in Chapter ??. 1-12 CHAPTER 1. GENERAL INTRODUCTION To model the dynamic aspect of all this, we turn to the basic logic of action in Chapter 6. And Chapter 7 takes up a more recent theme: the use of games as a model of interaction. These bring together many of the separate topics in the course so far. The next group of chapters then develop three logical methods more in detail. Chapter 8 is about precise ways of testing logical validity, that give you a sense of how a significant logical calculus really works. Chapter 9 goes into mathematical proof and its structures. 
Chapter 10 gives more details on the many relations between logic and computation. In all of these chapters, and even more in the internet version of this text, you will find links to topics in philosophy, mathematics, linguistics, cognition and computation, and you will discover that logic is a natural ‘match-maker’ between these disciplines. We have tried to give an indication of the difficulty of the exercises, as follows: ♥ indicates that a problem is easy (solving the problems marked as ♥ can be used as a test to check that you have digested the explanations in the text), ♠ indicates that a problem is a bit harder than average, and ♠♠ indicates that a problem is quite hard. If you feel you can handle an extra challenge, you are encouraged to try your hand at these. Classical Systems 1-13 Chapter 2 Propositional Logic Overview The most basic logical inferences are about combinations of sentences, expressed by such frequent expressions as ‘not’, ‘and’, ‘or’, ‘if, then’. Such combinations allow you to describe situations, and what properties these situations have or lack: something is ‘not this, but that’. You could call this reasoning about ‘classification’, and it is the basis of any description of the world. At the same time, these logical sentence combinations are also fundamental in another sense, as they structure how we communicate and engage in argumentation. When you disagree with a claim that someone makes, you often try to derive a consequence (’if then’) whose negation (‘not’) is easier to show. We will study all these patterns of reasoning below. More precisely, in this first chapter you will be introduced to propositional logic, the logical system behind the reasoning with ‘not’, ‘and’, ‘or’, ‘if, then’ and other basic sentence-combining operators. You will get acquainted with the notions of formula, logical connective, truth, valid consequence, information update, formal proof, and expressive power, while we also present some backgrounds in computation and cognition. 2.1 Reasoning in Daily Life Logic can be seen in action all around us: In a restaurant, your Father has ordered Fish, your Mother ordered Vegetarian, and you ordered Meat. Out of the kitchen comes some new person carrying the three plates. What will happen? We have know this from experience. The waiter asks a first question, say “Who ordered the meat?”, and puts that plate. Then he asks a second question “Who has the fish?”, and puts that plate. And then, without asking further, he knows he has to put the remaining plate in front of your Mother. What has happened here? 2-1 2-2 CHAPTER 2. PROPOSITIONAL LOGIC Starting at the end, when the waiter puts the third plate without asking, you see a major logical act ‘in broad daylight’: the waiter draws a conclusion. The information in the two answers received allows the waiter to infer automatically where the third dish must go. We represent this in an inference schema with some special notation (F for “fish”, M for “meat”, V for “vegetarian”): F or V or M, not M, not F =⇒ V. (2.1) This formal view has many benefits: one schema stands for a wide range of inferences, for it does not matter what we put for F , V and M . Inferences often come to the surface especially vividly in puzzles, where we exercise our logical abilities just for the fun of it. 
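The waiter's inference in schema (2.1) can be mimicked mechanically: start with all possible ways of distributing the plates and strike out the ones excluded by each answer. Here is a minimal Python sketch of that process of elimination (the encoding and variable names are ours, not the book's):

```python
from itertools import permutations

dishes = ["Fish", "Vegetarian", "Meat"]
people = ["Father", "Mother", "You"]

# All 6 ways of putting the three plates in front of the three people.
options = [dict(zip(people, order)) for order in permutations(dishes)]

# Answer to "Who ordered the meat?": you did.
options = [o for o in options if o["You"] == "Meat"]
# Answer to "Who has the fish?": Father.
options = [o for o in options if o["Father"] == "Fish"]

# Only one option remains, so the last plate is forced.
print(options)  # [{'Father': 'Fish', 'Mother': 'Vegetarian', 'You': 'Meat'}]
```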
Think of successive stages in the solution of a 3 × 3 Sudoku puzzle, produced by applying the two basic rules that each of the 9 positions must have a digit, but no digit occurs twice on a row or column: (2.2) Each successive diagram displays a bit more explicit information about the solution, which is already implicitly determined by the initial placement of the two digits 1, 2. And the driving mechanism for these steps is exactly our Restaurant inference. Think of the step from the first to the second picture. The top right dot is either 1, 2 or 3. It is not 1. It is not 2. Therefore, it has to be 3. But is much more information flow in this Restaurant scene. Before his final inference, the waiter first actively sought to find out enough facts by another typical informationproducing act, viz. asking a question. And the answers to his two questions were also crucial. The essence of this second process is a form of computation on information states. During a conversation, information states of people – singly, and in groups – change over time, triggered by communicative events. The Restaurant scenario starts with an initial information state consisting of six options, all the ways in which three plates can be distributed over three people (M F V, M V F, ...). The answer to the first question then reduces this to two (the remaining orders F V , V F ), and the answer to the second question reduces this to one, zooming in on just the actual situation (for convenience, assume it is M F V ). This may be pictured as a diagram (‘video’) of successive updates: 2.2. INFERENCE PATTERNS, VALIDITY, AND INVALIDITY MFV MVF first answer FMV FVM VMF 2-3 second answer MFV VFM MVF MFV 2 (2.3) 1 6 2.2 Inference Patterns, Validity, and Invalidity Consider the following statement from your doctor: If you take my medication, you will get better. But you are not taking my medication. (2.4) So, you will not get better. Here the word ‘so’ (or ‘therefore’, ‘thus’, etc.) suggests the drawing of a conclusion from two pieces of information: traditionally called the ‘premises’. We call this an act of inference. Now, as it happens, this particular inference is not compelling. The conclusion might be false even though the two premises are true. You might get better by taking that greatest medicine of all (but so hard to swallow for modern people): just wait. Relying on a pattern like this might even be pretty dangerous in some scenarios: If I resist, the enemy will kill me. But I am not resisting. (2.5) So, the enemy will not kill me. Now contrast this with another pattern: If you take my medication, you will get better. But you are not getting better. (2.6) So, you have not taken my medication. This is valid: there is no way that the two stated premises can be true while the conclusion is false. It is time for a definition. Broadly speaking, 2-4 CHAPTER 2. PROPOSITIONAL LOGIC we call an inference valid if there is ‘transmission of truth’: in every situation where all the premises are true, the conclusion is also true. Stated differently but equivalently, an inference is valid if it has no ‘counter-examples’: that is, situations where the premises are all true while the conclusion is false. This is a crucial notion to understand, so we dwell on it a bit longer. What validity really tells us While this definition makes intuitive sense, it is good to realize that it may be weaker than it looks a first sight. 
For instance, a valid inference with two premises P1 , P2 , so C (2.7) allows many combinations of truth and falsity. If any premise is false, nothing follows about the conclusion. In particular, in the second doctor example, the rule may hold (the first premise is true), but you are getting better (false second premise), and you did take the medication (false conclusion). Of all eight true-false combinations for three sentences, validity rules out 1 (true-true-false)! The most you can say for sure thanks to the validity can be stated in one of two ways: (a) if all premises are true, then the conclusion is true (b) if the conclusion is false, then at least one premise is false (2.8) The first version is how people often think of logic: adding more things that you have to accept given what you have accepted already. But there is an equally important use of logic in refuting assertions, perhaps made by your opponents. You show that some false consequence follows, and then cast doubt on the original assertion. The second formulation says exactly how this works. Logical inferences also help us see what things are false — or maybe more satisfyingly, refute someone else. But note the subtlety: a false conclusion does not mean that all premises were false, just that at least one is. Detecting this bad apple in a basket may still take further effort. To help you understand both aspects of validity, consider the tree below: representing a ‘complex argument’ consisting of individual inferences with capital letters for sentences, premises above the bar, and the conclusion below it. Each inference in the tree is valid: A A B C D E E A F B (2.9) G You are told reliably that sentence A is true and G is false. For which further sentences that occur in the tree can you now determine their truth and falsity? (The answer is that A, B, are true, C, D, E , G are false, while we cannot tell whether F is true or false.) 2.3. CLASSIFICATION, CONSEQUENCE, AND UPDATE 2-5 Inference patterns The next step in the birth of logic was the insight that the validity and invalidity here have to do with abstract patterns, the shapes of the inferences, rather than their specific content. Clearly, the valid second argument would also be valid in the following concrete form, far removed from doctors and medicine: If the enemy cuts the dikes, Holland will be inundated. Holland is not inundated. (2.10) So, the enemy has not cut the dikes. This form has variable parts (we have replaced some sentences by others), but there are also constant parts, whose meaning must stay the same, if the inference is to be valid. For instance, if we also replace the negative word ‘not’ by the positive word ‘indeed’, then we get the clearly invalid inference: If the enemy cuts the dikes, Holland will be inundated. Holland is indeed inundated. (2.11) So, the enemy has indeed cut the dikes. For counter-examples: the inundation may be due to faulty water management, rain, etc. To bring out the relevant shared underlying form of inferences, we need a notation for both fixed and variable parts. We do this by using variable letters for expressions that can be replaced by others in their linguistic category, plus special notation for key expressions that determine the inference, often called the logical constants. 
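The validity test described above ("no situation makes all premises true and the conclusion false") can be run mechanically. A small Python sketch for the two doctor patterns (2.4) and (2.6), with p standing for 'you take the medication' and q for 'you get better' (the encoding is ours, not the book's):

```python
from itertools import product

def implies(a, b):
    return (not a) or b

def valid(premises, conclusion):
    """Valid iff no valuation makes every premise true and the conclusion false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # found a counter-example
    return True

# (2.4): if p then q, not p / so: not q  -- invalid
print(valid([lambda p, q: implies(p, q), lambda p, q: not p],
            lambda p, q: not q))   # False

# (2.6): if p then q, not q / so: not p  -- valid (modus tollens)
print(valid([lambda p, q: implies(p, q), lambda p, q: not q],
            lambda p, q: not p))   # True
```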
2.3 Classification, Consequence, and Update

Classification The main ideas of propositional logic go back to Antiquity (the Stoic philosopher Chrysippus of Soli, c.280–c.207 BC), but its modern version starts in the nineteenth century, with the work of the British mathematician George Boole (1815–1864).

[Portraits: Chrysippus; George Boole.]

Our earlier examples were essentially about combinations of propositions (assertions expressed by whole sentences). From now on, we will indicate basic propositions by letters p, q, etcetera. A finite number of such propositions generates a finite set of possibilities, depending on which are true and which are false. For instance, with just p, q there are four true/false combinations, that we can write as

pq, pq̄, p̄q, p̄q̄    (2.12)

where p represents that p is true and p̄ that p is false. Thus, we are interested in a basic logic of this sort of classification. (Note that p̄ is not meant as a logical proposition here, so that it is different from the negation not-p that occurs in inferences that we will use below. The distinction will only become clear later.)

Drawing consequences Now consider our earlier examples of valid and invalid arguments. For instance, (a) the argument "from if-p-then-q and not-p to not-q" was invalid, whereas (b) the argument "from if-p-then-q, not-q to not-p" was valid. Our earlier explanation of validity for a logical consequence can now be sharpened up. In this setting, it essentially says the following: each of the above four combinations that makes the premises true must also make the conclusion true. You can check whether this holds by considering all cases in the relevant list that satisfy the premises. For instance, in the first case mentioned above,
Here is This rules out two more options, and wefirstare actual FMV FVM MFV MVF MFV a ‘video-clip’ of the successive information states, that get ‘updated’ by new information: VMF VFM 1 2 6 pq pq pq p or q pq not p pq pq pq (2.13) pq pq pq pq pq pq p!q pq ¬p pq pq Incidentally, you can now also see why the conclusion q is a valid inference from ‘p or q’ and ‘not p’. Adding the informationqthat q does not change the final information state, pq pq nothing is ruled out: q pq pq (2.14) But if adding the information that q does not change anything, this means that q is already true. So the truth of q is guaranteed by the fact that the two earlier updates have taken place. This must mean that q is logically implied by the earlier formulas. Exercise 2.1 Consider the case where there are three facts that you are interested in. You wake up, you open your eyes, and you ask yourself three things: “Have I overslept?”, “Is it raining?”, 2-8 CHAPTER 2. PROPOSITIONAL LOGIC “Are there traffic jams on the road to work?”. To find out about the first question, you have to check your alarm clock, to find out about the second you have to look out of the window, and to find out about the third you have to listen to the traffic info on the radio. We can represent these possible facts with three basic propositions, p, q and r, with p expressing “I have overslept”, q expressing “It is raining”, and r expressing “There are traffic jams.” Suppose you know nothing yet about the truth of your three facts. What is the space of possibilities? Exercise 2.2 (Continued from previous exercise.) Now you check your alarm clock, and find out that you have not overslept. What happens to your space of possibilities? Toward a system Once we have a system in place for these tasks, we can do many further things. For instance, instead of asking whether a given inference is valid, we can also just look at given premises, and ask what would be a most informative conclusion. Here is a case that you can think about (it is used as a basic inference step to program computers that perform reasoning automatically): Exercise 2.3 You are given the information that p-or-q and (not-p)-or-r. What can you conclude about q and r? What is the strongest valid conclusion you can draw? (A statement is stronger than another statement if it rules out more possibilities.) A precise system for the above tasks can also be automated, and indeed, propositional logic is historically important also for its links with computation and computers. Computers become essential with complex reasoning tasks, that require many steps of inference or update of the above simple kinds, and logical systems are close to automated deduction. But as we shall see later in Section 2.10, there is even a sense in which propositional logic is the language of computation, and it is tied up with deep open problems about the nature of computational complexity. But the start of our story is not in computation, but in natural language. We will identify the basic expressions that we need, and then sharpen them up in a precise notation. 2.4 The Language of Propositional Logic Reasoning about situations involves complex sentences with the ‘logical connectives’ of natural language, such as ‘not’, ‘and’, ‘or’ and ‘if .. then’. These are not the only expressions that drive logical reasoning, but they do form the most basic level. 
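Before the formal language is developed further, here is a small illustrative sketch, in Python, of the update-by-elimination idea just described: a situation is a valuation, an information state is a set of situations, and each incoming formula shrinks the state. This is not code from the textbook; the helper names (valuations, update) and the trick of passing formulas as Python functions are my own choices for illustration.

```python
from itertools import product

def valuations(atoms):
    """All situations: one dict per true/false combination of the atoms."""
    return [dict(zip(atoms, row)) for row in product([True, False], repeat=len(atoms))]

def update(state, formula):
    """Eliminate the valuations in the current state that falsify the new information."""
    return [v for v in state if formula(v)]

# The party example: p = "Mary comes", q = "John comes".
state = valuations(["p", "q"])                      # no information yet: 4 situations
state = update(state, lambda v: v["p"] or v["q"])   # told: p or q   -> 3 situations remain
state = update(state, lambda v: not v["p"])         # told: not p    -> 1 situation remains
print(state)                                        # [{'p': False, 'q': True}]

# Exercise 2.3: what follows from (p or q) and ((not p) or r)?
state = valuations(["p", "q", "r"])
state = update(state, lambda v: v["p"] or v["q"])
state = update(state, lambda v: (not v["p"]) or v["r"])
print(all(v["q"] or v["r"] for v in state))         # True: q-or-r holds in every remaining situation
```

The final check mirrors the definition of valid conclusion used above: a formula is a consequence of the accumulated information exactly when it is true in every situation that the updates have left open.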
We could stay close to natural language itself to define our system (traditional logicians often did), but it has become clear over time that working with well-chosen notation makes things much clearer, and easier to manipulate. So, just like mathematicians, logicians use formal notations to improve understanding and facilitate computation. 2.4. THE LANGUAGE OF PROPOSITIONAL LOGIC 2-9 From natural language to logical notation As we have seen in Section 2.3, logical forms lie behind the valid inferences that we see around us in natural language. So we need a good notation to bring them out. For a start, we will use special symbols for the key logical operator words: Symbol In natural language Technical name ¬ not negation ∧ and conjunction ∨ or disjunction → if ... then implication ↔ if and only if equivalence (2.15) Other notations occur in the literature, too: some dialects have & for ∧, and ≡ for ↔. We write small letters for basic propositions p, q, etcetera. For arbitrary propositions, which may contain connectives as given in the table (2.15), we write small Greek letters ϕ, ψ, χ, etc. Inclusive and exclusive disjunction The symbol ∨ is for inclusive disjunction, as in ‘in order to pass the exam, question 3 or question 4 must have been answered correctly’. Clearly, you don’t want to be penalized if both are correct! This is different from the exclusive disjunction (most often written as ⊕), as in ‘you can marry Snowwhite or Cinderella’. This is not an invitation to marry both at the same time. When we use the word ‘disjunction’ without further addition we mean the inclusive disjunction. Now we can write logical forms for given assertions, as ‘formulas’ with these symbols. Consider a card player describing the hand of her opponent: Sentence “He has an Ace if he does not have a Knight or a Spade” Logical formula ¬(k ∨ s) → a It is useful to see this process of formalization as something that is performed in separate steps, for example, as follows. In cases where you are in doubt about the formalization of a phrase in natural language, you can always decide to ‘slow down’ to such a stepwise analysis, to find out where the crucial formalization decision is made. 2-10 CHAPTER 2. PROPOSITIONAL LOGIC He has an Ace if he does not have a Knight or a Spade, if (he does not have a Knight or a Spade), then (he has an Ace), (he does not have a Knight or a Spade) → (he has an Ace), not (he has a Knight or a Spade) → (he has an Ace), ¬ (he has a Knight or a Spade) → (he has an Ace), ¬ ((he has a Knight) or (he has a Spade)) → (he has an Ace), ¬ ((he has a Knight) ∨ (he has a Spade)) → (he has an Ace), ¬(k ∨ s) → a In practice, one often also sees mixed notations where parts of sentences are kept intact, with just logical keywords in formal notation. This is like standard mathematical language, that mixes symbols with natural language. While this mixing can be very useful (the notation enriches the natural language, and may then be easier to absorb in cognitive practice), you will learn more here by looking at the extreme case where the whole sentence is replaced by a logical form. Ambiguity The above process of taking natural language to logical forms is not a routine matter. There can be quite some slack, with genuine issues of interpretation. In particular, natural language sentences can be ambiguous, having different interpretations. 
For instance, another possible logical form for the card player’s assertion is the formula (¬k ∨ s) → a (2.16) Check for yourself that this says something different from the above. One virtue of logical notation is that we see such differences at a glance: in this case, by the placement of the brackets, which are auxiliary devices that do not occur as such in natural language (though it has been claimed that some actual forms of expression do have ‘bracketing functions’). Sometimes, the logical form of what is stated is even controversial. According to some people, ‘You will get a slap (s), unless you stop whining (¬w)’ expresses the implication w → s. According to others, it expresses the equivalence w ↔ s. Especially, negations in natural language may quickly get hard to grasp. Here is a famous test question in a psychological experiment that many people have difficulty with. How many negations are there, and what does the stacking of negations mean in the following sentence: “Nothing is too trivial to be ignored?” Formal language and syntactic trees Logicians think of the preceding notations, not just as a device that can be inserted to make natural language more precise, but as something that is important on its own, namely, an artificial or formal language. You can think of formulas in such a language as being constructed, starting from basic propositions, often indicated by letters p, q, etcetera, and then applying logical operations, with brackets added to secure unambiguous readability. 2.4. THE LANGUAGE OF PROPOSITIONAL LOGIC 2-11 Example 2.4 The formula ((¬p ∨ q) → r) is created stepwise from proposition letters p, q, r by applying the following construction rules successively: (a) from p, create ¬p, (b) from ¬p and q, create (¬p ∨ q) (c) from (¬p ∨ q) and r, create ((¬p ∨ q) → r) This construction may be visualized in trees that are completely unambiguous. Here are trees for the preceding example plus a variant that we already noted above. Mathematically, bracket notation and tree notation are equivalent — but their cognitive appeal differs, and trees are widely popular in mathematics, linguistics, and elsewhere: ((¬p ∨ q) → r) (¬p ∨ q) ¬p p r (¬(p ∨ q) → r) ¬(p ∨ q) r (p ∨ q) q p q This example has prepared us for the formal presentation of the language of propositional logic. There are two ways to go about this, they amount to the same: an ‘inductive definition’ (for this technical notion, see Appendix A). Here is one way: Every proposition letter (p, q, r, . . .) is a formula. If ϕ is a formula, then ¬ϕ is also a formula. If ϕ1 and ϕ2 are formulas, then (ϕ1 ∧ ϕ2 ), (ϕ1 ∨ ϕ2 ), (ϕ1 → ϕ2 ) and (ϕ1 ↔ ϕ2 ) are also formulas. Nothing else is a formula. We can now clearly recognize that the way we have constructed the formula in the example above is exactly according to this pattern. That is merely a particular instance of the above definition. The definition is formulated in more abstract terms, using the formula variables ϕ1 and ϕ2 . An even more abstract specification, but one that amounts to exactly the same inductive definition, is the so-called BNF specification of the language of propositional logic. BNF stands for ‘Backus Naur Form’, after the computer scientists John Backus and Peter Naur who introduced this device for the syntax of programming languages. Definition 2.5 (Language of propositional logic) Let P be a set of proposition letters and let p ∈ P . ϕ ::= p | ¬ϕ | (ϕ ∧ ϕ) | (ϕ ∨ ϕ) | (ϕ → ϕ) | (ϕ ↔ ϕ) 2-12 CHAPTER 2. 
PROPOSITIONAL LOGIC We should read such a definition as follows. In the definition we define objects of the type ‘formula in propositional logic’, in short: formulas. The definition starts by stating that every atomic proposition is of that type, i.e., is a formula. Then it says that if an object is of type ϕ, then ¬ϕ is also of type ϕ. Note that it does not say that ¬ϕ is the same formula ϕ. It merely says that both can be called ‘formula’. This definition then helps us to construct concrete formulas step by step, as in the previous example. Backus Naur form is an example of linguistic specification. In fact, BNF is a computer science re-invention of a way to specify languages that was proposed in 1956 by the linguist Noam Chomsky. In practice we often do not write the parentheses, and we only keep them if their removal would make the expression ambiguous, as in p ∨ q ∧ r. This can mean ((p ∨ q) ∧ r) but also (p ∨ (q ∧ r)) and that makes quite a difference. The latter is already true if only p is true, but the former requires r to be true. Or take a natural language example: “Haddock stays sober or he drinks and he gets angry.” Exercise 2.6 Write in propositional logic: • I will only go to school if I get a cookie now. • John and Mary are running. • A foreign national is entitled to social security if he has legal employment or if he has had such less than three years ago, unless he is currently also employed abroad. Exercise 2.7 Which of the following are formulas in propositional logic: • p → ¬q • ¬¬ ∧ q ∨ p • p¬q Exercise 2.8 Construct trees for the following formulas: • (p ∧ q) → ¬q • q ∧ r ∧ s ∧ t (draw all possible trees: think of bracket arrangements). Exercise 2.9 From the fact that several trees are possible for q∧r∧s∧t, we see that this expression can be read in more than one way. Is this ambiguity harmful or not? Why (not)? If you find this hard to answer, think of a natural language example. 2.5. SEMANTIC SITUATIONS, TRUTH TABLES, BINARY ARITHMETIC 2-13 A crucial notion: pure syntax Formulas and trees are pure symbolic forms, living at the level of syntax, as yet without concrete meaning. Historically, identifying this separate level of form has been a major abstraction step, that only became fully clear in 19th century mathematics. Most uses of natural language sentences and actual reasoning come with meanings attached, unless very late at parties. Pure syntax has become the basis for many connections between logic, mathematics, and computer science, where purely symbolic processes play an important role. Logic, language, computation, and thought The above pictures may remind you of parse trees in grammars for natural languages. Indeed, translations between logical forms and linguistic forms are a key topic at the interface of logic and linguistics, which has also started working extensively with mathematical forms in the 20th century. Connections between logical languages and natural language have become important in Computational Linguistics and Artificial Intelligence, for instance when interfacing humans with computers and symbolic computer languages. In fact, you can view our syntax trees in two ways, corresponding to two major tasks in these areas. ‘Top down’ they analyze complex expressions into progressively simpler ones: a process of parsing given sentences. But ‘bottom up’ they construct new sentences, a task called language generation. But also philosophically, the relation between natural and artificial languages has been long under debate. 
The more abstract level of logical form has been considered more ‘universal’ as a sort of ‘language of thought’, that transcends differences between natural languages (and perhaps even between cultures). You can also cast the relation as a case of replacement of messy ambiguous natural language forms by clean logical forms for reasoning and perhaps other purposes — which is what the founding fathers of modern logic had in mind, who claimed that natural languages are ‘systematically misleading’. But less radically, and perhaps more realistic from an empirical cognitive viewpoint, you can also see the relation as a way of creating hybrids of existing and newly designed forms of expression. Compare the way the language of mathematicians consists of natural language plus a growing fund of notations, or the way in which computer science extends our natural repertoire of expression and communication. 2.5 Semantic Situations, Truth Tables, Binary Arithmetic Differences in formal syntax often correspond to differences in meaning: the above two trees are an example. To explain this in more detail, we now need a semantics that, for a start, relates syntactic objects like formulas to truth and falsity in semantic situations. Thus, formulas acquire meaning in specific settings, and differences in meaning between formulas are often signalled by differences in truth in some situation. 2-14 CHAPTER 2. PROPOSITIONAL LOGIC Truth values and valuations for atoms As we said already, each set of proposition letters p, q, r, . . . generates a set of different situations, different ways the actual world might be, or different states that it could be in (all these interpretations make sense in applications). Three proposition letters generate 23 = 8 situations: {pqr, pqr, pqr, pqr, pqr, pqr, pqr, pqr} (2.17) Here proposition letters stand for ‘atomic propositions’, while logical operations form ‘molecules’. Of course this is just a manner of speaking, since what counts as ‘atomic’ in a given application is usually just our decision ‘not to look any further inside’ the proposition. A convenient mathematical view of situations is as functions from atomic propositions to truth values 1 (‘true’) and 0 (‘false’). For instance, the above situation pqr corresponds to the function sending p to 1, q to 0, and r to 1. An alternative notation for truth values is t and f , but we use numbers for their suggestive analogy with binary arithmetic (the heart of computers). We call these functions V valuations; V (ϕ) = 1 says that the formula ϕ is true in the situation (represented by) V , and V (ϕ) = 0 says that the formula ϕ is false in the situation V . For V (ϕ) = 1 we also write V |= ϕ and for V (ϕ) = 0 we also write V 6|= ϕ. One can read V |= ϕ as “V makes true ϕ”, or as “V satisfies ϕ” or “V is a model of ϕ”. The notation using |= will reappear in later chapters. Boolean operations on truth values Any complex sentence constructed from the relevant atomic proposition letters is either true or false in each situation. To see how this works, we first need an account for the meaning of the logical operations. This is achieved by assigning them Boolean operations on the numbers 0, 1, in a way that respects (as far as reasonable) their intuitive usage. For instance, if V (ϕ) = 0, then V (¬ϕ) = 1, and vice versa; and if V (ϕ) = 1, then V (¬ϕ) = 0, and vice versa. Such relations are easier formatted in a table. 
Definition 2.10 (Semantics of propositional logic) A valuation V is a function from proposition letters to truth values 0 and 1. The value or meaning of complex sentences is computed from the value of basic propositions according to the following truth tables. ϕ ψ ϕ∧ψ ϕ∨ψ ϕ→ψ ϕ↔ψ ϕ ¬ϕ 0 0 0 0 1 1 0 1 0 1 0 1 1 0 1 0 1 0 0 1 0 0 1 1 1 1 1 1 (2.18) Bold-face numbers give the truth values for all relevant combinations of argument values: four in the case of connectives with two arguments, two in the case of the connective with one argument, the negation. 2.5. SEMANTIC SITUATIONS, TRUTH TABLES, BINARY ARITHMETIC 2-15 Explanation The tables for negation, conjunction, disjunction, and equivalence are quite intuitive, but the same does not hold for the table for implication. The table for implication has generated perennial debate, since it does not match the word ‘implies’ in natural language very well. E.g., does having a false antecedent (condition) ϕ and a true consequent ψ really make the implication if-ϕ-then-ψ true? But we are just doing the best we can in our simple two-valued setting. Here is a thought that has helped many students. You will certainly accept the following assertion as true: ‘All numbers greater than 13 are greater than 12’. Put differently, ‘if a number n is greater than 13 (p), then n is greater than 12 (q)’. But now, just fill in different numbers n, and you get all combinations in the truth table. For instance, n = 14 motivates the truth-value 1 for p → q at pq, n = 13 motivates 1 for p → q at pq, and n = 12 motivates 1 for p → q at pq. A mismatch with natural language can actually be very useful. Conditionals are a ‘hot spot’ in logic, and it is a challenge to create systems that get closer to their behaviour. Propositional logic is the simplest treatment that exists, but other logical systems today deal with further aspects of conditionals in natural language and ordinary reasoning. You will see a few examples later in this course. Computing truth tables for complex formulas How exactly can we compute truth values for complex formulas? This is done using our tables by following the construction stages of syntax trees. Here is how this works. Take the valuation V with V (p) = V (q) = 1, V (r) = 0 and consider two earlier formulas: ((¬p ∨ q) → r 0 r 0 (¬p ∨ q) 1 ¬p p 0 1 q (¬(p ∨ q) → r) ¬(p ∨ q) 1 p r 0 (p ∨ q) 1 1 q 1 0 1 Incidentally, this difference in truth value explains our earlier point that these two variant formulas are different readings of the earlier natural language sentence. Computing in this manner for all valuations, we can systematically tabulate the truth value 2-16 CHAPTER 2. PROPOSITIONAL LOGIC behaviour of complex propositional formulas in all relevant situations: p q r ((¬p ∨ q) → r) (¬(p ∨ q) → r) 0 0 0 0 0 0 0 1 1 1 0 1 0 0 1 0 1 1 1 1 1 0 0 1 1 1 0 1 1 1 1 1 0 0 1 1 1 1 1 1 (2.19) Paying attention to the proper placement of brackets in formulas, you can compute truthtables step by step for all situations. As an example we take the second formula from (2.19). First, start with summing up the situations and copy the truth-values under the proposition letters as has been done in the following table. p q r (¬ (p ∨ q) → r) 0 0 0 · 0 · 0 · 0 0 0 1 · 0 · 0 · 1 0 1 0 · 0 · 1 · 0 0 1 1 · 0 · 1 · 1 1 0 0 · 1 · 0 · 0 1 0 1 · 1 · 1 · 1 1 1 0 · 1 · 0 · 0 1 1 1 · 1 · 1 · 1 (2.20) Then start filling in the truth-values for the first possible operator. 
Here it is the disjunction: it can be computed because the values of its arguments are given (you can also see this from the construction tree). (p ∨ q) gets value 0 if and only if both p and q have the value 0. The intermediate result is given in the first table in (2.21). The next steps are the 2.5. SEMANTIC SITUATIONS, TRUTH TABLES, BINARY ARITHMETIC 2-17 negation and then the conjunction. This gives the following results: (¬ (p ∨ q) → r) (¬ (p ∨ q) → r) (¬ (p ∨ q) → r) · 0 0 0 · 0 1 0 0 0 · 0 1 0 0 0 0 0 · 0 0 0 · 1 1 0 0 0 · 1 1 0 0 0 1 1 · 0 1 1 · 0 0 0 1 1 · 0 0 0 1 1 1 0 · 0 1 1 · 1 0 0 1 1 · 1 0 0 1 1 1 1 · 1 1 0 · 0 0 1 1 0 · 0 0 1 1 0 1 0 · 1 1 0 · 1 0 1 1 0 · 1 0 1 1 0 1 1 · 1 1 1 · 0 0 1 1 1 · 0 0 1 1 1 1 0 · 1 1 1 · 1 0 1 1 1 · 1 0 1 1 1 1 1 (2.21) One does not have to draw three separate tables. All the work can be done in a single table. We just meant to indicate the right order of filling in truth-values. Exercise 2.11 Construct truth tables for the following formulas: • (p → q) ∨ (q → p), • ((p ∨ ¬q) ∧ r) ↔ (¬(p ∧ r) ∨ q). Exercise 2.12 Using truth tables, investigate all formulas that can be readings of ¬p → q ∨ r (by inserting brackets in appropriate places), and show that they are not equivalent. If, Only If, If and Only If Here is a useful list of different ways to express implications: If p then q p if q p only if q p→q q→p p→q The third item on this list may come as a surprise. To see that the third item is correct, reflect on how one can check whether “We will help you only if you help us” is false. This can can happen only if “We help you” is true, but “You help us” is false. These uses of ‘if’ and ‘only if’ explain the use of the common abbreviation ‘if and only if’ for an equivalence. “We will help you if and only if you help us” states that “you help us” implies “we help you”, and vice versa. A common abbreviation for ’if and only if’ that we will use occasionally is iff. 2-18 2.6 CHAPTER 2. PROPOSITIONAL LOGIC Valid Consequence and Consistency We now define the general notion of valid consequence for propositional logic. It is a more precise version of the notion of a valid argument that we introduced on page 2-4. The notion runs over all possible valuations, and as we will see in a moment, we can use truth tables to check given inferences for validity. (In what follows, k can be any number. If it is k = 0, then there are no premises.) Definition 2.13 (Valid consequence) The inference from a finite set of premises ϕ1 , . . . , ϕk to a conclusion ψ is a valid consequence, something for which we write ϕ1 , . . . , ϕk |= ψ, if each valuation V with V (ϕ1 ) = . . . = V (ϕk ) = 1 also has V (ψ) = 1. Definition 2.14 (Logical equivalence) If ϕ |= ψ and ψ |= ϕ we say that ϕ and ψ are logically equivalent. Here it is useful to recall a warning that was already stated above. Do not confuse valid consequence with truth of formulas in a given situation: validity quantifies over truth in many situations, but it has no specific claim about truth or falsity of the premises and conclusions in the situation. Indeed, validity rules out surprisingly little in this respect: of all the possible truth/falsity combinations that might occur for premises and conclusion, it only rules out one case: viz. that all ϕi get value 1, while ψ gets value 0. Another point from Section 2.2 that is worth repeating here concerns the role of propositional inference in conversation and argumentation. 
Valid inference does not just help establish truth, but it can also achieve a refutation of claims: when the conclusion of a valid consequence is false, at least one of the premises must be false. But logic does not tell us in general which one: some further investigation may be required to find the culprit(s). It has been said by philosophers that this refutational use of logic may be the most important one, since it is the basis of learning, where we constantly have to give up current beliefs when they contradict new facts. Here is a simple example of how truth tables can check for validity: Example 2.15 (Modus Tollens) The simplest case of refutation depends on the rule of modus tollens: ϕ → ψ, ¬ψ |= ¬ϕ. 2.6. VALID CONSEQUENCE AND CONSISTENCY 2-19 Below you see the complete truth table demonstrating its validity: ϕ ψ ϕ → ψ ¬ψ ¬ϕ 1 1 1 0 0 1 0 0 1 0 0 1 1 0 1 0 0 1 1 1 (2.22) !! Of the four possible relevant situations here, only one satisfies both premises (the valuation on the fourth line), and we can check that there, indeed, the conclusion is true as well. Thus, the inference is valid. By contrast, when an inference is invalid, there is at least one valuation (i.e., a line in the truth table) where its premises are all true, and the conclusion false. Such situations are called counter-examples. The preceding table also gives us a counter-example for the earlier invalid consequence from ϕ → ψ, ¬ϕ to ¬ψ namely, the valuation on the third line where ϕ → ψ and ¬ϕ are true but ¬ψ is false. Please note that invalidity does not say that all valuations making the premises true make the conclusion false. The latter would express a valid consequence again, this time, the ‘refutation’ of ψ (since ¬ϕ is true iff ϕ is false): ϕ1 , . . . , ϕk |= ¬ψ (2.23) Satisfiability Finally, here is another important logical notion that gives another perspective on the same issues: Definition 2.16 (Satisfiable) A set of formulas X (say, ϕ1 , . . . , ϕk ) is satisfiable if there is a valuation that makes all formulas in X true. There is a close connection between satisfiability and consistency. Satisfiable versus Consistent A set of formulas that does not lead to a contradiction is called a consistent formula set. Here ‘leading to a contradiction’ refers to proof rules, so this is a definition in terms of proof theory. But it is really the other side of the same coin, for a set of formulas is consistent iff the set is satisfiable. Satisfiability gives the semantic perspective on consistency. 2-20 CHAPTER 2. PROPOSITIONAL LOGIC Instead of ‘not consistent’ we also say inconsistent, which says that there is no valuation where all formulas in the set are true simultaneously. Satisfiability (consistency) is not the same as truth: it does not say that all formulas in X are actually true, but that they could be true in some situation. This suffices for many purposes. In conversation, we often cannot check directly if what people tell us is true (think of their accounts of their holiday adventures, or the brilliance of their kids), but we often believe them as long as what they say is consistent. Also, as we noted in Chapter 1, a lawyer does not have to prove that her client is innocent, she just has to show that it is consistent with the given evidence that he is innocent. We can test for consistency in a truth table again, looking for a line making all relevant formulas true. This is like our earlier computations, and indeed, validity and consistency are related. 
For instance, it follows directly from our definitions that ϕ |= ψ if and only if {ϕ, ¬ψ} is not consistent. (2.24) Tautologies Now we look briefly at the ‘laws’ of our system: Definition 2.17 (Tautology) A formula ψ that gets the value 1 in every valuation is called a tautology. The notation for tautologies is |= ψ. Many tautologies are well-known as general laws of propositional logic. They can be used to infer quick conclusions or simplify given assertions. Here are some useful tautologies: Double Negation ¬¬ϕ ↔ ϕ De Morgan laws ¬(ϕ ∨ ψ) ↔ (¬ϕ ∧ ¬ψ) ¬(ϕ ∧ ψ) ↔ (¬ϕ ∨ ¬ψ) (2.25) Distribution laws (ϕ ∧ (ψ ∨ χ)) ↔ ((ϕ ∧ ψ) ∨ (ϕ ∧ χ)) (ϕ ∨ (ψ ∧ χ)) ↔ ((ϕ ∨ ψ) ∧ (ϕ ∨ χ)) Check for yourself that they all get values 1 on all lines of their truth tables. Tautologies are a special zero-premise case of valid consequences, but via a little trick, they encode all valid consequences. In fact, every valid consequence corresponds to a tautology, for it is easy to see that: ϕ1 , . . . , ϕk |= ψ if and only if (ϕ1 ∧ . . . ∧ ϕk ) → ψ is a tautology Exercise 2.18 Using a truth table, determine if the two formulas ¬p → (q ∨ r), ¬q together logically imply (1) p ∧ r. (2.26) 2.6. VALID CONSEQUENCE AND CONSISTENCY 2-21 (2) p ∨ r. Display the complete truth table, and use it to justify your answers to (1) and (2). Exercise 2.19 Show using a truth table that: • the inference from p → (q ∧ r), ¬q to ¬p is valid and • the inference from p → (q ∨ r), ¬q to ¬p is not valid. Exercise 2.20 Check if the following are valid consequences: (1) ¬(q ∧ r), q |= ¬r (2) ¬p ∨ ¬q ∨ r, q ∨ r, p |= r. Exercise 2.21 Give truth tables for the following formulas: (1) (p ∨ q) ∨ ¬(p ∨ (q ∧ r)) (2) ¬((¬p ∨ ¬(q ∧ r)) ∨ (p ∧ r)) (3) (p → (q → r)) → ((p → q) → (p → r)) (4) (p ↔ (q → r)) ↔ ((p ↔ q) → r) (5) ((p ↔ q) ∧ (¬q → r)) ↔ (¬(p ↔ r) → q) Exercise 2.22 Which of the following pairs are logically equivalent? Confirm your answer using truth tables: (1) ϕ → ψ and ψ → ϕ (2) ϕ → ψ and ¬ψ → ¬ϕ (3) ¬(ϕ → ψ) and ϕ ∨ ¬ψ (4) ¬(ϕ → ψ) and ϕ ∧ ¬ψ (5) ¬(ϕ ↔ ψ) and ¬ϕ ↔ ¬ψ (6) ¬(ϕ ↔ ψ) and ¬ϕ ↔ ψ (7) (ϕ ∧ ψ) ↔ (ϕ ∨ ψ) and ϕ ↔ ψ 2-22 2.7 CHAPTER 2. PROPOSITIONAL LOGIC Proof Proof: symbolic inference So far we tested inferences for validity with truth tables, staying close to the semantic meaning of the formulas. But a lot of inference happens automatically, by manipulating symbols. People usually do not reason via truth tables. They rather combine many simple proof steps that they already know, without going back to their motivation. The more such rules they have learnt, the faster their reasoning goes. Likewise, mathematicians often do formal calculation and proof via symbolic rules (think of your school algebra), and of course, computers have to do proof steps purely symbolically (as long as they have not yet learnt to think, like us, about what their actions might mean). Logic has many formal calculi that can do proofs, and later on, we will devote a whole chapter to this topic. But in this chapter, we give you a first taste of what it means to do proof steps in a formal calculus. There is a certain pleasure and surprise to symbolic calculation that has to be experienced. Below, we present an axiomatic system organized a bit like the famous geometry book of Euclid’s Elements from Antiquity. It starts from just a few basic principles (the axioms), after which chains of many proof steps, each one simple by itself, lead to more and more, sometimes very surprising theorems. 
Here is a modern axiomatic symbol game for logic: Definition 2.23 (Axiomatization) A proof is a finite sequence of formulas, where each formula is either an axiom, or follows from previous formulas in the proof by a deduction rule. A formula is a theorem if it occurs in a proof, typically as the last formula in the sequence. A set of axioms and rules defines an axiomatization for a given logic. The following is an axiomatization for propositional logic. The axioms are given in schematic form, with the formula variables that we have already seen. It means that we can put any specific formula in the place of these variables: (1) (ϕ → (ψ → ϕ)) (2) ((ϕ → (ψ → χ)) → ((ϕ → ψ) → (ϕ → χ))) (3) ((¬ϕ → ¬ψ) → (ψ → ϕ)) and there is only one deduction rule, the Modus Ponens that we have already encountered: • if ϕ and (ϕ → ψ) are theorems, then ψ is also a theorem. This axiomatization originates with the Polish logician Jan Łukasiewicz. In this system for propositional logic we may only use implication and negation symbols, and no other logical connectives, such as conjunctions. In our later section on expressivity it will be become clear why this restricted vocabulary is sufficient. 2.7. PROOF 2-23 Training in axiomatic deduction will not be a key focus of this course. Still, we do want you to experience the interest of performing purely syntactic proofs, as a sort of ‘symbol game’ that can be interpreted later. We give one more abstract logical example here, and also one closer to practice. Example 2.24 As an example of an axiomatic proof, we show that p → p is a theorem. This seems a self-evident tautology semantically, but now, the art is to derive it using only the rules of our game! In what follows we use well-chosen concrete instantiations of axiom schemas. For instance, the first line uses Axiom Schema 1 with the atomic proposition p for the variable formula ϕ and q → p for the variable formula ψ. And so on: 1. 2. 3. 4. 5. p → ((q → p) → p) Axiom (1) (p → ((q → p) → p)) → ((p → (q → p)) → (p → p)) Axiom (2) (p → (q → p)) → (p → p) Modus Ponens, from steps 1, 2 p → (q → p) Axiom (1) p→p Modus Ponens, from steps 3, 4 It takes some skill to find such proofs by oneself. But it is actually an exciting game to many students, precisely because of the purely symbolic nature of the steps involved. More general proofs can have certain assumptions, in addition to instances of axiom schemas. Here is an example closer to practice. Example 2.25 Use only Modus Ponens and suitable axioms to derive the solution to the following problem. You want to throw a party, respecting people’s incompatibilities. You know that: (a) John comes if Mary or Ann comes. (b) Ann comes if Mary does not come. (c) If Ann comes, John does not. Can you invite people under these constraints? There are several ways of solving this, including truth tables with update as in our next Section. But for now, can you prove what the solution must be? Here is a little help with the formal rendering: (i) ‘If Ann comes, John does not’ is the formula a → ¬j, (ii) ‘Ann comes if Mary does not come’: ¬m → a, (c) ‘John comes if Mary or Ann comes’: here you can rewrite to an equivalent conjunction ‘John comes if Mary comes’ and ‘John comes if Ann comes’ to produce two formulas that fall inside our language: a → j, m → j. Now try to give a proof just using the above axioms and rule for the solution, deriving successively that ¬a, m, j. Have fun! This concludes our first glimpse of a proof game with a fixed repertoire. 2-24 CHAPTER 2. 
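As a bridge to the soundness discussion that follows, here is a small Python sketch, again mine rather than the book's, that checks validity and tautology-hood by brute force over all valuations, in the spirit of the truth-table method. It confirms the Modus Tollens inference of Example 2.15, exhibits the invalid variant, and verifies concrete instances of two of the axiom schemas; the function names and the encoding of formulas as Python functions are illustrative assumptions only.

```python
from itertools import product

def valuations(atoms):
    """Every assignment of True/False to the given proposition letters."""
    return [dict(zip(atoms, row)) for row in product([True, False], repeat=len(atoms))]

def implies(a, b):
    """Truth-table meaning of the implication a -> b."""
    return (not a) or b

def valid(atoms, premises, conclusion):
    """phi_1, ..., phi_k |= psi: every valuation making all premises true makes psi true."""
    return all(conclusion(v) for v in valuations(atoms)
               if all(p(v) for p in premises))

def tautology(atoms, formula):
    """|= psi: true under every valuation (a zero-premise valid consequence)."""
    return valid(atoms, [], formula)

# Modus Tollens: p -> q, not-q |= not-p   (valid)
print(valid(["p", "q"],
            [lambda v: implies(v["p"], v["q"]), lambda v: not v["q"]],
            lambda v: not v["p"]))                                        # True

# The invalid variant: p -> q, not-p |= not-q
print(valid(["p", "q"],
            [lambda v: implies(v["p"], v["q"]), lambda v: not v["p"]],
            lambda v: not v["q"]))                                        # False (counter-example where p is false, q true)

# Instances of the axiom schemas come out as tautologies, as soundness requires.
print(tautology(["p", "q"], lambda v: implies(v["p"], implies(v["q"], v["p"]))))    # schema (1)
print(tautology(["p", "q"], lambda v: implies(implies(not v["p"], not v["q"]),
                                              implies(v["q"], v["p"]))))            # schema (3)
```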
System properties: soundness and completeness If all theorems of an axiomatic system are valid, the system is called sound, and conversely, if all valid formulas are provable theorems, the logic is called complete. Soundness seems an obvious requirement, as you want to rely totally on your proof procedure. The above system is sound, as you can see by noting that all axioms are tautologies, while Modus Ponens always takes tautologies to tautologies, that is, if ϕ and ϕ → ψ are tautologies, then ψ is also a tautology. Completeness is a different matter, and can be harder to obtain for a given system. (Does Euclid's system of axioms suffice for proving all truths of geometry? The answer took centuries of investigation and reformulation of the system.) The above proof system is indeed complete, and so are the proof systems that we will present in later chapters. But showing that completeness holds can be hard. The completeness of predicate logic, that we will discuss in later chapters, was one of the first deep results in modern logic, discovered by the then 23-year old Kurt Gödel in his 1929 dissertation. Axiomatic deduction is only one of many proof methods used in logic. Others include natural deduction (used a lot in logical Proof Theory) and resolution (used in many automated theorem provers). Chapter 9 in Part III of the book will take you much further into this area.

2.8 Information Update

With all this in place, we can now also define our earlier notions of information structure and information growth:

The information content of a formula ϕ is the set MOD(ϕ) of its models, that is, the valuations that assign the formula ϕ the truth-value 1.

You can think of this as the range of possible situations that ϕ leaves open. Note that the more possibilities are left open by a formula ϕ, the less information ϕ contains. Formulas that leave many possibilities open correspond to information states with much uncertainty. Formulas that leave just one possibility open (that have just one satisfying valuation) leave no uncertainty at all about what the situation is like.

Information update by elimination of possibilities Here is the dynamics that changes such information states:

An update with new information ψ reduces the current set of models X to the overlap or intersection of X and MOD(ψ). The valuations in X that assign the value 0 to ψ are eliminated.

Thus, propositional logic gives an account of basic cognitive dynamics, where information states (sets of satisfying valuations) shrink as new information comes in: growth of knowledge is loss of uncertainty. We have seen earlier how this worked with simple inferences like 'from p ∨ q, ¬p to q', if we assume that the premises update an initial information state of no information (maximal uncertainty: all valuations still present). As a second example, we return to an earlier question in Section 2.3 (see Exercise 2.3): What information is given by p ∨ q, ¬p ∨ r? Here are the update stages:

initial state: {pqr, pqr̄, pq̄r, pq̄r̄, p̄qr, p̄qr̄, p̄q̄r, p̄q̄r̄}
update with p ∨ q: {pqr, pqr̄, pq̄r, pq̄r̄, p̄qr, p̄qr̄}   (2.27)
update with ¬p ∨ r: {pqr, pq̄r, p̄qr, p̄qr̄}

We can conclude whatever is true in all of the remaining four states. One valid conclusion is the inclusive disjunction q ∨ r, and this is indeed the one used in the so-called...
Explanation & Answer

Please find the answer in the attachment below. (I was not able to finish number 11, since I couldn't understand how it connects with the notes; I hope you understand, as there was barely enough time to finish everything credibly.) Anyway, if you have any questions or need clarification, please feel free to reach out.

Surname 1

Name of student

Name of tutor

Course

Date

Problem Set 9
1 Syntax and Semantics of Monadic Predicate Logic

1.1 Identity and Substitution

1.1.1 The Identity Predicate

1. 4 points Exercise 4.36 of Logic in Action. Write out your formula with no
abbreviations.

• ∃x∃y∃z (¬ x = y ∧ ¬ x = z ∧ ¬ y = z ∧ ∀v (P v ↔ (v = x ∨ v = y ∨ v = z)))
2. 4 points Exercise 4.38(2) of Logic in Action. Demonstrate the difference in
meaning by providing a model in which the two formulas have different
truth values.

• In a model with two elements, one of which is A but not B and the other B but not A, the two formulas get different truth values: ∃!x(Ax ∨ Bx) is false, since both elements satisfy Ax ∨ Bx, while ∃!x Ax ∨ ∃!x Bx is true, since exactly one element satisfies Ax (and exactly one satisfies Bx).
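A quick way to double-check this kind of counter-model is to enumerate it explicitly. The sketch below (Python, purely illustrative; the domain names and the exactly_one helper are my own, with ∃! read in the standard way as "exactly one") evaluates both formulas in the two-element model just described.

```python
# Two-element model: d1 is A but not B, d2 is B but not A.
domain = ["d1", "d2"]
A = {"d1"}
B = {"d2"}

def exactly_one(pred):
    """The 'there exists exactly one' quantifier, evaluated over the domain."""
    return sum(1 for d in domain if pred(d)) == 1

# Exists-exactly-one x (Ax or Bx): false here, since both elements satisfy Ax or Bx.
formula1 = exactly_one(lambda d: d in A or d in B)

# (Exists-exactly-one x Ax) or (exists-exactly-one x Bx): true, since each disjunct holds.
formula2 = exactly_one(lambda d: d in A) or exactly_one(lambda d: d in B)

print(formula1, formula2)   # False True
```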


1.1.2 Substitution
3. 10 points total; 2.5 points each part Give a recursive definition of the set of
terms t that are substitutable for a variable x in a formula ϕ. That is, fill in
the question marks in the following template (note that we do not assume
that the metavariables ‘x’ and ‘y’ refer to distinct variables, so you need to
analyze the case where x = y and the case where x ≠ y in part (d)):
• (a) t is always substitutable for x in an atomic formula P(u) or s ˙= t (because there are no quantifiers in an atomic formula to capture a variable in t);
Solution
(Notation: write u[t/x] for the result of substituting the term t for the variable x in u; for instance, substituting the term biomom(Alex) for x in the term biomom(x) yields biomom(biomom(Alex)).)
For any term t and variable x, substitution into terms is defined recursively:
• if c ∈ Const, then c[t/x] = c;
• if y ∈ Var and y ≠ x, then y[t/x] = y;
• x[t/x] = t;
• if f ∈ Func and u ∈ Term, then f(u)[t/x] = f(u[t/x]).

• (b) t is substitutable for x in ¬ϕ iff t is substitutable for x in ϕ.
• (c) For # ∈ {∧, ∨, →, ↔}, t is substitutable for x in (ϕ # ψ) iff t is substitutable for x in ϕ and t is substitutable for x in ψ.
• (d) For Q ∈ {∀, ∃}, t is substitutable for x in Qyϕ iff either x does not occur free in Qyϕ, or else y does not occur in t and t is substitutable for x in ϕ. In particular, if x = y, then x has no free occurrence in Qyϕ, so t is trivially substitutable; if x ≠ y, then t is substitutable for x in Qyϕ iff x is not free in ϕ, or y does not occur in t and t is substitutable for x in ϕ.
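To see the recursion at work, here is a small Python sketch of the substitutability check. The representation of formulas as tagged tuples such as ('not', phi), ('and', phi, psi), ('forall', y, body) is a hypothetical encoding of my own, not anything from the course materials, and the lowercase-variable convention is likewise just an assumption for this example.

```python
def vars_in_term(t):
    """All variables occurring in a term: a variable, a constant, or ('f', arg, ...)."""
    if isinstance(t, tuple):                         # function application, e.g. ('f', 'x')
        return set().union(*(vars_in_term(a) for a in t[1:]))
    return {t} if t.islower() else set()             # convention here: lowercase = variable, uppercase = constant

def free_in(x, phi):
    """Does the variable x occur free in the formula phi?"""
    tag = phi[0]
    if tag == 'atom':
        return any(x in vars_in_term(t) for t in phi[2:])
    if tag == 'not':
        return free_in(x, phi[1])
    if tag in ('and', 'or', 'implies', 'iff'):
        return free_in(x, phi[1]) or free_in(x, phi[2])
    if tag in ('forall', 'exists'):
        y, body = phi[1], phi[2]
        return x != y and free_in(x, body)

def substitutable(t, x, phi):
    """Is the term t substitutable for the variable x in phi?  (Clauses (a)-(d) above.)"""
    tag = phi[0]
    if tag == 'atom':                                          # (a): always substitutable
        return True
    if tag == 'not':                                           # (b)
        return substitutable(t, x, phi[1])
    if tag in ('and', 'or', 'implies', 'iff'):                 # (c)
        return substitutable(t, x, phi[1]) and substitutable(t, x, phi[2])
    if tag in ('forall', 'exists'):                            # (d)
        y, body = phi[1], phi[2]
        if not free_in(x, phi):                                # nothing to substitute: trivially fine
            return True
        return y not in vars_in_term(t) and substitutable(t, x, body)

# y is not substitutable for x in  forall y . P(x, y):  the quantifier would capture it.
phi = ('forall', 'y', ('atom', 'P', 'x', 'y'))
print(substitutable('y', 'x', phi))    # False
print(substitutable('z', 'x', phi))    # True
```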

2 Syntax and Semantics of Predicate Logic
4. These problems concern translating English sentences into the language of
predicate logic:
(a) 3 points assume the domain of discourse is all dogs (d). Translate the
following sentences into predicate logic: (Use f for Fido, r for Rover,
and L for loves)
• Fido is loved by everyone.
∀x Lxf
• Rover loves everyone who loves Fido.
∀x (Lxf → Lrx)
• Fido and Rover love each other.
Lfr ∧ Lrf
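These translations can be sanity-checked in a toy model. The sketch below (Python, illustrative only; the two-dog domain and the choice of who loves whom are made up for the example) evaluates the three formulas directly over an explicit domain.

```python
# Toy model: the domain of discourse is the dogs Fido and Rover.
dogs = ["fido", "rover"]
# Made-up loves relation: everyone loves Fido, and Rover also loves himself.
loves = {("fido", "fido"), ("rover", "fido"), ("rover", "rover")}

def L(x, y):
    return (x, y) in loves

# Fido is loved by everyone:            forall x . L(x, fido)
print(all(L(x, "fido") for x in dogs))                                   # True

# Rover loves everyone who loves Fido:  forall x . (L(x, fido) -> L(rover, x))
print(all((not L(x, "fido")) or L("rover", x) for x in dogs))            # True

# Fido and Rover love each other:       L(fido, rover) and L(rover, fido)
print(L("fido", "rover") and L("rover", "fido"))                         # False in this particular model
```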


(b) 5 points for each of the following, specify an appropriate domain of
discourse, specify a translation key, and translate into predicate logic.
(Note: you do not have to understand what a sentence means before you
can attempt to translate it.)

• Cats that love dogs don't taunt dogs.
Domain: all animals. Key: Cx: x is a cat; Dx: x is a dog; Lxy: x loves y; Txy: x taunts y.
∀x ((Cx ∧ ∃y (Dy ∧ Lxy)) → ∀z (Dz → ¬Txz))
• There is a greatest natural number.
Domain: the natural numbers. Key: Gxy: x is greater than or equal to y.
∃x ∀y Gxy
• Friends of Fido's friends are Rover's friends.
Domain: all dogs. Key: f: Fido; r: Rover; Fxy: x is a friend of y.
∀x ∀y ((Fxf ∧ Fyx) → Fyr)
• There is no largest even number.
Domain: the natural numbers. Key: Ex: x is even; Gxy: x is greater than y.
¬∃x (Ex ∧ ∀y (Ey → Gxy)), equivalently ∀x (Ex → ∃y (Ey ∧ Gyx))
• Every apple is tastier than every orange.
Domain: all fruit. Key: Ax: x is an apple; Ox: x is an orange; Txy: x is tastier than y.
∀x ∀y ((Ax ∧ Oy) → Txy)

(c) 4 points translate the following sentences into predicate logical
formulas. Assume the domain of discourse is cats and dogs.
 Some cat doesn’t love all dogs.
∃X ∃y (Cx ∧ Dy ∧ ¬ Lxy)
 Every dog who loves all cats doesn’t
love every dog.
∀x((C x ∧ ∃y(Dy ∧ Lxy)) → ∃z(Dz ∧ Lzx))
 Every cat who loves a dog is purred at
by some cat.
∀x((C x ∧ ∀y (Dv→Lxy)) → ∃z(Cz ∧ Lz...

