Artificial Intelligence: Propositional Logic
Vibhav Gogate, The University of Texas at Dallas
Some material courtesy of Rina Dechter, Alex Ihler, Stuart Russell, Luke Zettlemoyer, and Dan Weld

Outline
• Representing knowledge using logic
  – Agents that reason logically
  – A knowledge-based agent

• Representing and reasoning with logic
  – Propositional logic
    • Syntax
    • Semantics
    • Validity and models
    • Rules of inference for propositional logic
    • Resolution
    • Complexity of propositional inference

• Reading: Russell and Norvig, Chapter 7

Knowledge bases

• Knowledge base = set of sentences in a formal language
• Declarative approach to building an agent (or other system):
  – Tell it what it needs to know
  – Then it can ask itself what to do - answers should follow from the KB
• Agents can be viewed at the knowledge level
  – i.e., what they know, regardless of how implemented
• Or at the implementation level
  – i.e., data structures in KB and algorithms that manipulate them

Knowledge representation
Defined by syntax and semantics.

[Figure: at the syntactic level, the computer's inference procedure derives conclusions from assertions (the knowledge base); semantics connects both to the real world, where facts imply other facts.]

Reasoning happens at the syntactic level. Example: x → y, y → z ⊢ x → z

The party example
• If Alex goes, then Beki goes: A ⇒ B
• If Chris goes, then Alex goes: C ⇒ A
• Beki does not go: ¬B
• Chris goes: C
• Query: Is it possible to satisfy all these conditions?
• Should I go to the party? (A brute-force check is sketched below.)
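A brute-force way to answer the query is to enumerate all eight truth assignments to A, B, C and test the four conditions. The Python sketch below is only an illustration of that check (the variable names are ours, not part of the slides).

```python
from itertools import product

# Party example: A ⇒ B, C ⇒ A, ¬B, C
satisfying = []
for A, B, C in product([True, False], repeat=3):
    ok = ((not A or B) and    # If Alex goes, then Beki goes
          (not C or A) and    # If Chris goes, then Alex goes
          (not B) and         # Beki does not go
          C)                  # Chris goes
    if ok:
        satisfying.append((A, B, C))

print(satisfying)   # [] -- no assignment satisfies all four conditions together
```

Since no assignment works, the conditions are jointly unsatisfiable: given the first three facts, Chris cannot go.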

Example of languages
• Programming languages:
  – Formal languages, not ambiguous, but cannot express partial information. Not expressive enough.
• Natural languages:
  – Very expressive but ambiguous, e.g., "small dogs and cats".
• Good representation language:
  – Both formal and able to express partial information, and can accommodate inference.
• Main approach used in AI: logic-based languages.

Wumpus World test-bed
• Performance measure
  – gold +1000, death -1000
  – -1 per step, -10 for using the arrow
• Environment
  – Squares adjacent to wumpus are smelly
  – Squares adjacent to pit are breezy
  – Glitter iff gold is in the same square
  – Shooting kills wumpus if you are facing it
  – Shooting uses up the only arrow
  – Grabbing picks up gold if in same square
  – Releasing drops the gold in same square
• Sensors: Stench, Breeze, Glitter, Bump, Scream
• Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot

Wumpus world characterization
• Fully observable? No – only local perception
• Deterministic? Yes – outcomes exactly specified
• Episodic? No – sequential at the level of actions
• Static? Yes – Wumpus and pits do not move
• Discrete? Yes
• Single-agent? Yes – Wumpus is essentially a natural feature

Aside: NetHack • WoW of my youth

Exploring a wumpus world
[Sequence of figure slides stepping through the agent's exploration of the wumpus world, updating what it knows after each percept.]


Propositional Logic


Knowledge Representation and Reasoning System


Logic

Logics are formal languages for representing information such that conclusions can be drawn
Syntax defines the sentences in the language
Semantics defines the "meaning" of sentences, i.e., the truth of a sentence in a world
E.g., the language of arithmetic:
  x + 2 ≥ y is a sentence; x2 + y > is not a sentence
  x + 2 ≥ y is true iff the number x + 2 is no less than the number y
  x + 2 ≥ y is true in a world where x = 7, y = 1
  x + 2 ≥ y is false in a world where x = 0, y = 6


Entailment and Models

KB |= α: knowledge base KB entails sentence α if and only if α is true in all worlds where KB is true
E.g., the KB containing "the Giants won" and "the Reds won" entails "Either the Giants won or the Reds won"
We say m is a model of a sentence α if α is true in m
M(α) is the set of all models of α
Then KB |= α if and only if M(KB) ⊆ M(α)
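To make the M(KB) ⊆ M(α) test concrete, here is a minimal Python sketch for the Giants/Reds example; the symbol names and the helper models_of are assumptions made for illustration.

```python
from itertools import product

symbols = ["GiantsWon", "RedsWon"]

def models_of(sentence):
    """Return the set of models (as frozensets of the true symbols) that satisfy `sentence`."""
    result = set()
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if sentence(model):
            result.add(frozenset(s for s in symbols if model[s]))
    return result

kb    = lambda m: m["GiantsWon"] and m["RedsWon"]   # "the Giants won" and "the Reds won"
alpha = lambda m: m["GiantsWon"] or m["RedsWon"]    # "Either the Giants won or the Reds won"

# KB |= alpha iff M(KB) ⊆ M(alpha)
print(models_of(kb) <= models_of(alpha))   # True
```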


Inference: Deriving Conclusions from a KB

KB ⊢i α: sentence α can be derived from KB by procedure i
Soundness: i is sound if whenever KB ⊢i α, it is also true that KB |= α
Completeness: i is complete if whenever KB |= α, it is also true that KB ⊢i α
Preview: we will define a logic (first-order logic) which is expressive enough to say almost anything of interest, and for which there exists a sound and complete inference procedure. That is, the procedure will answer any question whose answer follows from what is known by the KB.


Propositional logic: Syntax

Propositional logic is the simplest logic; it illustrates the basic ideas
The proposition symbols P1, P2, etc. are sentences
If S is a sentence, ¬S is a sentence
If S1 and S2 are sentences, S1 ∧ S2 is a sentence
If S1 and S2 are sentences, S1 ∨ S2 is a sentence
If S1 and S2 are sentences, S1 ⇒ S2 is a sentence
If S1 and S2 are sentences, S1 ⇔ S2 is a sentence
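One simple way to realize this grammar in code is to store each sentence as a nested structure whose first element names the connective. The encoding below is one possible sketch, not something fixed by the slides.

```python
# Atomic sentences are strings: "P1", "P2", ...
# Compound sentences are tuples: (operator, operand, ...), with operators
# "not", "and", "or", "=>", "<=>".

P1, P2 = "P1", "P2"

s1 = ("not", P1)                          # ¬P1
s2 = ("and", P1, P2)                      # P1 ∧ P2
s3 = ("or", s1, s2)                       # ¬P1 ∨ (P1 ∧ P2)
s4 = ("=>", s2, P1)                       # (P1 ∧ P2) ⇒ P1
s5 = ("<=>", P1, ("not", ("not", P1)))    # P1 ⇔ ¬¬P1
```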


Propositional logic: Semantics

Each model specifies true/false for each proposition symbol
E.g., A = True, B = True, C = False
Rules for evaluating truth with respect to a model m:
  ¬S is true iff S is false
  S1 ∧ S2 is true iff S1 is true and S2 is true
  S1 ∨ S2 is true iff S1 is true or S2 is true
  S1 ⇒ S2 is true iff S1 is false or S2 is true
    i.e., it is false iff S1 is true and S2 is false
  S1 ⇔ S2 is true iff S1 ⇒ S2 is true and S2 ⇒ S1 is true
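These truth rules translate directly into a recursive evaluator over the nested-tuple encoding sketched earlier; the function name pl_true and the encoding are again illustrative assumptions.

```python
def pl_true(sentence, model):
    """Evaluate a propositional sentence in a model (a dict mapping symbol -> bool)."""
    if isinstance(sentence, str):              # proposition symbol
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not pl_true(args[0], model)
    if op == "and":
        return pl_true(args[0], model) and pl_true(args[1], model)
    if op == "or":
        return pl_true(args[0], model) or pl_true(args[1], model)
    if op == "=>":                             # false only when premise true and conclusion false
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == "<=>":
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(f"unknown operator {op!r}")

# A ⇒ B in the model where A is true and B is false:
print(pl_true(("=>", "A", "B"), {"A": True, "B": False}))   # False
```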


Propositional inference: Enumeration method

Let α = A ∨ B and KB = (A ∨ C) ∧ (B ∨ ¬C)
Is it the case that KB |= α?
Check all possible models: α must be true wherever KB is true
Is it sound and complete?
Also called model checking
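A minimal sketch of the enumeration method for this example: enumerate every model over the symbols appearing in KB and α, and check that α holds wherever KB does. The name tt_entails echoes the textbook's TT-ENTAILS?, but the code itself is only an illustration.

```python
from itertools import product

def tt_entails(kb, alpha, symbols):
    """Return True iff kb entails alpha, by checking every model over `symbols`."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False            # a model of KB in which alpha is false
    return True

kb    = lambda m: (m["A"] or m["C"]) and (m["B"] or not m["C"])   # (A ∨ C) ∧ (B ∨ ¬C)
alpha = lambda m: m["A"] or m["B"]                                # A ∨ B

print(tt_entails(kb, alpha, ["A", "B", "C"]))   # True: KB |= α
```

The check is sound and complete for propositional logic, but it examines 2^n models for n symbols.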


Propositional Theorem Proving

Apply rules of inference directly to sentences to construct a proof of the desired sentence without consulting models.
If the number of models is large and the proof is short, we have a winner!
Important concepts for theorem proving:
  Logical equivalence
  Validity
  Satisfiability


Logical Equivalences


Validity and Satisfiability
A sentence is valid if it is true in all models
  e.g., A ∨ ¬A, A ⇒ A, (A ∧ (A ⇒ B)) ⇒ B
Validity is connected to inference via the Deduction Theorem:
  KB |= α if and only if (KB ⇒ α) is valid
A sentence is satisfiable if it is true in some model
  e.g., A ∨ B, C
A sentence is unsatisfiable if it is true in no models
  e.g., A ∧ ¬A
Satisfiability is connected to inference via the following:
  KB |= α if and only if (KB ∧ ¬α) is unsatisfiable
Fun facts:
  α is valid iff ¬α is unsatisfiable
  α is satisfiable iff ¬α is not valid
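The "fun facts" can be checked mechanically with the same enumeration idea; the helper names below are assumptions made for this sketch.

```python
from itertools import product

def all_models(symbols):
    for values in product([True, False], repeat=len(symbols)):
        yield dict(zip(symbols, values))

def is_valid(sentence, symbols):
    return all(sentence(m) for m in all_models(symbols))

def is_satisfiable(sentence, symbols):
    return any(sentence(m) for m in all_models(symbols))

excluded_middle = lambda m: m["A"] or not m["A"]    # A ∨ ¬A
contradiction   = lambda m: m["A"] and not m["A"]   # A ∧ ¬A

print(is_valid(excluded_middle, ["A"]))             # True
print(is_satisfiable(contradiction, ["A"]))         # False
# α is valid iff ¬α is unsatisfiable:
print(not is_satisfiable(lambda m: not excluded_middle(m), ["A"]))   # True
```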


Normal forms
Other approaches to inference use syntactic operations on sentences, often expressed in standardized forms.
Conjunctive Normal Form (CNF, universal): conjunction of disjunctions of literals (the disjunctions are called clauses)
  E.g., (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)
Disjunctive Normal Form (DNF, universal): disjunction of conjunctions of literals (the conjunctions are called terms)
  E.g., (A ∧ B) ∨ (A ∧ ¬C) ∨ (A ∧ ¬D) ∨ (¬B ∧ ¬C) ∨ (¬B ∧ ¬D)
Horn Form (restricted): conjunction of Horn clauses (clauses with ≤ 1 positive literal)
  E.g., (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)
  Often written as a set of implications: B ⇒ A and (C ∧ D) ⇒ B


Proof methods

Proof methods divide into (roughly) two kinds:
  Model checking
    – truth table enumeration (sound and complete for propositional logic)
    – heuristic search in model space (sound but incomplete), e.g., the GSAT algorithm (Ex. 6.15)
  Application of inference rules
    – legitimate (sound) generation of new sentences from old
    – proof = a sequence of inference rule applications
    – can use inference rules as operators in a standard search algorithm


Inference rules for propositional logic

Resolution (for CNF): complete for propositional logic
  From α ∨ β and ¬β ∨ γ, infer α ∨ γ
Modus Ponens (for Horn Form): complete for Horn KBs
  From α1, …, αn and (α1 ∧ ⋯ ∧ αn) ⇒ β, infer β

Can be used with forward chaining or backward chaining
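In code, a CNF clause can be represented as a set of literals (a positive literal as its symbol, a negative one as the symbol prefixed with "~"); resolution then just drops one complementary pair and unions what remains. This frozenset encoding is an assumption for illustration.

```python
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ci, cj):
    """Return every clause obtainable by resolving ci and cj on one complementary pair."""
    resolvents = []
    for lit in ci:
        if negate(lit) in cj:
            resolvents.append(frozenset((ci - {lit}) | (cj - {negate(lit)})))
    return resolvents

# (α ∨ β) and (¬β ∨ γ) resolve to (α ∨ γ):
print(resolve(frozenset({"alpha", "beta"}), frozenset({"~beta", "gamma"})))
# [frozenset({'alpha', 'gamma'})]  (literal order inside the frozenset may vary)
```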


Proof by Resolution

Prove KB |= α by contradiction: prove that KB ∧ ¬α is unsatisfiable.
Convert KB ∧ ¬α to CNF Φ.
Apply resolution: each pair of clauses that contains complementary literals is resolved to produce a new clause, which is added to the set Φ. The process continues until one of the following happens:
  1. There are no clauses that can be added, in which case the KB does not entail α
  2. Two clauses resolve to yield the empty clause, in which case the KB entails α
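A compact sketch of the whole refutation loop over a set of CNF clauses (clauses as frozensets of string literals, as above; the helper names are assumptions): keep resolving pairs until the empty clause appears or no new clauses can be added.

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ci, cj):
    out = []
    for lit in ci:
        if negate(lit) in cj:
            out.append(frozenset((ci - {lit}) | (cj - {negate(lit)})))
    return out

def pl_resolution(clauses):
    """clauses: CNF of KB ∧ ¬α.  Returns True iff the empty clause is derivable (KB |= α)."""
    clauses = set(clauses)
    while True:
        new = set()
        for ci, cj in combinations(clauses, 2):
            for resolvent in resolve(ci, cj):
                if not resolvent:           # empty clause: contradiction found
                    return True
                new.add(resolvent)
        if new <= clauses:                  # no new clauses can be added
            return False
        clauses |= new

# Party example: KB = {A ⇒ B, C ⇒ A, ¬B}; query α = ¬C, so ¬α = C.
# CNF of KB ∧ ¬α: (¬A ∨ B), (¬C ∨ A), (¬B), (C)
print(pl_resolution([frozenset({"~A", "B"}), frozenset({"~C", "A"}),
                     frozenset({"~B"}), frozenset({"C"})]))   # True: KB |= ¬C
```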


Conversion to CNF
Method:
  1. Eliminate ⇔, replacing α ⇔ β with the two formulas α ⇒ β and β ⇒ α
  2. Eliminate ⇒, replacing α ⇒ β by ¬α ∨ β
  3. Move ¬ inwards:
     ¬(¬α) = α
     ¬(α ∧ β) = ¬α ∨ ¬β
     ¬(α ∨ β) = ¬α ∧ ¬β
  4. Distribute ∨ over ∧ where possible:
     α ∨ (β ∧ γ) = (α ∨ β) ∧ (α ∨ γ)

Example: Convert the following KB to CNF: A ⇔ (B ∨ E), E ⇒ D, C ∧ F ⇒ ¬B, E ⇒ B, B ⇒ F, B ⇒ C
Prove by resolution that ¬A ∧ ¬B is entailed by the KB.
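As a worked instance of the four steps, the first sentence of the example KB converts as follows:
  A ⇔ (B ∨ E)
  Step 1: (A ⇒ (B ∨ E)) ∧ ((B ∨ E) ⇒ A)
  Step 2: (¬A ∨ B ∨ E) ∧ (¬(B ∨ E) ∨ A)
  Step 3: (¬A ∨ B ∨ E) ∧ ((¬B ∧ ¬E) ∨ A)
  Step 4: (¬A ∨ B ∨ E) ∧ (¬B ∨ A) ∧ (¬E ∨ A)
yielding the three clauses {¬A, B, E}, {A, ¬B}, {A, ¬E}.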


Proof by Forward and Backward Chaining

Given: a KB containing Horn clauses.
Algorithm (Forward Chaining):
  If all premises (left-hand side) of an implication are known, then its conclusion is added to the set of known facts.
  The process continues until the query q is added or until no further inferences can be made.
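A minimal forward-chaining sketch over Horn clauses stored as (premises, conclusion) pairs; the KB reuses the implications B ⇒ A and (C ∧ D) ⇒ B from the normal-forms slide, plus the facts C and D. The representation and function name are assumptions.

```python
def forward_chain(rules, facts, query):
    """rules: list of (premise_list, conclusion); facts: set of known symbols."""
    known = set(facts)
    changed = True
    while changed and query not in known:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in known and all(p in known for p in premises):
                known.add(conclusion)        # all premises known: add the conclusion
                changed = True
    return query in known

rules = [(["B"], "A"),           # B ⇒ A
         (["C", "D"], "B")]      # (C ∧ D) ⇒ B
print(forward_chain(rules, {"C", "D"}, "A"))   # True: C, D give B, which gives A
```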


Proof by Forward and Backward Chaining: Example

AND-OR graphs: multiple links joined by an arc indicate a conjunction – every link must be proved. Multiple links without an arc indicate a disjunction – any link can be proved.


Model Checking

1. Convert KB ∧ ¬α to CNF
2. Then prove that the CNF is unsatisfiable (KB entails α) or find a solution to the CNF (KB does not entail α)
   Use DPLL + advanced SAT schemes (backtracking search)
   Use WalkSAT (local search with random restarts and random walks)
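For reference, a bare-bones DPLL sketch over the frozenset clause encoding used earlier: unit propagation plus splitting on a literal, with pure-literal elimination and all modern refinements omitted. This is only an assumed illustration of such a checker, not the algorithm as presented in class.

```python
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def simplify(clauses, lit):
    """Assume `lit` is true: drop satisfied clauses and remove the complementary literal."""
    out = []
    for c in clauses:
        if lit in c:
            continue
        out.append(c - {negate(lit)})
    return out

def dpll(clauses):
    """Return True iff the CNF clause set (list of frozensets of literals) is satisfiable."""
    if not clauses:
        return True                          # all clauses satisfied
    if any(not c for c in clauses):
        return False                         # an empty clause: conflict
    for c in clauses:                        # unit propagation
        if len(c) == 1:
            (lit,) = c
            return dpll(simplify(clauses, lit))
    lit = next(iter(clauses[0]))             # otherwise split on some literal
    return dpll(simplify(clauses, lit)) or dpll(simplify(clauses, negate(lit)))

# CNF of KB ∧ ¬α from the resolution example: unsatisfiable, so KB |= ¬C
print(dpll([frozenset({"~A", "B"}), frozenset({"~C", "A"}),
            frozenset({"~B"}), frozenset({"C"})]))   # False
```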


Summary
Logical agents apply inference to a knowledge base to derive new information and make decisions
Basic concepts of logic:
  syntax: formal structure of sentences
  semantics: truth of sentences wrt models
  entailment: necessary truth of one sentence given another
  inference: deriving sentences from other sentences
  soundness: derivations produce only entailed sentences
  completeness: derivations can produce all entailed sentences
Propositional logic suffices for some of these tasks
The truth table method is sound and complete for propositional logic
Resolution is sound and complete
