
Knowledge, reasoning, and planning

Knowledge-Based Agents
Outline

• Knowledge-Based Agents
• Logical Agents
• Large Language Models
• Probabilistic Reasoning
Reality vs. Knowledge Representation
(Diagram: facts about reality become knowledge through learning.)

• Facts: Sentences we know to be true.
• Possible worlds: all worlds/models that are consistent with the facts we know (compare with belief state).
• Learning new facts reduces the number of possible worlds.
• Entailment: A new sentence logically follows from what we already know.
Knowledge-Based Agents

Knowledge base: domain-specific content.
Inference engine: domain-independent algorithms that find new sentences using entailment.

• Knowledge base (KB) = set of facts, e.g., a set of sentences in a formal language that are known to be true.
• Declarative approach to building an agent: define what it needs to know in its KB.
• Separation between data (knowledge) and program (inference).
• Actions are based on knowledge (sentences + inferred sentences) + an objective function. E.g., the agent knows the effects of 5 possible actions and chooses the action with the largest utility.
Generic Knowledge-Based Agent

(Diagram: an inference engine connects the KB, which starts with prior knowledge, to percepts and actions; the agent learns from its percepts and actions.)

At each time step t, the agent (sketched in code below):
• Memorizes the percept at time t.
• Asks for a logical action given an objective.
• Records the action taken at time t.
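A minimal sketch of this loop in Python. The tell/ask knowledge-base interface and the sentence strings are illustrative assumptions in the spirit of the AIMA KB-AGENT pseudocode, not a fixed API:

class StubKB:
    """Placeholder knowledge base: tell() stores sentences; a real
    ask() would run an entailment query over the stored sentences."""
    def __init__(self, prior_knowledge):
        self.sentences = list(prior_knowledge)
    def tell(self, sentence):
        self.sentences.append(sentence)
    def ask(self, query):
        return "noop"  # stand-in answer; real inference goes here

class KBAgent:
    """Generic knowledge-based agent loop."""
    def __init__(self, kb):
        self.kb = kb   # KB initialized with prior knowledge
        self.t = 0
    def step(self, percept):
        self.kb.tell(f"Percept({percept}, t={self.t})")   # memorize percept
        action = self.kb.ask(f"BestAction(t={self.t})")   # ask for an action
        self.kb.tell(f"Action({action}, t={self.t})")     # record the action
        self.t += 1
        return action

agent = KBAgent(StubKB(["prior knowledge goes here"]))
print(agent.step("breeze"))  # 'noop' from the stub KB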
Different Languages to Represent Knowledge

+ Natural language: word patterns representing facts, objects, relations, … ???
Outline

• Knowledge-Based Agents
• Logical Agents
• Large Language Models
• Probabilistic Reasoning
Logical Agents

• Facts are logical sentences that are known to be true.
• Inference: generate new sentences that are entailed by all known sentences.
• Implementation: typically using Prolog, a declarative logic programming language that runs queries over the program (= the knowledge base); see the sketch below.

Issues:
• Inference is computationally very expensive.
• Logic cannot deal with uncertainty.
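To make the declarative flavor concrete, here is a minimal Prolog-style sketch in Python. The rule encoding and the ask() function are illustrative assumptions; they mimic backward chaining over propositional definite clauses, not Prolog's actual implementation:

# Definite clauses: each rule reads "if all premises hold, the head holds".
rules = {
    "mortal": ["human"],   # human => mortal
    "human": ["greek"],    # greek => human
}
facts = {"greek"}          # sentences known to be true

def ask(query):
    """Backward chaining: a query holds if it is a known fact, or if
    every premise of a rule concluding it holds recursively."""
    if query in facts:
        return True
    premises = rules.get(query)
    return premises is not None and all(ask(p) for p in premises)

print(ask("mortal"))  # True: greek => human => mortal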
Outline

• Knowledge-Based Agents
• Logical Agents
• Large Language Models*
• Probabilistic Reasoning

* This is not in the AIMA textbook!


LLMs - Large Language Models

+ Natural language: word patterns representing facts, objects, relations, … ???

• Stores knowledge as parameters in a deep neural network.


Using Natural Language for Knowledge Representation

(Diagram: a pretrained model knows word relationships, grammar, and facts, stored as parameters in a network; it plays the role of the KB. Prompts go in; the model generates text. Is the text useful?)

• The user formulates a question about the real world as a natural-language prompt (a sequence of tokens).
• The LLM generates text using a model representing its knowledge base.
• The text (hopefully) is useful in the real world. The objective function is not clear; maybe it is implied in the prompt?
LLM as a Knowledge-Based Agent

Knowledge base: learned word relationships, grammar, and facts, i.e., domain-independent content (pretraining) + domain-specific content (fine-tuning).
Text generator: domain-independent algorithms.

Current text generators are:
• Pretrained decoder-only transformer models (e.g., GPT stands for Generative Pre-trained Transformer). The knowledge base is not updated during interactions.
• Tokens are created autoregressively: one token is generated at a time, based on all the previous tokens, using the transformer attention mechanism.
LLM as a Generic Knowledge-Based Agent

(Diagram: the agent function maps the prompt plus the already generated tokens to the next token.)

• A chatbot repeatedly calls the agent function until the agent function returns the 'end' token.
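A schematic of this autoregressive loop in Python. The ToyModel class and its next_token_distribution() method are made-up placeholders standing in for a real transformer:

import random

class ToyModel:
    """Toy stand-in: returns a fixed next-token distribution. A real
    LLM would run a transformer over the whole token sequence here."""
    def next_token_distribution(self, tokens):
        return {"hello": 0.5, "world": 0.3, "<end>": 0.2}

def generate(model, prompt_tokens, end_token="<end>", max_len=50):
    """Autoregressive decoding: sample one token at a time, each
    conditioned on the prompt plus all previously generated tokens."""
    tokens = list(prompt_tokens)
    while len(tokens) < max_len:
        probs = model.next_token_distribution(tokens)
        nxt = random.choices(list(probs), weights=probs.values())[0]
        if nxt == end_token:   # the chatbot loop stops at the end token
            break
        tokens.append(nxt)
    return tokens

print(generate(ToyModel(), ["Tell", "me", "a", "fact", ":"]))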
Outline

• Knowledge-Based Agents
• Logical Agents
• Large Language Models
• Probabilistic Reasoning
Probabilistic Reasoning


• Replaces true/false with a probability.
• This is the basis for
  • probabilistic reasoning under uncertainty
  • decision theory
  • machine learning

We will talk about these topics a lot more.


Logic
Details on Propositional and First-Order Logic
Logic to Represent Knowledge

Logic is a formal system for representing and manipulating facts (i.e., knowledge) so that true conclusions may be drawn.

Syntax: rules for constructing valid sentences. E.g., x + 2 ≥ y is a valid arithmetic sentence; x2y+ is not.

Semantics: the "meaning" of sentences, i.e., the relationship between logical sentences and the real world. Specifically, semantics defines the truth of sentences. E.g., x + 2 ≥ y is true in a world where x = 5 and y = 7.
Propositional Logic
Propositional Logic: Syntax in Backus-Naur Form

Sentence        → AtomicSentence | ComplexSentence
AtomicSentence  → True | False | Symbol        (Symbols: P, Q, R, …)
ComplexSentence → ( Sentence )
                | ¬ Sentence                   (Negation)
                | Sentence ∧ Sentence          (Conjunction)
                | Sentence ∨ Sentence          (Disjunction)
                | Sentence ⇒ Sentence          (Implication)
                | Sentence ⇔ Sentence          (Biconditional)
Validity and Satisfiability

A sentence is valid if it is true in all models/worlds.
E.g., True, A ∨ ¬A, A ⇒ A, (A ∧ (A ⇒ B)) ⇒ B.
Valid sentences are called tautologies and are useful to deduce new sentences.

A sentence is satisfiable if it is true in some model.
E.g., A ∨ B, C.
Satisfiable sentences are useful to find new facts that satisfy all current possible worlds.

A sentence is unsatisfiable if it is true in no models (false in all worlds).
E.g., A ∧ ¬A.
Possible Worlds, Models and Truth Tables

A model specifies a "possible world" with the true/false status of each proposition symbol in the knowledge base.
• E.g., P is true and Q is true.
• With two symbols, there are 2² = 4 possible worlds/models, and they can be enumerated exhaustively using a truth table.

A truth table specifies the truth value of a composite sentence for each possible assignment of truth values to its atoms. Each row is a model.

We have 3 possible worlds (models) for P ∨ Q.
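Enumerating the four models in Python (a minimal sketch):

from itertools import product

# All 2^2 = 4 truth assignments to the symbols P and Q.
# "P or Q" is true in 3 of the 4 rows, i.e., 3 possible worlds remain.
for P, Q in product([True, False], repeat=2):
    print(f"P={P!s:5} Q={Q!s:5}  P or Q: {P or Q}")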


Propositional Logic: Semantics

Rules for evaluating truth with respect to a model:

• ¬P is true iff P is false
• P ∧ Q is true iff P is true and Q is true
• P ∨ Q is true iff P is true or Q is true
• P ⇒ Q is true iff P is false or Q is true
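These rules translate directly into a recursive evaluator. A minimal sketch in Python, assuming sentences are encoded as nested tuples such as ("=>", "P", "Q") (this encoding is an assumption for illustration):

def pl_true(sentence, model):
    """Evaluate a propositional sentence against a model, where the
    model is a dict mapping symbol names to True/False."""
    if isinstance(sentence, str):          # atomic proposition symbol
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not pl_true(args[0], model)
    if op == "and":
        return pl_true(args[0], model) and pl_true(args[1], model)
    if op == "or":
        return pl_true(args[0], model) or pl_true(args[1], model)
    if op == "=>":                         # true iff P is false or Q is true
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == "<=>":
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(f"unknown operator: {op}")

print(pl_true(("=>", "P", "Q"), {"P": True, "Q": False}))  # False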
Logical Equivalence

Two sentences are logically equivalent iff (read: "if and only if") they are true in the same models.
Entailment

• Entailment means that a sentence follows from the premises contained in the knowledge base:
  KB ╞ α
• The knowledge base KB entails sentence α iff α is true in all models where KB is true.
• E.g., a KB containing x = 0 entails the sentence x * y = 0.
• Tests for entailment:
  • KB ╞ α iff (KB ⇒ α) is valid
  • KB ╞ α iff (KB ∧ ¬α) is unsatisfiable
Inference

• Logical inference: a procedure for generating sentences that follow from (are entailed by) a knowledge base KB.

• An inference procedure is sound if it derives only sentences α with KB ╞ α. I.e., it derives only true sentences.

• An inference procedure is complete if it can derive all α for which KB ╞ α.
Inference

• How can we check whether a sentence α is entailed by KB?

• How about we enumerate all possible models of the KB (truth assignments of all its symbols) and check that α is true in every model in which KB is true? (See the sketch below.)
  • This is sound: all produced answers are correct.
  • This is complete: it will produce all correct answers.
  • Problem: if the KB contains n symbols, the truth table will be of size 2^n.

• Better idea: use inference rules, i.e., sound procedures that generate new sentences or conclusions given the premises in the KB.
  • Look at the textbook for inference rules and resolution.
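A sketch of the truth-table enumeration in Python, reusing the pl_true evaluator and tuple encoding from above (a minimal model-checking sketch, not an optimized implementation):

from itertools import product

def tt_entails(kb, alpha, symbols):
    """KB |= alpha iff alpha is true in every model where KB is true.
    Enumerates all 2^n truth assignments to the n symbols."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if pl_true(kb, model) and not pl_true(alpha, model):
            return False   # a model of KB in which alpha is false
    return True

# Example: a KB containing P and P => Q entails Q.
print(tt_entails(("and", "P", ("=>", "P", "Q")), "Q", ["P", "Q"]))  # True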
Inference Rules

• Modus Ponens

  α ⇒ β,   α     (premises)
  -----------
  β              (conclusion)

  This means: if the KB contains the sentences α ⇒ β and α, then β is true.

• And-elimination

  α ∧ β
  ------
  α
Inference Rules

• And-introduction

  α,   β
  ------
  α ∧ β

• Or-introduction

  α
  ------
  α ∨ β
Inference Rules

• Double-negation elimination

  ¬¬α
  -----
  α

• Unit resolution

  α ∨ β,   ¬β
  -----------
  α
Resolution

  α ∨ β,   ¬β ∨ γ          ¬α ⇒ β,   β ⇒ γ
  ---------------    or    ---------------
  α ∨ γ                    ¬α ⇒ γ

• Example:
  α: "The weather is dry"
  β: "The weather is rainy"
  γ: "I carry an umbrella"
Resolution is Complete

  α ∨ β,   ¬β ∨ γ
  ---------------
  α ∨ γ

• To prove KB ╞ α, assume KB ∧ ¬α and derive a contradiction.
• Rewrite KB ∧ ¬α as a conjunction of clauses, i.e., disjunctions of literals (conjunctive normal form, CNF).
• Keep applying resolution to clauses that contain complementary literals, adding the resulting clauses to the list.
• If there are no new clauses to be added, then KB does not entail α.
• If two clauses resolve to form an empty clause, we have a contradiction and KB ╞ α.
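A compact sketch of this refutation loop in Python. It assumes the input is already in CNF, with each clause a set of string literals where "~P" is the negation of "P" (this encoding is an assumption for illustration):

def resolve(c1, c2):
    """Return all resolvents of two clauses."""
    resolvents = []
    for lit in c1:
        comp = lit[1:] if lit.startswith("~") else "~" + lit
        if comp in c2:   # complementary literals: resolve them away
            resolvents.append((c1 - {lit}) | (c2 - {comp}))
    return resolvents

def pl_resolution(kb_clauses, neg_alpha_clauses):
    """True iff KB |= alpha: derive the empty clause from KB plus ~alpha."""
    clauses = {frozenset(c) for c in kb_clauses + neg_alpha_clauses}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                for r in resolve(a, b):
                    if not r:             # empty clause: contradiction
                        return True
                    new.add(frozenset(r))
        if new <= clauses:                # no new clauses: not entailed
            return False
        clauses |= new

# KB = {P, P => Q} in CNF: {P}, {~P, Q}. Query Q, so add {~Q}.
print(pl_resolution([{"P"}, {"~P", "Q"}], [{"~Q"}]))  # True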
Complexity of Inference

• Propositional inference is co-NP-complete.
  • It is the complement of the SAT problem: α ╞ β if and only if the sentence α ∧ ¬β is unsatisfiable.
  • Every known inference algorithm has worst-case exponential run time complexity.

• Efficient inference is only possible for restricted cases.
  • E.g., Horn clauses are disjunctions of literals with at most one positive literal. (A forward-chaining sketch follows.)
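For Horn clauses (equivalently, rules with a conjunction of positive premises and a single positive conclusion), forward chaining decides entailment in time linear in the KB size. A minimal sketch in Python, following the standard count-based forward-chaining idea (the clause encoding is an assumption):

from collections import deque

def fc_entails(rules, facts, query):
    """Forward chaining over definite clauses.
    rules: list of (premise_set, conclusion); facts: known true symbols."""
    count = [len(premises) for premises, _ in rules]  # unproved premises
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:         # all premises proved: fire rule
                    agenda.append(conclusion)
    return False

rules = [({"P", "Q"}, "R"), ({"R"}, "S")]   # P ∧ Q => R,  R => S
print(fc_entails(rules, ["P", "Q"], "S"))   # True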
Example: Wumpus World

The initial KB needs to contain rules like these for each square, e.g., B1,1 ⇔ (P1,2 ∨ P2,1): a square is breezy iff a neighboring square contains a pit. We have to enumerate all possible scenarios in propositional logic! First-order logic can help.

The percepts at (1,1) are no breeze and no stench. Add the following facts to the KB: ¬B1,1, ¬S1,1.

Inference will tell us that the following facts are entailed: ¬P1,2, ¬P2,1, ¬W1,2, ¬W2,1.

This means that (1,2) and (2,1) are safe.
Summary

Logical agents apply inference to a knowledge base to derive new information and make decisions.

• Basic concepts of logic:
  • syntax: formal structure of sentences
  • semantics: truth of sentences in models
  • entailment: necessary truth of one sentence given another
  • inference: deriving sentences from other sentences
  • soundness: derivations produce only entailed sentences
  • completeness: derivations can produce all entailed sentences
• Resolution is complete for propositional logic.
• Forward- and backward-chaining algorithms are linear in time and complete for special clauses (definite clauses).
Limitations of Propositional Logic

Suppose you want to say "All humans are mortal."
• In propositional logic, you would need ~6.7 billion statements of the form:
  Michael_Is_Human ⇒ Michael_Is_Mortal,
  Sarah_Is_Human ⇒ Sarah_Is_Mortal, …

Suppose you want to say "Some people can run a marathon."
• You would need a disjunction of ~6.7 billion statements:
  Michael_Can_Run_A_Marathon ∨ … ∨ Sarah_Can_Run_A_Marathon
First-Order Logic

First-order logic adds objects and relations to the facts of propositional logic.

This addresses the issue of propositional logic, which needs to store a fact for each object/instance individually.
Syntax of FOL

• Objects: constant symbols
• Relations: predicates; a predicate is/returns true or false
• Functions: a function returns an object
Universal Quantification

• ∀x P(x)

• Example: "Everyone at SMU is smart":
  ∀x At(x,SMU) ⇒ Smart(x)
  Why not ∀x At(x,SMU) ∧ Smart(x)? (That would say everyone is at SMU and everyone is smart.)

• Roughly speaking, equivalent to the conjunction of all possible instantiations of the variable:
  [At(John, SMU) ⇒ Smart(John)] ∧
  [At(Richard, SMU) ⇒ Smart(Richard)] ∧ ...

• ∀x P(x) is true in a model m iff P(x) is true with x being each possible object in the model.
Existential Quantification

• ∃x P(x)

• Example: "Someone at SMU is smart":
  ∃x At(x,SMU) ∧ Smart(x)
  Why not ∃x At(x,SMU) ⇒ Smart(x)? (That would be true as soon as anyone is not at SMU.)

• Roughly speaking, equivalent to the disjunction of all possible instantiations:
  [At(John,SMU) ∧ Smart(John)] ∨
  [At(Richard,SMU) ∧ Smart(Richard)] ∨ …

• ∃x P(x) is true in a model m iff P(x) is true with x being some possible object in the model.
Properties of Quantifiers

• ∀x ∀y is the same as ∀y ∀x
• ∃x ∃y is the same as ∃y ∃x
• ∃x ∀y is not the same as ∀y ∃x
  ∃x ∀y Loves(x,y): "There is a person who loves everyone"
  ∀y ∃x Loves(x,y): "Everyone is loved by at least one person"

• Quantifier duality: each quantifier can be expressed using the other with the help of negation (checked in the sketch below):
  ∀x Likes(x,IceCream)  ≡  ¬∃x ¬Likes(x,IceCream)
  ∃x Likes(x,Broccoli)  ≡  ¬∀x ¬Likes(x,Broccoli)
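Over a finite domain, the duality is easy to check mechanically. A small Python sketch (the domain and the likes set are made up for illustration):

domain = ["john", "richard", "sarah"]
likes = {"john", "sarah"}    # who likes ice cream (toy data)

forall = all(x in likes for x in domain)                   # forall x Likes(x)
not_exists_not = not any(x not in likes for x in domain)   # not exists x not Likes(x)
print(forall == not_exists_not)    # True: the two forms always agree

exists = any(x in likes for x in domain)                   # exists x Likes(x)
not_forall_not = not all(x not in likes for x in domain)   # not forall x not Likes(x)
print(exists == not_forall_not)    # True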
Equality

• Term1 = Term2 is true under a given model if and only if Term1 and Term2 refer to the same object.

• E.g., definition of Sibling in terms of Parent:
  ∀x,y Sibling(x,y) ⇔
  [¬(x = y) ∧ ∃m,f ¬(m = f) ∧ Parent(m,x) ∧ Parent(f,x) ∧ Parent(m,y) ∧ Parent(f,y)]
Example: The Kinship Domain

• Brothers are siblings:
  ∀x,y Brother(x,y) ⇒ Sibling(x,y)
• "Sibling" is symmetric:
  ∀x,y Sibling(x,y) ⇔ Sibling(y,x)
• One's mother is one's female parent:
  ∀m,c (Mother(c) = m) ⇔ (Female(m) ∧ Parent(m,c))
Example: The Set Domain

• ∀s Set(s) ⇔ (s = {}) ∨ (∃x,s2 Set(s2) ∧ s = {x|s2})
• ¬∃x,s {x|s} = {}
• ∀x,s x ∈ s ⇔ s = {x|s}
• ∀x,s x ∈ s ⇔ [∃y,s2 (s = {y|s2} ∧ (x = y ∨ x ∈ s2))]
• ∀s1,s2 s1 ⊆ s2 ⇔ (∀x x ∈ s1 ⇒ x ∈ s2)
• ∀s1,s2 (s1 = s2) ⇔ (s1 ⊆ s2 ∧ s2 ⊆ s1)
• ∀x,s1,s2 x ∈ (s1 ∩ s2) ⇔ (x ∈ s1 ∧ x ∈ s2)
• ∀x,s1,s2 x ∈ (s1 ∪ s2) ⇔ (x ∈ s1 ∨ x ∈ s2)
Inference in FOL

Inference in FOL is complicated! Two main approaches:

1. Reduce to propositional logic, then use propositional inference.

2. Do inference directly on FOL (or a subset like definite clauses):
  • Unification: find a substitution for the variables that makes two sentences identical (see the sketch below).
  • Forward chaining for FOL
  • Backward chaining for FOL
  • Logic programming (e.g., Prolog)
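A minimal unification sketch in Python. Terms are strings (constants lowercase, variables capitalized, a convention assumed here) or tuples for compound terms like ("mother", "Y"); the occurs check is omitted for brevity:

def is_var(t):
    """A variable is a string starting with an uppercase letter."""
    return isinstance(t, str) and t[:1].isupper()

def unify(x, y, s=None):
    """Return a substitution dict making x and y identical, else None."""
    if s is None:
        s = {}
    if x == y:
        return s
    if is_var(x):
        return unify_var(x, y, s)
    if is_var(y):
        return unify_var(y, x, s)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):           # unify argument lists pairwise
            s = unify(xi, yi, s)
            if s is None:
                return None
        return s
    return None                            # mismatched constants/arities

def unify_var(var, t, s):
    if var in s:                           # variable already bound
        return unify(s[var], t, s)
    return {**s, var: t}

# Unify Knows(john, X) with Knows(Y, mother(Y)):
print(unify(("knows", "john", "X"), ("knows", "Y", ("mother", "Y"))))
# {'Y': 'john', 'X': ('mother', 'Y')} -- applying it gives X = mother(john)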
