Unit 4 AI
Knowledge:
Knowledge is a theoretical or practical understanding of a subject or a domain; it is the sum of what is currently known: the body of truth, information, and principles acquired by mankind.
According to Sunasee and Sewery (2002), "Knowledge is human proficiency stored in a person's mind, gained through experience and interaction with the person's environment."
In general, knowledge is more than just data; it consists of facts, ideas, beliefs, heuristics, associations, rules, abstractions, relationships, and customs.
Research literature classifies knowledge as follows:
Classification-based Knowledge » Ability to classify information
Decision-oriented Knowledge » Choosing the best option
Descriptive knowledge » State of some world (heuristic)
Procedural knowledge » How to do something
Reasoning knowledge » What conclusion is valid in what situation?
Assimilative knowledge » What is its impact?
Knowledge Representation
Knowledge representation (KR) is the study of how knowledge about the world can be represented and
what kinds of reasoning can be done with that knowledge. Knowledge Representation is the method used
to encode knowledge in Intelligent Systems.
Some issues that arise in knowledge representation from an AI perspective are:
How do people represent knowledge?
What is the nature of knowledge and how do we represent it?
Should a representation scheme deal with a particular domain or should it be general
purpose?
How expressive is a representation scheme or formal language?
Should the scheme be declarative or procedural?
A knowledge representation system should possess the following properties.
Representational Adequacy
the ability to represent the required knowledge;
Inferential Adequacy
the ability to manipulate the knowledge represented to produce new knowledge corresponding to
that inferred from the original;
Inferential Efficiency
the ability to direct the inferential mechanisms into the most productive directions by storing
appropriate guides;
Acquisitional Efficiency
the ability to acquire new knowledge using automatic methods wherever possible rather than
reliance on human intervention.
An axiom is a sentence or proposition that is not proved or demonstrated and is considered self-evident, or an initial, necessary consensus for building or accepting a theory. As required, new sentences are added to the knowledge base, and further sentences are then derived from the existing axioms and theorems; this derivation is called inference.
Logic is a method of reasoning in which conclusions are drawn from premises using rules of inference. As a knowledge representation technique, logic involves:
Syntax: defines well-formed sentences or legal expression in the language
Semantics: defines the "meaning" of sentences
Inference rules: for manipulating sentences in the language
Basically, logic can be classified as:
Propositional logic (also called statement or sentential calculus)
Predicate logic [or First Order Predicate Logic (FOPL)]
A. Propositional Logic
A proposition is a declarative sentence to which only one of the truth values TRUE or FALSE can be assigned (but not both). Hence, propositional logic is also called Boolean logic. When a proposition is true, we say that its truth value is T; otherwise its truth value is F.
For example:
The square of 4 is 16 T
The square of 5 is 27 F
The sentences of propositional logic can be categorized as atomic sentences and complex sentences.
- Atomic Sentences(Simple)
The atomic sentences consist of a single proposition symbol. Each such symbol stands for a proposition that can be true or false and is written as a single letter, possibly with a subscript, for example: p, q, r, s, etc.
For example:
p = The Sun rises in the west. (a false proposition)
- Complex Sentences (molecular or combined or compound)
Two or more statements connected by logical connectives such as AND (∧), OR (∨), implication (→), etc. There are five connectives in common use:
Name                             Representation   Meaning
Negation                         ¬p               not p (true when p is false)
Conjunction                      p ∧ q            p and q (true when both statements are true, otherwise false)
Disjunction                      p ∨ q            p or q, or both (false when both statements are false, otherwise true)
Exclusive Or                     p ⊕ q            either p or q, but not both (false when both statements have the same truth value)
Implication                      p → q            if p then q (false when p is true and q is false)
Bi-conditional or Bi-implication p ↔ q            p if and only if q (true when both statements have the same truth value)
The order of precedence in propositional logic (from highest to lowest) is: negation, AND, OR, implication, and bi-implication.
Truth Table
p  q  ¬p  p∧q  p∨q  p⊕q  p→q  p↔q
T  T  F   T    T    F    T    T
T  F  F   F    T    T    F    F
F  T  T   F    T    T    T    F
F  F  T   F    F    F    T    T
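The truth table above can also be generated programmatically. A minimal Python sketch (illustrative only; the helper names implies and iff are assumed, not from the notes) evaluates each connective for every combination of truth values:

from itertools import product

def implies(p, q):
    # p -> q is false only when p is true and q is false
    return (not p) or q

def iff(p, q):
    # p <-> q is true when both operands have the same truth value
    return p == q

print("p  q  ~p p^q pvq p(+)q p->q p<->q")
for p, q in product([True, False], repeat=2):
    row = [p, q, not p, p and q, p or q, p != q, implies(p, q), iff(p, q)]
    print("  ".join("T" if v else "F" for v in row))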
Logical equivalence: Two propositions p and q are logically equivalent, written p ≡ q, if both p and q have identical truth values for every interpretation.
Eg:
¬(p ∧ q) ≡ (¬p ∨ ¬q)
(hint: draw the truth table for the proof)
The following logical equivalences apply to any statements; the
p's, q's and r's can stand for atomic statements or compound
statements.
i. Double Negative Law: ¬(¬p) ≡ p
ii. De Morgan's Laws
¬(p ∧ q) ≡ ¬p ∨ ¬q
¬(p ∨ q) ≡ ¬p ∧ ¬q
iii. Distributive Laws
p ∧ (q ∨ r) ≡ (p ∧ q) ∨ (p ∧ r)
p ∨ (q ∧ r) ≡ (p ∨ q) ∧ (p ∨ r)
Tautology: a proposition that is true for every interpretation.
E.g.: p ∨ ¬p
Contradiction: a proposition that is false for every interpretation.
E.g.: p ∧ ¬p
Contingent: a proposition that is true for some interpretations and false for others.
E.g.: p ∧ q
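These three classes can be checked mechanically by enumerating every interpretation. A minimal Python sketch (the function name classify is an assumption), also used here to verify De Morgan's law:

from itertools import product

def classify(formula, n_vars):
    # formula is a Python function of n_vars booleans returning a boolean
    values = [formula(*vals) for vals in product([True, False], repeat=n_vars)]
    if all(values):
        return "tautology"
    if not any(values):
        return "contradiction"
    return "contingent"

print(classify(lambda p: p or not p, 1))        # tautology:     p v ~p
print(classify(lambda p: p and not p, 1))       # contradiction: p ^ ~p
print(classify(lambda p, q: (not p) or q, 2))   # contingent:    p -> q

# De Morgan: ~(p ^ q) == (~p v ~q) holds iff the biconditional is a tautology
demorgan = lambda p, q: (not (p and q)) == ((not p) or (not q))
print(classify(demorgan, 2))                    # tautology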
Q. Write the implication, converse, inverse, contrapositive, and negation of the following statement.
"The program is well structured only if it is readable"
Soln:
Here,
p = "the program is readable"
q = "the program is well structured"
a) Implication: q → p ("q only if p")
b) Converse: p → q
c) Inverse: ¬q → ¬p
d) Contrapositive: ¬p → ¬q
e) Negation: q ∧ ¬p
Q. There are two restaurants next to each other. One has a sign board that reads: "Good food is not cheap". The other has a sign board that reads: "Cheap food is not good". Are both sign boards saying the same thing?
Here, let’s assume:
G= “Food is Good”
C= “Food is Cheap”
Now, Sentence 1, "Good food is not cheap", can be symbolically written as G → ¬C.
Similarly,
Sentence 2, "Cheap food is not good", can be symbolically written as C → ¬G.
Now, the truth table is:
G  C  ¬G  ¬C  G→¬C  C→¬G
T  T  F   F   F     F
T  F  F   T   T     T
F  T  T   F   T     T
F  F  T   T   T     T
Since G → ¬C and C → ¬G have identical truth values, they are logically equivalent. So both sign boards are saying the same thing.
Rules of Inference
The process of drawing a conclusion from given premises in an argument is called inference. To draw the conclusion from the given statements, we must be able to apply well-defined steps that help us reach the conclusion.
An inference algorithm is sound if everything it returns is a needle (so some needles may be missed), and complete if all needles are returned (so some hay may be returned too).
Let S be the set of all right answers.
A sound algorithm never returns an answer outside S (it gives no wrong answers), but it might miss a few right answers => not necessarily "complete".
A complete algorithm returns every right answer in S (the complete set of right answers), but it might also return a few wrong answers => not necessarily "sound".
The steps of reaching the conclusion are provided by the rules of inference.
a. Modus Ponens Rule
p → q
p
∴ q
Example:
- If Ram is hard working, then he is intelligent.
- Ram is hard working.
Ram is intelligent.
b. Modus Tollens Rule
p → q
¬q
∴ ¬p
Example:
- If it is sunny, then we will go swimming.
- We will not go swimming.
It is not sunny.
c. Hypothetical Syllogism Rule
p → q
q → r
∴ p → r
Example:
- If Subodh is a BE student, then he loves programming.
- If Subodh loves programming, then he is an expert in Java.
If Subodh is a BE student, then he is an expert in Java.
d. Disjunctive Syllogism Rule
p ∨ q
¬p
∴ q
Example:
- Today is Wednesday or Thursday.
- Today is not Wednesday.
Today is Thursday.
e. Addition rule
p
∴ p ∨ q
Example:
- Ram is a student of BE.
Ram is a student of BE or BCA.
f. Simplification rule
p ∧ q
∴ p
Example:
-Subodh and Shyam are the students of BE.
Subodh is the student of BE
or
Shyam is the student of BE.
g. Conjunction rule
p
q
∴ p ∧ q
Example:
- Shyam is the student of BE.
- Hari is the student of BE.
Shyam and Hari are the students of BE.
h. Resolution Rule
p ∨ q
¬q ∨ r
∴ p ∨ r
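Several of these rules, in particular resolution, are easy to mechanize once sentences are put into clause form. A minimal Python sketch (the representation of a clause as a set of string literals, with "~" marking negation, is an assumption, not from the notes):

def negate(literal):
    # "~p" -> "p", "p" -> "~p"
    return literal[1:] if literal.startswith("~") else "~" + literal

def resolve(c1, c2):
    # Return all resolvents of two clauses (sets of literals)
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

# (p v q) and (~q v r) resolve on q to give (p v r)
print(resolve(frozenset({"p", "q"}), frozenset({"~q", "r"})))   # [frozenset({'p', 'r'})]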
Q. Show that the hypotheses "If you send me an e-mail message then I will finish writing the program", "If you do not send me an e-mail message then I will go to sleep early", and "If I go to sleep early then I will wake up feeling refreshed" lead to the conclusion "If I do not finish writing the program then I will wake up feeling refreshed".
Solution:
Let
p = “You send me an e-mail message”
q = “I will finish writing the program”
r = “I will go to sleep early”
s = “I will wake up feeling refreshed”
Hypothesis:
a. p → q
b. ¬p → r
c. r → s
Conclusion: ¬q → s
Steps Operations Reasons
1 p→q Given hypothesis
2 ¬q → ¬p Using contrapositive on 1
3 ¬p → r Given hypothesis
4 ¬q → r Using hypothetical syllogism on 2 and 3
5 r → s Given hypothesis
6 ¬q → s Using hypothetical syllogism on 4 and 5
Hence the given hypotheses lead to the conclusion ¬q → s
Q. "Hari is playing in the garden", "If he is playing in the garden then he is not doing homework", and "If he is not doing homework, then he is not learning" lead to the conclusion "He is not learning".
Solution:
Let
p = “Hari is playing in garden”
q = “He is doing homework”
r = “He is learning”
Hypothesis:
a. p
b. p → ¬q
c. ¬q → ¬r
Conclusion: ¬r
Steps Operations Reasons
1 p Given hypothesis
2 p → ¬q Given hypothesis
3 ¬q Using modus ponens on 1 and 2
4 ¬q → ¬r Given hypothesis
5 ¬r Using modus ponens on 3 and 4
Hence the given hypotheses lead to the conclusion ¬r.
Q. Assume that
P (x) denotes “x is an accountant.”
Q (x) denotes “x owns a maruti.”
Now, represent the following statements symbolically.
a) All accountants own a maruti.
Meaning: For all x, if x is an accountant, then x owns a maruti.
∀x ( P(x) → Q(x) )
b) Some accountants own a maruti.
Meaning: For some x, x is an accountant and x owns a maruti.
∃x ( P(x) ∧ Q(x) )
c) All owners of a maruti are accountants.
Meaning: For all x, if x owns a maruti, then x is an accountant.
∀x ( Q(x) → P(x) )
d) Someone who owns a maruti is an accountant.
Meaning: For some x, x owns a maruti and x is an accountant.
∃x ( Q(x) ∧ P(x) )
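Over a small finite domain, such quantified formulas can be evaluated directly: the universal quantifier corresponds to Python's all() and the existential quantifier to any(). A minimal sketch with made-up people and truth values:

people = ["anita", "bikash", "chandra"]
is_accountant = {"anita": True,  "bikash": True,  "chandra": False}
owns_maruti   = {"anita": True,  "bikash": False, "chandra": True}

# a) For all x, P(x) -> Q(x): every accountant owns a maruti
print(all((not is_accountant[x]) or owns_maruti[x] for x in people))   # False (bikash does not)

# b) For some x, P(x) and Q(x): some accountant owns a maruti
print(any(is_accountant[x] and owns_maruti[x] for x in people))        # True (anita)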
Q. Convert into Well-Formed Formulas (WFFs). Translate each of the following in two ways using predicates, quantifiers, and logical connectives. First, let the domain consist of the students in your class; second, let it consist of all people.
a. Everyone in your class is friendly.
Let
F(x) = “x is friendly”
S(x) = "x is a student in the class"
Domain Well-Formed-Formulas (WFFs)
Students in the class    ∀x F(x)
All people               ∀x [S(x) → F(x)]
Q. Given Expression: All men are mortal. Einstein is a man. Prove that “Einstein is mortal” using
FOPL.
Solution: Let
M(x) = “x is a man”
N(x) = “x is mortal”
Hypothesis: ∀x [M(x) → N(x)], M (Einstein)
Conclusion: N (Einstein)
Steps Operations Reasons
1. ∀x [M(x) → N(x)] Given Hypothesis
2. M(Einstein)→N (Einstein) Using universal instantiation on 1
3. M (Einstein) Given Hypothesis
4. N (Einstein) Using modus ponens on 2 and 3
Hence the given hypotheses lead to the conclusion “Einstein is mortal”.
Q. Given Expression: “Lions are dangerous animals”, and “There are lions”. Prove that
“There are dangerous animals” using FOPL.
Solution: Let
D(x) = “x is a dangerous animal”
L(x) = “x is a lion”
Hypothesis: ∀x [L(x) → D(x)], ∃x L(x)
Conclusion: ∃x D(x)
Steps Operations Reasons
1. ∀x [L(x) → D(x)] Given Hypothesis
2. L(a) → D(a) Using universal instantiation on 1
3. ∃x L(x) Given Hypothesis
4. L(a) Using existential instantiation on 3
5. D(a) Using modus ponens on 2 and 4
6. ∃x D(x) Using existential generalization on 5
Hence the given hypotheses lead to the conclusion “There are dangerous animals”
Q. Given Expression: “A student in this class has not read the book”, and “Everyone in this class
passed the first exam" imply the conclusion "Someone who has passed the first exam has not read the book".
Solution:
Let
C(x) = “x is in this class”
R(x) = “x has read the book”
P(x) = “x has passed the first exam”
Hypothesis: ∃x [C(x) ∧ ¬R(x)], ∀x [C(x) → P(x)]
Conclusion: ∃x [P(x) ∧ ¬R(x)]
Steps Operations Reasons
1. ∃x [C(x) ∧ ¬R(x)] Given hypothesis
2. C(a) ∧ ¬R(a) Using existential instantiation on 1
3. ∀x [C(x) → P(x)] Given hypothesis
4. C(a) → P(a) Using universal instantiation on 3
5. C(a) Simplification on 2
6. P(a) Using modus ponens on 4 and 5
7. ¬R(a) Simplification on 2
8. P(a) ∧ ¬R(a) Conjunction on 6 and 7
9. ∃x [P(x) ∧ ¬R(x)] Using existential generalization on 8
Horn Clause
A Horn clause is a disjunction of literals of which at most one is positive. Definite clauses (exactly one positive literal) are Horn clauses, as are clauses with no positive literals, which are called goal clauses. Horn clauses are closed under resolution: if we resolve two Horn clauses, we get back a Horn clause.
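This definition can be checked directly by counting the positive literals in a clause. A minimal Python sketch (the encoding of clauses as sets of string literals, with "~" marking negation, is an assumption):

def is_horn(clause):
    # A clause is Horn if it contains at most one positive (unnegated) literal
    positives = [lit for lit in clause if not lit.startswith("~")]
    return len(positives) <= 1

print(is_horn({"~p", "~q", "r"}))   # True  (definite clause: exactly one positive literal)
print(is_horn({"~p", "~q"}))        # True  (goal clause: no positive literal)
print(is_horn({"p", "q"}))          # False (two positive literals)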
Resolution in FOPL
- Unification Algorithm
During resolution in propositional logic, it is easy to determine that two literals (e.g. p and ¬p) cannot both be true at the same time. In predicate logic this matching process is more complicated, since the arguments of the predicates must be considered.
For example, MAN(John) and ¬MAN(John) is a contradiction, while MAN(John) and ¬MAN(Smith) is not. Thus, in order to determine contradictions, we need a matching procedure, called the unification algorithm, that compares two literals and discovers whether there exists a set of substitutions that makes them identical.
To unify two literals, the initial predicate symbol of both must be the same; otherwise there is no way to unify them. For example, Q(x, y) and R(x, y) cannot unify, but P(x, x) and P(y, z) can be unified by substituting x for y and x for z.
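A minimal Python sketch of such a matching procedure is shown below (the term representation, with lowercase strings as variables and tuples as predicate applications, is an assumption; real implementations also perform an occurs check, which is omitted here):

def is_var(t):
    # lowercase strings are variables; capitalised strings are constants
    return isinstance(t, str) and t[0].islower()

def substitute(t, theta):
    # Apply a substitution (dict) to a term, following chains of bindings
    if is_var(t):
        return substitute(theta[t], theta) if t in theta else t
    if isinstance(t, tuple):
        return tuple(substitute(a, theta) for a in t)
    return t

def unify(x, y, theta=None):
    # Return a substitution making x and y identical, or None if impossible
    if theta is None:
        theta = {}
    x, y = substitute(x, theta), substitute(y, theta)
    if x == y:
        return theta
    if is_var(x):
        return {**theta, x: y}
    if is_var(y):
        return {**theta, y: x}
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y) and x[0] == y[0]:
        for a, b in zip(x[1:], y[1:]):
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None

print(unify(("P", "x", "x"), ("P", "y", "z")))   # {'x': 'y', 'y': 'z'}: both literals become P(z, z)
print(unify(("Q", "x", "y"), ("R", "x", "y")))   # None: different predicate symbols cannot unify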
Q. Given Expression: John likes all kinds of foods. Apples are food. Chicken is food. Prove that
John likes Peanuts using resolution.
Soln:
FOPL is
∀x FOOD(x) → LIKES(John, x), or ¬FOOD(x) ∨ LIKES(John, x)
FOOD (apples)
FOOD (chicken)
Now, we have to prove LIKES(John, peanuts). To prove the statement using resolution (proof by contradiction), we take its negation: ¬LIKES(John, peanuts).
Now, resolving the clauses (the resolution tree is omitted here), ¬LIKES(John, peanuts) leads to a contradiction (the empty clause); hence LIKES(John, peanuts) is proved.
Q. Given Expression: Bhaskar is a physician. All physicians know surgery. Prove that Bhaskar
knows surgery using principle of resolution.
Soln:
FOPL is
PHYSICIAN (Bhaskar)
∀x PHYSICIAN(x) → KNOWS(x, surgery), or
¬PHYSICIAN(x) ∨ KNOWS(x, surgery)
Now, we have to prove KNOWS(Bhaskar, surgery). To prove the statement using resolution (proof by contradiction), we take its negation: ¬KNOWS(Bhaskar, surgery).
Now, resolving ¬KNOWS(Bhaskar, surgery) with ¬PHYSICIAN(x) ∨ KNOWS(x, surgery) (with the substitution x = Bhaskar) gives ¬PHYSICIAN(Bhaskar), and resolving this with PHYSICIAN(Bhaskar) gives the empty clause. Since ¬KNOWS(Bhaskar, surgery) leads to a contradiction, KNOWS(Bhaskar, surgery) is proved.
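The same refutation can be traced mechanically once the rule is instantiated with x = Bhaskar. A minimal Python sketch (the encoding of clauses as sets of string literals is an assumption):

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    # Return all resolvents of two clauses
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

kb = [
    frozenset({"PHYSICIAN(Bhaskar)"}),
    frozenset({"~PHYSICIAN(Bhaskar)", "KNOWS(Bhaskar,surgery)"}),   # rule with x = Bhaskar
    frozenset({"~KNOWS(Bhaskar,surgery)"}),                         # negated goal
]
step1 = resolve(kb[2], kb[1])[0]   # {~PHYSICIAN(Bhaskar)}
step2 = resolve(step1, kb[0])[0]   # frozenset(): the empty clause, i.e. a contradiction
print(step1, step2)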
Q. Given Expression: All carnivorous animals have sharp teeth. Tiger is carnivorous. Fox is
carnivorous. Prove that tiger has sharp teeth.
Soln:
- FOPL is
∀x CARNIVOROUS(x) → SHARPTEETH(x), or
¬CARNIVOROUS(x) ∨ SHARPTEETH(x)
CARNIVOROUS (tiger)
CARNIVOROUS (fox)
Now, we have to prove SHARPTEETH(tiger); its negation is ¬SHARPTEETH(tiger). Resolving ¬SHARPTEETH(tiger) with ¬CARNIVOROUS(x) ∨ SHARPTEETH(x) (with x = tiger) gives ¬CARNIVOROUS(tiger), which resolves with CARNIVOROUS(tiger) to the empty clause. Since ¬SHARPTEETH(tiger) is not possible, SHARPTEETH(tiger) is proved.
Q. Given Expression: Gita only likes easy course. Science courses are hard. All the courses in
KMC are easy. KMC302 is a KMC course. Use the resolution to answer the question “Which
course would Gita like?”
Soln:
- FOPL is
∀x EASYCOURSE(x) → LIKES(Gita, x), or ¬EASYCOURSE(x) ∨ LIKES(Gita, x)
HARDCOURSE (science)
∀x KMC(x) → EASYCOURSE(x), or ¬KMC(x) ∨ EASYCOURSE(x)
KMC (kmc302)
To answer the question, negate the goal as ¬LIKES(Gita, x). Resolving ¬LIKES(Gita, x) with ¬EASYCOURSE(x) ∨ LIKES(Gita, x) gives ¬EASYCOURSE(x); resolving this with ¬KMC(x) ∨ EASYCOURSE(x) gives ¬KMC(x); resolving with KMC(kmc302) (substituting x = kmc302) gives the empty clause. Hence Gita would like the course KMC302.
Bayes Rule
Bayes rule can be useful for answering probabilistic queries conditioned on one piece of evidence. To compute just one conditional probability, it requires two terms:
a) a conditional probability, and
b) two unconditional probabilities.
Let A and B be two dependent events. The probability of event A when event B has already happened is called the conditional probability. It is denoted by P(A|B) and is given by:
P(A|B) = P(AB)/P(B), where P(B) ≠ 0, so P(AB) = P(A|B)·P(B) ---- (i)
Similarly,
P(B|A) = P(AB)/P(A), where P(A) ≠ 0, so P(AB) = P(B|A)·P(A) ---- (ii)
From equation (i) and (ii), we have
P(B|A)·P(A) = P(A|B)·P(B)
Bayes rule is useful for those cases where P (A|B) can be estimated but P (B|A) is hard to find
experimentally.
In a task such as medical diagnosis, we often have conditional probabilities for causal relationships and want to derive a diagnosis. A doctor knows the probability of a symptom conditioned on a disease, P(S|D), the patient knows his own feelings or symptoms, giving P(S), and the doctor also knows the prior probability of the disease, P(D). Then the probability of the disease conditioned on the symptoms is:
P(D|S) = [P(S|D)·P(D)] / P(S)
For example:
Let,
i) S = Symptoms on patients such as stiff neck whose probability P(S) is =1/20
ii) D = Disease known by doctor whose probability P (D) is = 1/50000
iii) the given P (S|D) = 0.5 or 50%
Now, the probability of the disease conditioned on the symptoms is:
P(D|S)
= [P(S|D)·P(D)] / P(S)
= [0.5 × (1/50000)] / (1/20)
= 0.0002
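The arithmetic of this example as a minimal Python sketch:

p_s = 1 / 20          # P(S): probability of the symptom (stiff neck)
p_d = 1 / 50000       # P(D): prior probability of the disease
p_s_given_d = 0.5     # P(S|D): probability of the symptom given the disease

# Bayes rule: P(D|S) = P(S|D) * P(D) / P(S)
p_d_given_s = p_s_given_d * p_d / p_s
print(p_d_given_s)    # 0.0002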
Assignment
Why is probabilistic reasoning important in AI? Explain with an example.
Causal Networks
A causal network is an acyclic (not cyclic) directed graph arising from the evolution of a substitution system. A substitution system is a map that uses a set of rules to transform the elements of a sequence into a new sequence, "translating" from the original sequence to its transformation.
For example, the substitution system 1→0, 0→11 would take 10 → 011 → 1100 → 001111 → 11110000 → …
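A minimal Python sketch of this substitution system (the rules and the starting string are taken from the example above):

def step(s, rules={"1": "0", "0": "11"}):
    # Apply the substitution rules to every symbol of the sequence
    return "".join(rules[ch] for ch in s)

s = "10"
for _ in range(5):
    print(s, end=" -> ")
    s = step(s)
print(s)   # 10 -> 011 -> 1100 -> 001111 -> 11110000 -> 000011111111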
A causal network is a Bayesian network with an explicit requirement that the relationships be causal.
Reasoning in Belief Networks
A Bayesian network, Bayes network, belief network or probabilistic directed acyclic graphical model is a
probabilistic graphical model (a type of statistical model) that represents a set of random variables and
their conditional dependencies via a directed acyclic graph (DAG).
For example, a Bayesian network could represent the probabilistic relationships between diseases and
symptoms. From given symptoms, the network can be used to compute the probabilities of the presence
of various diseases.
Formally, Bayesian networks are directed acyclic graphs whose nodes represent random variables in the
Bayesian sense: they may be observable quantities, latent variables, unknown parameters or hypotheses.
Edges represent conditional dependencies; nodes which are not connected represent variables which are
conditionally independent of each other. Each node is associated with a probability function that takes as
input a particular set of values for the node's parent variables and gives the probability of the variable
represented by the node. Bayesian networks are used for modeling knowledge in computational biology,
medicine, document classification, information retrieval, semantic search, image processing, data fusion,
decision support systems, engineering, and gaming.
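As an illustration, a minimal Python sketch of a two-node belief network Disease → Symptom, with made-up conditional probability values (chosen to mirror the earlier Bayes rule example), queried by enumerating the joint distribution:

p_disease = {True: 0.00002, False: 0.99998}    # P(Disease): prior at the parent node
p_symptom_given = {True: 0.5, False: 0.05}     # P(Symptom=true | Disease): child node's table

def joint(d, s):
    # Joint probability P(Disease=d, Symptom=s) = P(d) * P(s | d)
    p_s = p_symptom_given[d] if s else 1 - p_symptom_given[d]
    return p_disease[d] * p_s

# P(Disease=true | Symptom=true) = joint(T, T) / sum over d of joint(d, T)
numerator = joint(True, True)
evidence = joint(True, True) + joint(False, True)
print(numerator / evidence)   # about 0.0002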