csc384f11 Lecture04 BacktrackingSearch
• Chapter 6
– 6.1: Formalism
– 6.2: Constraint Propagation
– 6.3: Backtracking Search for CSP
– 6.4: Local search, a very useful idea that we won't cover in class
Torsten Hahmann, CSC384 Introduction to Artificial Intelligence, University of Toronto, Fall 2011 1
Constraint Satisfaction Problems (CSP)
Example: Sudoku
Formalization of a CSP
• Each variable can be assigned any value from its
domain.
• Vi = d where d ∈ Dom[Vi]
• Each constraint C
– Has a set of variables it is over, called its scope;
• e.g., C(V1,V2,V4) ranges over V1, V2, V4
– Has a restriction on the values of the variables in the scope;
• e.g. C(V1,V2,V4) = ‹(V1,V2,V4) : V1 ≠ V2 ∧ V1 ≠ V4 ∧ V2 ≠ V4›
or (shorter) C(V1,V2,V4): V1 ≠ V2, V1 ≠ V4, V2 ≠ V4
– Is a Boolean function that maps assignments to the variables in
its scope to true/false.
• e.g. C(V1=a,V2=b,V4=c) = True
– this set of assignments satisfies the constraint.
• e.g. C(V1=b,V2=c,V4=c) = False
– this set of assignments falsifies the constraint.
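In code, such a constraint can be represented directly as a Boolean function over the variables in its scope. A minimal sketch in Python (the function name is illustrative):

```python
# The constraint C(V1,V2,V4) from above, as a Boolean function:
# it maps an assignment to the variables in its scope to True/False.
def c(v1, v2, v4):
    return v1 != v2 and v1 != v4 and v2 != v4

print(c('a', 'b', 'c'))  # True: this assignment satisfies the constraint
print(c('b', 'c', 'c'))  # False: V2 = V4 = 'c' falsifies it
```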
Formalization of a CSP
• Unary Constraints (over one variable)
– e.g. C(X):X=2; C(Y): Y>5
• Binary Constraints (over two variables)
– e.g. C(X,Y): X+Y<6
– Can be represented by a constraint graph
• Nodes are variables, arcs show constraints.
• e.g. 4-Queens:
Example: Sudoku
Solving CSPs
• CSPs can be solved by a specialized
version of depth-first search.
– Actually depth-limited search. Why?
• Key intuitions:
– We can build up to a solution by searching through the
space of partial assignments.
– Order in which we assign the variables does not matter –
eventually they all have to be assigned. We can decide
on a suitable value for one variable at a time!
This is the key idea of backtracking search.
– If during the process of building up a solution we falsify a
constraint, we can immediately reject all possible ways of
extending the current partial assignment.
CSP as a Search Problem
Solving CSPs – Backtracking Search
• Bad news: 3-SAT is a finite CSP and known to be
NP-complete, so we cannot expect to do better than
exponential time in the worst case
Backtracking Search: The Algorithm BT
• These ideas lead to the backtracking search algorithm
BT(Level)
If all variables assigned
PRINT Value of each Variable
RETURN or EXIT (RETURN for more solutions)
(EXIT for only one solution)
V := PickUnassignedVariable()
Variable[Level] := V
Assigned[V] := TRUE
for d := each member of Domain(V) (the domain values of V)
    Value[V] := d
    ConstraintsOK := TRUE
    for each constraint C such that V is a variable of C
        and all other variables of C are assigned:
        IF C is not satisfied by the set of current
           assignments: ConstraintsOK := FALSE; BREAK
    IF ConstraintsOK THEN BT(Level+1)
Assigned[V] := FALSE
return
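The pseudocode above translates almost line for line into Python. This is a sketch, not a reference implementation; the CSP representation (a dict of variables, domains, and constraints as (scope, function) pairs) is my own choice:

```python
def bt(csp, assignment=None):
    """Plain backtracking search; returns one satisfying assignment or None.
    csp = {'vars': [...], 'domains': {var: values},
           'constraints': [(scope_tuple, boolean_function), ...]}"""
    if assignment is None:
        assignment = {}
    if len(assignment) == len(csp['vars']):      # all variables assigned
        return dict(assignment)
    # PickUnassignedVariable: here simply the first unassigned one
    v = next(x for x in csp['vars'] if x not in assignment)
    for d in csp['domains'][v]:
        assignment[v] = d
        # Check each constraint over v whose other variables are all assigned
        ok = all(func(*(assignment[x] for x in scope))
                 for scope, func in csp['constraints']
                 if v in scope and all(x in assignment for x in scope))
        if ok:
            result = bt(csp, assignment)
            if result is not None:
                return result
        del assignment[v]                        # undo and try the next value
    return None

# Example: 4-Queens, one variable per row, value = column (as modelled below)
n = 4
queens = {
    'vars': list(range(n)),
    'domains': {i: list(range(1, n + 1)) for i in range(n)},
    'constraints': [((i, j),
                     lambda qi, qj, i=i, j=j:
                         qi != qj and abs(qi - qj) != j - i)
                    for i in range(n) for j in range(i + 1, n)],
}
solution = bt(queens)
```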
Backtracking Search
[Search-tree diagram: branches Vj=1 and Vj=2, with a subtree below each node]
Search stops descending into a subtree if the assignments on the path to the node violate a constraint.
Backtracking Search
•Heuristics are used to determine
– the order in which variables are assigned:
PickUnassignedVariable()
– the order of values tried for each variable.
Example: N-Queens
• Problem formulation:
– N variables (N queens)
– N² values for each variable, representing the
positions on the chessboard
Example: N-Queens
• Better Modeling:
– N variables Qi, one per row.
– Value of Qi is the column the Queen in row i
is placed; possible values {1, …, N}.
• Q1 = 1, Q2 = 7, Q3 = 5, Q4 = 8,
Q5 = 2, Q6 = 4, Q7 = 6, Q8 = 3
Example: N-Queens
• Constraints:
– Can’t put two Queens in same column
Qi ≠ Qj for all i ≠ j
– Diagonal constraints
|Qi − Qj| ≠ |i − j|
• i.e., the absolute difference in the values assigned
to Qi and Qj can't be equal to the distance
between rows i and j.
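Both constraints combine into one pairwise test per pair of rows; note the absolute value on both sides. A small sketch (the function name is mine):

```python
def queens_ok(q, i, j):
    """True iff the queens in rows i and j (columns q[i], q[j]) do not
    attack each other: different columns and not on a shared diagonal."""
    return q[i] != q[j] and abs(q[i] - q[j]) != abs(i - j)

# The 8-Queens assignment from the previous slide, row -> column
q = {1: 1, 2: 7, 3: 5, 4: 8, 5: 2, 6: 4, 7: 6, 8: 3}
all_pairs_ok = all(queens_ok(q, i, j) for i in q for j in q if i < j)
print(all_pairs_ok)  # True: the assignment is a solution
```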
Example: N-Queens
Solution!
Problems with Plain Backtracking
Sudoku: the (3,3) cell has no possible value.
Problems with Plain Backtracking
• In backtracking search we won't detect that the (3,3)
cell has no possible value until all other variables in one
of its constraints (its row, its column, or the first
sub-square) are assigned.
So we have the following situation:
Constraint Propagation
• Propagation has to be applied during the
search; potentially at every node of the search
tree.
• Propagation itself is an inference step which
needs some resources (in particular time)
– If propagation is slow, its overhead can outweigh the
savings in nodes searched, to the point where using
propagation actually slows the search down!
– There is always a tradeoff between searching fewer
nodes in the search, and having a higher
nodes/second processing rate.
Forward Checking Algorithm
FCCheck(C,x)
// C is a constraint with all its variables already
// assigned, except for variable x.
for d := each member of CurDom[x]
    IF making x = d together with the previous assignments
       to the variables in the scope of C falsifies C
    THEN remove d from CurDom[x]
IF CurDom[x] = {} THEN return DWO (Domain Wipe Out)
return ok
Forward Checking Algorithm
FC(Level) /*Forward Checking Algorithm */
If all variables are assigned
PRINT Value of each Variable
RETURN or EXIT (RETURN for more solutions) (EXIT for only one solution)
V := PickAnUnassignedVariable()
Variable[Level] := V
Assigned[V] := TRUE
for d := each member of CurDom(V)
Value[V] := d
DWOoccurred := False
for each constraint C over V that has exactly one unassigned
    variable in its scope (say X):
    if (FCCheck(C,X) == DWO)    /* X's domain becomes empty */
        DWOoccurred := True     /* no point in continuing */
        break
if (not DWOoccurred)            /* all constraints were ok */
FC(Level+1)
RestoreAllValuesPrunedByFCCheck()
return;
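FCCheck and FC above can be sketched in Python as follows. This is an illustrative sketch using the same dict-based CSP representation as before (variables, domains, and constraints as (scope, function) pairs), not reference code:

```python
def fc_check(scope, func, cur_dom, assignment, x):
    """FCCheck: prune values of x, the single unassigned variable in the
    constraint's scope; return 'DWO' if CurDom[x] is wiped out, else 'ok'."""
    for d in list(cur_dom[x]):
        args = [d if var == x else assignment[var] for var in scope]
        if not func(*args):
            cur_dom[x].remove(d)
    return 'DWO' if not cur_dom[x] else 'ok'


def fc(csp, cur_dom=None, assignment=None):
    """Forward-checking search; returns one satisfying assignment or None."""
    if cur_dom is None:
        cur_dom = {v: list(csp['domains'][v]) for v in csp['vars']}
    if assignment is None:
        assignment = {}
    if len(assignment) == len(csp['vars']):
        return dict(assignment)
    v = next(x for x in csp['vars'] if x not in assignment)
    for d in list(cur_dom[v]):
        assignment[v] = d
        pruned = {x: [] for x in csp['vars']}   # values removed at this node
        dwo = False
        for scope, func in csp['constraints']:
            unassigned = [x for x in scope if x not in assignment]
            if v in scope and len(unassigned) == 1:
                x = unassigned[0]
                before = list(cur_dom[x])
                status = fc_check(scope, func, cur_dom, assignment, x)
                pruned[x] += [u for u in before if u not in cur_dom[x]]
                if status == 'DWO':             # no point in continuing
                    dwo = True
                    break
        if not dwo:
            result = fc(csp, cur_dom, assignment)
            if result is not None:
                return result
        for x, vals in pruned.items():          # RestoreAllValuesPrunedByFCCheck
            cur_dom[x].extend(vals)
        del assignment[v]
    return None


# Example: 4-Queens, one variable per row, value = column
n = 4
queens4 = {
    'vars': list(range(n)),
    'domains': {i: list(range(1, n + 1)) for i in range(n)},
    'constraints': [((i, j),
                     lambda qi, qj, i=i, j=j:
                         qi != qj and abs(qi - qj) != j - i)
                    for i in range(n) for j in range(i + 1, n)],
}
solution = fc(queens4)
print(solution)  # {0: 2, 1: 4, 2: 1, 3: 3}
```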
4-Queens Problem
• Encoding with Q1, …, Q4 denoting a queen per column
– cannot put two queens in same row (instead of same column)
[Board diagram: empty 4×4 board (columns 1–4); constraint graph of Q1, Q2, Q3, Q4, each with current domain {1,2,3,4}]
4-Queens Problem
• Forward checking reduces the domains of the unassigned
variables that share a constraint with the just-assigned variable:
– Here all of Q2, Q3, Q4
4-Queens Problem
[Board diagram] DWO: a domain is wiped out, so the search backtracks.
4-Queens Problem
[Board diagram] Another DWO: again a domain is wiped out; backtrack further.
4-Queens Problem
• We have now found a solution: an assignment of a value from its
domain to each variable such that all constraints are satisfied
[Board diagram: the solution found by forward checking]
FC: Restoring Values
FC: Minimum Remaining Values Heuristics (MRV)
• FC also gives us for free a very powerful
heuristic to guide us which variables to try next:
– Always branch on a variable with the smallest number of
remaining values (smallest CurDom).
– If a variable has only one value left, that value is
forced, so we should propagate its consequences
immediately.
– This heuristic tends to produce skinny trees at the
top. This means that more variables can be
instantiated with fewer nodes searched, and thus
more constraint propagation/DWO failures occur
with less work.
– We can find an inconsistency such as the one in the Sudoku
example much faster.
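The MRV rule itself is one line over the current domains. A sketch, assuming CurDom is kept as a dict from variables to their remaining values:

```python
def pick_mrv_variable(cur_dom, assignment):
    """Minimum Remaining Values: branch on an unassigned variable
    with the smallest current domain."""
    unassigned = [v for v in cur_dom if v not in assignment]
    return min(unassigned, key=lambda v: len(cur_dom[v]))

cur_dom = {'A': [1, 2, 3], 'B': [2], 'C': [1, 3]}
print(pick_mrv_variable(cur_dom, {}))  # B: only one value left, so B is forced
```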
MRV Heuristic: Human Analogy
• What variables would you try first?
Domain of each variable:
{1, …, 9}
Example – Map Colouring
• Modeling
– Variables: WA, NT, Q, NSW, V, SA, T
– Domains: Di={red, green, blue}
– Constraints: adjacent regions must have
different colors.
• E.g. WA ≠ NT
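The model can be written down directly as data. A sketch in Python; the pair list encodes which Australian regions are adjacent (T, Tasmania, borders nothing):

```python
regions = ['WA', 'NT', 'Q', 'NSW', 'V', 'SA', 'T']
domains = {r: ['red', 'green', 'blue'] for r in regions}

# Adjacent regions of Australia; each pair yields one inequality constraint
adjacent = [('WA', 'NT'), ('WA', 'SA'), ('NT', 'SA'), ('NT', 'Q'),
            ('SA', 'Q'), ('SA', 'NSW'), ('SA', 'V'), ('Q', 'NSW'),
            ('NSW', 'V')]

def different(a, b):          # the constraint: adjacent regions differ
    return a != b

constraints = [((x, y), different) for x, y in adjacent]
```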
Example – Map Colouring
• Forward checking idea: keep track of remaining legal
values for unassigned variables.
• Terminate search when any variable has no legal
values.
Example – Map Colouring
• Assign {WA=red}
Example – Map Colouring
• Assign {Q=green}
• Effects on the other variables connected to Q by constraints
– NT can no longer be green
– NSW can no longer be green
– SA can no longer be green
• MRV heuristic would automatically select NT or SA next
Example – Map Colouring
• Assign {V=blue}
• Effects on the other variables connected to V by constraints
– NSW can no longer be blue
– SA's domain is now empty (DWO)
Empirically
Constraint Propagation: Arc Consistency
•Another form of propagation:
make each arc consistent
– C(X,Y) is consistent iff for every value of X there
is some value of Y that satisfies C.
– Idea: ensure that every binary constraint is
satisfiable (2-consistency)
• Binary constraints = arcs in the constraint graph
• Remember: all higher-order constraints can be expressed
as a set of binary constraints (possibly by introducing
auxiliary variables)
Constraint Propagation: Arc Consistency
• Can remove values from the domain of variables:
– e.g. C(X,Y): X>Y Dom(X)={1,5,11} Dom(Y)={3,8,15}
• For X=1 there is no value of Y s.t. 1>Y => remove 1 from domain X
• For Y=15 there is no value of X s.t. X>15, so remove 15 from domain Y
• We obtain more restricted domains Dom(X)={5,11} and Dom(Y)={3,8}
– We then have far fewer values to try in the search tree.
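This pruning is exactly one "revise" step over an arc, applied in both directions. A sketch (the function name is mine; a full solver would repeat such steps until nothing changes, as in the AC-3 algorithm):

```python
def revise(dom_x, dom_y, constraint):
    """Make arc (X, Y) consistent: keep only the values of X for which
    some value of Y satisfies the constraint."""
    return [x for x in dom_x if any(constraint(x, y) for y in dom_y)]

dom_x, dom_y = [1, 5, 11], [3, 8, 15]
dom_x = revise(dom_x, dom_y, lambda x, y: x > y)   # drops 1
dom_y = revise(dom_y, dom_x, lambda y, x: x > y)   # drops 15
print(dom_x, dom_y)  # [5, 11] [3, 8]
```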
Arc Consistency – Map Colouring Example
Arc Consistency – Example
– Constraints:
• C(X,Y): X > Y
• C(Y,Z): Y + Z = 12
• C(X,Z): X + Z = 16
Arc Consistency – Strong k-Consistency
• Strong k-consistency
– k-consistent, (k-1)-consistent, etc.
– Very expensive: any algorithm establishing k-consistency requires
exponential time and space in k
– In practical solvers: 2-consistency, sometimes 3-consistency
Many real-world applications of CSP
• Assignment problems
– who teaches what class
• Timetabling problems
– exam schedule
• Transportation scheduling
• Floor planning
• Factory scheduling
• Hardware configuration
– a set of compatible components
CSP Solvers
• Much work on various heuristics for variable and value selection
• Fourth CSP Solver Competition Results 2009,
Category: Only binary constraints
CSP Solvers
[Plots: CPU time (s) vs. number of solved instances for the competing solvers]