Questions for the Exam
AI in General
1. What did John McCarthy NOT do?
2. Which of these statements is true about the Turing test and the Chinese
room argument?
3. Which of these tasks was NOT solved much better by Deep Learning than
previous algorithms?
4. What is NOT crucial for deep learning algorithms?
5. Which of these advances in AI, used extensively in software technology today, was NOT invented by John McCarthy’s lab?
6. Finish the sentence: Nobody supposes that the computational model of
rainstorms in London
7. Which theory says that our minds are in fact computer programs?
8. Who was NOT present at the Dartmouth Summer Research Project on
Artificial Intelligence?
9. What was the name of the world’s first chatterbot?
10. What was NOT one of the problems with AI identified in the Lighthill
report?
11. What is Moravec’s paradox?
12. Who did NOT receive the Turing Award despite being a significant contributor to deep learning?
Search
1. Which of these is NOT a search algorithm?
2. How can we NOT reduce the complexity of a state space?
3. What does the complexity of a representation graph NOT depend on?
4. Which of these is NOT true of a state space graph?
5. In which of these problems is the problem space NOT the same as paths
of the representation graph starting from the start node?
6. Which of these is NOT true of a delta-graph?
7. Which of these algorithms uses a tentative control strategy?
8. Which of these algorithms uses an irrevocable control strategy?
9. Which of these is a general control strategy?
10. Can we think of the hill climbing method as a special case of tabu search?
11. In how many places does simulated annealing use randomness?
12. Which of these is a drawback of the tabu search?
13. Which of these is FALSE for local search algorithms?
14. Which of these is NOT a drawback of the hill climbing algorithm?
15. Which of these algorithms was NOT invented to avoid hill climbing getting
stuck in a dead end?
16. What does the global workspace of backtracking search contain?
17. What are the search rules of backtracking search?
18. What is the control strategy of backtracking search?
19. Which of these is NOT true about the first version of the backtracking
search (BT1)?
20. Which of these statements is NOT true about the second version of the
backtracking search (BT2)?
21. Which of these statements is NOT true about the second version of backtracking search (BT2)?
22. Which of these is an advantage of backtracking search?
23. What does the global workspace of graph search contain?
24. What is the search rule of graph search?
25. What is the control strategy of graph search?
26. What kind of nodes are the open nodes?
27. What do we call the subgraph we store in the global workspace of graph search?
28. What kind of nodes are the closed nodes?
29. What does the parent pointer function (pi) point to?
30. When is an evaluation function decreasing?
31. When is a node of a search graph correct?
32. Which of these statements is NOT true about the general graph search
algorithm?
33. Which of these statements is true about the general graph search?
34. Can we use an order heuristic as a secondary control strategy in an uninformed graph search?
35. Which of these is depth-first search (f is the evaluation function, g is the cost function, c is the cost of an edge)?
36. Which of these is breadth-first search (f is the evaluation function, g is the cost function, c is the cost of an edge)?
37. Which of these is uniform cost search (f is the evaluation function, g is the
cost function, c is the cost of an edge)?
38. What does admissibility mean for a graph search?
39. Which statement is NOT true about the constant 0 function?
40. Which of these is the look-forward graph search (f is the evaluation function,
g is the cost function, h is the heuristic, h-star is the optimal cost, c is the
cost of an edge)?
41. Which of these is the A algorithm (f is the evaluation function, g is the
cost function, h is the heuristic, h-star is the optimal cost, c is the cost of
an edge)?
42. Which of these is the A-star algorithm (f is the evaluation function, g is
the cost function, h is the heuristic, h-star is the optimal cost, c is the cost
of an edge)?
43. Which of these is the A-c (consistent) algorithm (f is the evaluation function,
g is the cost function, h is the heuristic, h-star is the optimal cost, c is the
cost of an edge)?
44. Which of these is a property of the A algorithm?
45. Which of these is NOT true about the A-c (consistent) algorithm?
46. When do we say that a heuristic function is monotone?
47. Which of these statements is NOT true about breadth-first search?
48. Which of these is true about uniform cost search?
49. Which of these is NOT true about the two-player games we examined in the course?
50. What does the state of a two-player game represent?
51. What is the winning strategy in a two-player game?
52. When do we cut in the alpha-beta algorithm?
53. What is the stationary test for minimax search?
54. Which of these statements is NOT true about the game tree?
55. Which of these is a step in the minimax algorithm?
56. What is the game tree?
57. What is the general control strategy of evolutionary algorithms?
58. What does the evolutionary algorithm store in its global workspace?
59. Which of these is NOT an evolutionary operator?
60. How do we encode an individual?
61. How many steps does the evolutionary cycle consist of?
62. Where can we incorporate randomness into the evolutionary algorithm?
63. Where do we use selection in the evolutionary algorithm?
64. What is a good selection algorithm in evolutionary algorithms?
65. What is the connection between crossover and recombination?
66. When does the evolutionary algorithm terminate?
67. Which of these is NOT a strategy parameter of evolutionary algorithms?
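
For the evaluation-function questions above (35-43), here is a minimal Python sketch; it is my own illustration, not the course's reference implementation, and the function names and signatures are assumptions. It shows how one generic graph search specializes into depth-first, breadth-first, uniform-cost, A and A* search purely through the choice of the evaluation function f, with g the accumulated path cost, h the heuristic and c an edge cost as in the questions. For simplicity it never reopens closed nodes.

import heapq
import itertools

def best_first_graph_search(start, goal_test, successors, f, h=lambda n: 0):
    """Generic graph search: always expand the open node with minimal f.

    successors(node) yields (child, edge_cost) pairs.
    f(g, depth, h_value) is the evaluation function ordering the open nodes.
    """
    counter = itertools.count()  # tie-breaker so the heap never compares nodes
    open_heap = [(f(0, 0, h(start)), next(counter), 0, 0, start, [start])]
    closed = set()
    while open_heap:
        _, _, g, depth, node, path = heapq.heappop(open_heap)
        if goal_test(node):
            return path, g
        if node in closed:
            continue
        closed.add(node)
        for child, c in successors(node):
            if child not in closed:
                heapq.heappush(open_heap, (
                    f(g + c, depth + 1, h(child)),
                    next(counter), g + c, depth + 1, child, path + [child],
                ))
    return None, float("inf")

# The variants differ only in the evaluation function:
#   depth-first search:    f(g, d, h) = -d       (deepest open node first)
#   breadth-first search:  f(g, d, h) =  d       (shallowest open node first)
#   uniform-cost search:   f(g, d, h) =  g       (cheapest known path first)
#   algorithm A:           f(g, d, h) =  g + h   (h is any nonnegative heuristic)
#   algorithm A*:          f(g, d, h) =  g + h   with h admissible, i.e. h <= h*

# Example: uniform-cost search on a tiny labelled graph (assumed data).
graph = {"s": [("a", 1), ("b", 4)], "a": [("b", 1), ("g", 5)], "b": [("g", 1)], "g": []}
path, cost = best_first_graph_search(
    "s", lambda n: n == "g", lambda n: graph[n], f=lambda g, d, h: g)
print(path, cost)   # ['s', 'a', 'b', 'g'] 3

A full algorithm A would also reopen a closed node when a cheaper path to it is found; with a monotone (consistent) heuristic, as in the A-c variant of questions 43 and 46, that case cannot occur, so the simplified sketch above suffices there.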
Machine Learning
1. What does it mean for learning to be supervised?
2. What does it mean for learning to be unsupervised?
3. What is an epoch?
4. What is a minibatch?
5. Why do we use separate training and test sets?
6. Why do we use a validation set in addition to the training and test sets?
7. What is a classification problem?
8. What are the hyperparameters of a learning algorithm?
9. When do we use the sigmoid activation function?
10. When do we use the softmax activation function?
11. What is the definition of the ReLU activation function?
12. When do we use the ReLU activation function?
13. When do we use the binary cross-entropy loss function?
14. When do we use the categorical cross-entropy loss function?
15. What would be the activation function and loss for a binary classification
problem?
16. What would be the activation function and loss for a multiclass classification
problem?
17. Which of these are stopwords?
18. Which of these words were stemmed?
19. What does a language model do?
20. What is the bag of words model?
21. What is the difference between bag of words and TF-IDF?
22. What kind of hyperplane does a Support Vector Machine (SVM) learn?
23. What do we use the confusion matrix for?
24. What is grid search? Why do we use it?
25. When would you use random search instead of grid search?
26. What would be the one-hot encoding of [1, 3, 0]?
27. What does a word embedding do?
28. What is an example of clustering?
29. What is the difference between hard and soft clustering?
30. What is NOT true of the k-means problem?
31. What are the two steps of the k-means algorithm?
32. What is NOT an issue with the k-means algorithm?
33. What does Latent Semantic Analysis do?
34. What is NOT a reason to use dimensionality reduction?
35. What is a principal component in Principal Components Analysis (PCA)?
36. What is the relationship between Principal Component Analysis (PCA)
and Singular Value Decomposition (SVD)?
37. What does an autoencoder do?
38. Why doesn’t the autoencoder just do an identity transformation?
39. How many matrices does Latent Semantic Analysis produce from its input
matrix?
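
A small NumPy sketch, my own illustration rather than course code, of some of the definitional questions above: the ReLU definition (question 11), the sigmoid and softmax output activations with their usual loss pairings (questions 9-10 and 13-16), and the one-hot encoding asked about in question 26, assuming four classes so that the labels 0-3 fit.

import numpy as np

def relu(x):
    # ReLU(x) = max(0, x), applied elementwise
    return np.maximum(0, x)

def sigmoid(x):
    # squashes a logit into (0, 1); the usual output activation for binary
    # classification, paired with binary cross-entropy
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # turns a logit vector into a probability distribution; the usual output
    # activation for multiclass classification, paired with categorical
    # cross-entropy
    e = np.exp(x - np.max(x))   # subtract the max for numerical stability
    return e / e.sum()

def one_hot(labels, num_classes):
    # one row per label, with a single 1 in the column of the class index
    return np.eye(num_classes, dtype=int)[labels]

print(relu(np.array([-2.0, 0.0, 3.0])))   # [0. 0. 3.]
print(one_hot(np.array([1, 3, 0]), 4))
# [[0 1 0 0]
#  [0 0 0 1]
#  [1 0 0 0]]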