1. What is a key characteristic of deep learning?
A. Uses decision trees for prediction
B. Involves manual feature extraction
C. Uses multi-layered neural networks
D. Requires less data than traditional machine learning
Answer: C. Uses multi-layered neural networks
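As a quick illustration of "multi-layered", here is a minimal NumPy sketch of a small fully connected network: several weight layers stacked with a non-linearity between them. The layer sizes and random weights are arbitrary choices made only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Three stacked fully connected layers: 4 -> 8 -> 8 -> 3.
# Sizes and random weights are arbitrary, chosen only for illustration.
sizes = [4, 8, 8, 3]
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """Pass an input vector through every layer in turn."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)                   # hidden layers apply a non-linearity
    return x @ weights[-1] + biases[-1]       # raw scores from the final layer

print(forward(rng.normal(size=4)))
```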
2. What is the purpose of an activation function in a neural network?
A. To normalize the output
B. To make the model linear
C. To introduce non-linearity
D. To initialize weights
Answer: C. To introduce non-linearity
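A small NumPy check of why the non-linearity matters: two linear layers with no activation collapse into a single linear map, while inserting ReLU between them breaks that equivalence. The matrices here are random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
W1, W2 = rng.normal(size=(5, 4)), rng.normal(size=(4, 3))
x = rng.normal(size=5)

# Two linear layers with no activation are equivalent to one linear layer.
two_linear = (x @ W1) @ W2
one_linear = x @ (W1 @ W2)
print(np.allclose(two_linear, one_linear))   # True

# With ReLU in between, the result is no longer a single linear map.
with_relu = np.maximum(0.0, x @ W1) @ W2
print(np.allclose(with_relu, one_linear))    # False (in general)
```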
3. Which of the following is a commonly used activation function?
A. Dropout
B. ReLU
C. Softmax Regression
D. Stochastic Gradient Descent
Answer: B. ReLU
4. What does ReLU stand for?
A. Rectified Linear Unit
B. Regularized Linear Unit
C. Recurrent Linear Update
D. Restricted Learning Unit
Answer: A. Rectified Linear Unit
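ReLU itself is a one-line function, shown here in NumPy: negative inputs are clipped to zero and positive inputs pass through unchanged.

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: zero out negatives, pass positives through."""
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))   # negatives become 0, positives are kept
```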
5. Which of the following deep learning models is best suited for sequential data?
A. Convolutional Neural Networks (CNN)
B. Recurrent Neural Networks (RNN)
C. Random Forest
D. Logistic Regression
Answer: B. Recurrent Neural Networks (RNN)
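What makes an RNN suit sequences is its hidden state, which carries information from one time step to the next. A minimal NumPy sketch of a vanilla RNN cell (sizes and random weights are arbitrary, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)

input_size, hidden_size = 3, 5
W_x = rng.normal(0, 0.1, (input_size, hidden_size))
W_h = rng.normal(0, 0.1, (hidden_size, hidden_size))
b = np.zeros(hidden_size)

def rnn_forward(sequence):
    h = np.zeros(hidden_size)
    for x_t in sequence:                        # process time steps in order
        h = np.tanh(x_t @ W_x + h @ W_h + b)    # new state depends on the old state
    return h                                    # a summary of the whole sequence

sequence = rng.normal(size=(7, input_size))     # 7 time steps of 3 features each
print(rnn_forward(sequence))
```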
6. What is the main purpose of dropout in deep learning?
A. To increase training speed
B. To reduce overfitting
C. To enhance activation functions
D. To increase data size
Answer: B. To reduce overfitting
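Dropout reduces overfitting by randomly zeroing units during training so the network cannot rely on any single neuron. A minimal sketch of inverted dropout in NumPy:

```python
import numpy as np

rng = np.random.default_rng(3)

def dropout(activations, p=0.5, training=True):
    """Inverted dropout: randomly zero a fraction p of units during training."""
    if not training:
        return activations                      # no dropout at inference time
    mask = rng.random(activations.shape) >= p   # keep each unit with probability 1 - p
    return activations * mask / (1.0 - p)       # rescale so the expected value is unchanged

a = np.ones(10)
print(dropout(a, p=0.5))   # roughly half the entries zeroed, the rest scaled up
```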
7. Which layer is used in CNNs to reduce spatial dimensions?
A. Fully Connected Layer
B. Dropout Layer
C. Pooling Layer
D. ReLU Layer
Answer: C. Pooling Layer
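A pooling layer shrinks the spatial dimensions by summarizing each local window. A minimal NumPy sketch of 2x2 max pooling with stride 2, which halves the height and width:

```python
import numpy as np

def max_pool_2x2(feature_map):
    """2x2 max pooling with stride 2: halves both spatial dimensions."""
    h, w = feature_map.shape
    cropped = feature_map[:h - h % 2, :w - w % 2]     # drop any odd edge row/column
    windows = cropped.reshape(h // 2, 2, w // 2, 2)
    return windows.max(axis=(1, 3))                   # max over each 2x2 window

fm = np.arange(16).reshape(4, 4)
print(fm)
print(max_pool_2x2(fm))   # 2x2 output: the max of each 2x2 block
```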
8. What does a convolution operation do in CNNs?
A. Flattens the input
B. Applies filters to extract features
C. Normalizes the output
D. Connects all neurons
Answer: B. Applies filters to extract features
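A convolution slides a small filter (kernel) over the input and takes a dot product at every position, producing a feature map. A minimal NumPy sketch of a "valid" 2D convolution (implemented as cross-correlation, the way CNN layers compute it), with a made-up vertical-edge filter:

```python
import numpy as np

def conv2d(image, kernel):
    """Slide the kernel over the image and take a dot product at each position."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)
edge_filter = np.array([[1.0, 0.0, -1.0]] * 3)   # a simple vertical-edge detector
print(conv2d(image, edge_filter))
```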
9. In deep learning, backpropagation is used to:
A. Initialize weights
B. Propagate input forward
C. Update weights based on error
D. Create new layers
Answer: C. Update weights based on error
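Backpropagation applies the chain rule to compute the gradient of the error with respect to each weight; the weights are then moved against that gradient. A minimal single-weight example of this gradient-then-update loop, fitting y = 3x:

```python
# Fit a single weight w to data y = 3x by gradient descent.
xs = [1.0, 2.0, 3.0]
ys = [3.0, 6.0, 9.0]
w, lr = 0.0, 0.05

for step in range(100):
    grad = 0.0
    for x, y in zip(xs, ys):
        pred = w * x
        error = pred - y
        grad += 2 * error * x        # dL/dw for squared error, via the chain rule
    w -= lr * grad / len(xs)         # update the weight based on the error signal

print(w)   # converges to approximately 3.0
```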
10. What is the vanishing gradient problem?
A. When gradients become too large
B. When gradients become too small during backpropagation
C. When training stops unexpectedly
D. When a model overfits
Answer: B. When gradients become too small during backpropagation
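The vanishing gradient problem can be seen numerically: the sigmoid's derivative is at most 0.25, so backpropagating through many sigmoid layers multiplies many small numbers together. A toy sketch (the depth of 20 and starting input are arbitrary):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = 0.5
grad = 1.0
for layer in range(20):
    s = sigmoid(x)
    grad *= s * (1.0 - s)            # chain rule: multiply by the local derivative
    x = s                            # feed the activation into the next layer

print(grad)   # on the order of 1e-13: almost no learning signal reaches early layers
```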
11. Which type of neural network is best for image recognition tasks?
A. RNN
B. LSTM
C. CNN
D. GAN
Answer: C. CNN
12. What is the function of the Softmax layer in a neural network?
A. To add regularization
B. To output probabilities for multi-class classification
C. To reduce dimensionality
D. To increase learning rate
Answer: B. To output probabilities for multi-class classification
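Softmax exponentiates the raw class scores and normalizes them so they sum to 1, giving a probability per class. A minimal NumPy sketch (the scores are made up for illustration):

```python
import numpy as np

def softmax(logits):
    """Turn raw class scores into probabilities that sum to 1."""
    shifted = logits - np.max(logits)     # subtract the max for numerical stability
    exps = np.exp(shifted)
    return exps / exps.sum()

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs, probs.sum())   # roughly [0.66 0.24 0.10], summing to 1.0
```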
13. What is transfer learning?
A. Training a new model from scratch
B. Using weights from a pretrained model on a new problem
C. Discarding old training data
D. Manually tuning hyperparameters
Answer: B. Using weights from a pretrained model on a new problem
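A common transfer learning recipe, sketched here with PyTorch and torchvision (assumed to be installed, torchvision 0.13+ for the weights= argument, plus a connection to download the pretrained weights on first use): load a model pretrained on ImageNet, freeze it, and replace only the final layer for the new task.

```python
import torch.nn as nn
import torchvision

# Load a ResNet-18 with ImageNet weights (downloaded on first use).
model = torchvision.models.resnet18(weights="IMAGENET1K_V1")

# Freeze the pretrained layers so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer for a hypothetical 10-class problem.
num_classes = 10
model.fc = nn.Linear(model.fc.in_features, num_classes)

# From here, train as usual; only model.fc's parameters receive updates.
```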
14. What is a Generative Adversarial Network (GAN)?
A. A network used for classification
B. A pair of networks competing against each other to generate data
C. A type of RNN
D. A network that performs dimensionality reduction
Answer: B. A pair of networks competing against each other to generate data
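The "pair of competing networks" idea can be sketched in a few lines of PyTorch (assumed installed): a generator turns noise into samples, a discriminator scores real versus fake, and the two are trained in alternation. The tiny 1D "real" distribution, network sizes, and step count below are arbitrary choices for illustration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(200):
    real = torch.randn(32, 1) * 0.5 + 2.0          # "real" data drawn from N(2, 0.5)
    fake = G(torch.randn(32, 8))                   # generator samples from noise

    # Discriminator step: label real samples 1, generated samples 0.
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: try to make the discriminator label fakes as real.
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# With enough training, generated samples should approach the real distribution.
print(G(torch.randn(5, 8)).detach().squeeze())
```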
15. Which optimizer adapts the learning rate for each parameter during training?
A. SGD
B. Adam
C. RMSprop
D. Both B and C
Answer: D. Both B and C
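Both Adam and RMSprop keep running statistics of past gradients and divide by them, so each parameter gets its own effective step size. A minimal NumPy sketch of a single Adam update with the usual default hyperparameters (the toy weights and gradients are made up for illustration):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad            # running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2       # running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)                  # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)   # element-wise, per-parameter scaling
    return w, m, v

w = np.array([1.0, -2.0, 3.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
grad = np.array([0.1, -20.0, 0.001])              # very different gradient scales

w, m, v = adam_step(w, grad, m, v, t=1)
print(w)   # each parameter moves by roughly the same step size despite the scale differences
```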