Week 8 Answers
1) Which of the following statements about CNNs is false?
a) The output should be flattened before feeding it to a fully connected layer
b) There can be only one fully connected layer in a CNN
c) We can use as many convolutional layers as we need in a CNN
d) None of the above
Answer: B
2) The input image has been converted into a matrix of size 64 × 64, and a kernel of size 5 × 5 is applied with a stride of 1 and no padding. What will be the size of the convolved matrix?
a) 5 × 5
b) 59 × 59
c) 60 × 60
d) None of the above
Answer: C
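With no padding and stride 1, the output side is (64 - 5)/1 + 1 = 60, so the convolved matrix is 60 × 60 (option c). A minimal Python check of the standard formula (the helper name conv_out is just for illustration):

```python
def conv_out(n, k, stride=1, pad=0):
    # spatial output size of a square convolution:
    # (input + 2*padding - kernel) / stride + 1
    return (n + 2 * pad - k) // stride + 1

print(conv_out(64, 5, stride=1, pad=0))  # 60 -> output is 60 x 60
```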
3) A filter of size 3 × 3 is convolved with a matrix of size 4 × 4 (stride = 1). What will be the size of the output matrix if valid padding is applied?
a) 4 × 4
b) 3 × 3
c) 2 × 2
d) 1 × 1
Answer: C
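With valid padding (i.e., no padding), each side becomes (4 - 3)/1 + 1 = 2, so the output matrix is 2 × 2.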
4) Consider a Convolutional Neural Network whose architecture has the following three convolutional layers:
- Layer-1: Filter Size 3 × 3, Number of Filters 10, Stride 1, Padding 0
- Layer-2: Filter Size 5 × 5, Number of Filters 20, Stride 2, Padding 0
- Layer-3: Filter Size 5 × 5, Number of Filters 40, Stride 2, Padding 0
Layer 3 of the above network is followed by a fully connected layer. If we give the network a 3D image input of spatial dimension 39 × 39, which of the following is the input dimension of the fully connected layer?
a) 1960
b) 2200
c) 4563
d) 13690
Answer: A
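Tracing the spatial size through the three layers with (N + 2P - K)/S + 1 gives 37 × 37 × 10, then 17 × 17 × 20, then 7 × 7 × 40, and 7 × 7 × 40 = 1960 flattened inputs. A minimal Python check (the helper conv_out is just for illustration):

```python
def conv_out(n, k, stride, pad=0):
    # spatial output size of a square convolution
    return (n + 2 * pad - k) // stride + 1

n = 39
for k, num_filters, stride in [(3, 10, 1), (5, 20, 2), (5, 40, 2)]:
    n = conv_out(n, k, stride)
    print(f"{n} x {n} x {num_filters}")  # 37 x 37 x 10, 17 x 17 x 20, 7 x 7 x 40
print(n * n * num_filters)  # 1960 = flattened input to the fully connected layer
```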
5) Suppose you have 40 convolutional kernels of size 3 × 3 with no padding and stride 1 in the first layer of a convolutional neural network. You pass an input of dimension 1024 × 1024 × 3 through this layer. What are the dimensions of the data which the next layer will receive?
a) 1020 × 1020 × 40
b) 1022 × 1022 × 40
c) 1022 × 1022 × 3
d) None of the above
Answer: B
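Each spatial side shrinks to 1024 - 3 + 1 = 1022, and the 40 kernels produce 40 output channels, so the next layer receives 1022 × 1022 × 40.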
6) Consider a CNN model that classifies an image as either a rose, a marigold, a lily, or an orchid (assume a test image belongs to exactly one class). The last (fully connected) layer of the CNN outputs a vector of logits, L, that is passed through a ______ activation function that transforms the logits into probabilities, P. These probabilities are the model's predictions for each of the 4 classes. Fill in the blank with the appropriate option.
a) Leaky ReLU
b) Tanh
c) ReLU
d) Softmax
Answer: D
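Softmax is the right choice here: it maps the logits to non-negative values that sum to 1, giving a probability distribution over the four mutually exclusive classes.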
7) Suppose your input is a 300 × 300 color (RGB) image, and you use a convolutional layer with 100 filters that are each 5 × 5. How many parameters does this hidden layer have (without bias)?
a) 2501
b) 2600
c) 7500
d) 7600
Answer: C
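Each filter spans all 3 input channels, so it has 5 × 5 × 3 = 75 weights; with 100 filters the layer has 75 × 100 = 7500 parameters (excluding biases).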
8) Which of the following activation functions can lead to vanishing gradients?
a) ReLU
b) Sigmoid
c) Leaky ReLU
d) None of the above
Answer: B
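The sigmoid saturates for inputs of large magnitude, and its derivative never exceeds 0.25, so gradients shrink as they are multiplied back through many layers. A small NumPy check (the variable names here are just for illustration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-10, 10, 1001)
grad = sigmoid(x) * (1.0 - sigmoid(x))  # derivative of the sigmoid
print(grad.max())         # 0.25, reached at x = 0
print(grad[0], grad[-1])  # ~4.5e-05 at x = -10 and x = +10: saturated, near-zero gradient
```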
9) Statement 1: Residual networks can be a solution for the vanishing gradient problem.
Statement 2: Residual networks provide residual connections straight to earlier layers.
Statement 3: Residual networks can never be a solution for the vanishing gradient problem.
Which of the following options is correct?
a) Statement 2 is correct
b) Statement 3 is correct
c) Both Statement 1 and Statement 2 are correct
d) Both Statement 2 and Statement 3 are correct
Answer: C
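A residual (skip) connection adds the block's input directly to its output, so the gradient has an identity path back to earlier layers and is less likely to vanish. A minimal NumPy sketch, not any particular library's API (weights and shapes are made up for illustration):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    # F(x) is a small two-layer transformation; the output is relu(F(x) + x),
    # so gradients can also flow straight through the "+ x" identity path
    f = relu(x @ w1) @ w2
    return relu(f + x)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w1, w2 = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
print(residual_block(x, w1, w2).shape)  # (4, 8): same shape as the input
```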
10) Input to the softmax activation function is [0.5, 0.5, 1]. What will be the output?
a) [0.28, 0.28, 0.44]
b) [0.022, 0.956, 0.022]
c) [0.045, 0.910, 0.045]
d) [0.42, 0.42, 0.16]
Answer: A
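A quick NumPy check of the arithmetic:

```python
import numpy as np

logits = np.array([0.5, 0.5, 1.0])
probs = np.exp(logits) / np.exp(logits).sum()  # softmax
print(probs.round(3))  # [0.274 0.274 0.452], closest to option a
```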