What changes should the Specialist consider to solve this issue?
(Choose three.)
Choose a higher number of layers
Choose a lower number of layers
Choose a smaller learning rate
Enable dropout
Include all the images from the test set in the training set
Enable early stopping
Explanations:
A neural network with too many layers can overfit the training data, which is likely the case here. Reducing the number of layers lowers the model’s capacity to memorize training examples, mitigating overfitting and improving generalization to the test set.
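As a minimal sketch of treating depth as a tunable hyperparameter, assuming a Keras classifier (the framework, layer widths, and 10-class output are illustrative assumptions, not part of the original question):

```python
import tensorflow as tf

def build_model(num_layers: int, units: int = 128) -> tf.keras.Model:
    # Depth is a hyperparameter: fewer layers means fewer parameters
    # and less capacity to memorize the training set.
    layers = [tf.keras.layers.Dense(units, activation="relu")
              for _ in range(num_layers)]
    layers.append(tf.keras.layers.Dense(10, activation="softmax"))
    return tf.keras.Sequential(layers)

shallow = build_model(num_layers=4)  # try a shallower variant first
```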
Dropout is a regularization technique that randomly deactivates units during training, preventing the model from becoming overly reliant on specific features and thereby improving generalization.
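A minimal sketch of how dropout might be enabled, again assuming Keras (layer sizes and the 0.5 dropout rate are illustrative assumptions):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),  # randomly zeroes 50% of units each training step
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])
```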
Early stopping monitors the model’s performance on a validation set and halts training once that performance stops improving, which prevents the model from continuing to fit noise in the training data.
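A sketch of enabling early stopping via a Keras callback (the monitored metric, patience value, and the commented-out fit() call are illustrative assumptions):

```python
import tensorflow as tf

early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",         # watch validation loss each epoch
    patience=5,                 # tolerate 5 epochs without improvement
    restore_best_weights=True,  # roll back to the best validation epoch
)
# model.fit(x_train, y_train, validation_split=0.2,
#           epochs=100, callbacks=[early_stop])
```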
Increasing the number of layers may worsen overfitting, especially since the model is already overfitting with 50 layers.
While a smaller learning rate might improve training stability, it does not directly address the overfitting issue. Reducing layers or applying regularization would be more effective.
Including test images in the training set violates the principle of keeping the test set separate for evaluation; the model would be evaluated on data it has already seen, producing biased, overly optimistic performance metrics.