Which approach is the FASTEST way to improve the model’s accuracy?
Run a SageMaker incremental training job based on the best candidate from the current model’s tuning job. Monitor the same metric that was used as the objective metric in the previous tuning job and look for improvements.
Set the Area Under the ROC Curve (AUC) as the objective metric for a new SageMaker automatic hyperparameter tuning job. Use the same maximum training jobs parameter that was used in the previous tuning job.
Run a SageMaker warm start hyperparameter tuning job based on the current model’s tuning job. Use the same objective metric that was used in the previous tuning.
Set the F1 score as the objective metric for a new SageMaker automatic hyperparameter tuning job. Double the maximum training jobs parameter that was used in the previous tuning job.
Explanations:
Incremental training resumes from the best candidate’s model artifacts, but it still requires running and manually monitoring new training jobs, so it may not improve accuracy quickly; keeping the same metric by itself does not guarantee better performance.
Switching the objective metric to AUC starts the hyperparameter search from scratch, requiring additional training and exploration, so it is not the fastest way to improve accuracy.
Running a warm start hyperparameter tuning job reuses the results of the parent tuning job as prior knowledge, so the new job does not explore the search space from scratch; this speeds up tuning and makes a quick accuracy improvement the most likely (see the sketch after these explanations).
Doubling the maximum training jobs roughly doubles tuning time and cost without guaranteeing a faster accuracy gain, especially when switching to a new objective metric such as the F1 score.
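As a minimal sketch of the warm start approach with the SageMaker Python SDK: the parent tuning job name, role ARN, S3 paths, hyperparameter ranges, and objective metric below are assumptions for illustration; in practice you would keep the same objective metric and data that the parent job used.

```python
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.tuner import (
    ContinuousParameter,
    HyperparameterTuner,
    WarmStartConfig,
    WarmStartTypes,
)

session = sagemaker.Session()

# Hypothetical estimator mirroring the parent job's algorithm and infrastructure.
estimator = Estimator(
    image_uri=sagemaker.image_uris.retrieve("xgboost", session.boto_region_name, version="1.7-1"),
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # hypothetical role ARN
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/output",  # hypothetical bucket
    sagemaker_session=session,
)

# Reuse the parent tuning job's results as prior knowledge for the new search.
warm_start_config = WarmStartConfig(
    warm_start_type=WarmStartTypes.IDENTICAL_DATA_AND_ALGORITHM,
    parents={"xgboost-tuning-2024-01-01"},  # hypothetical parent tuning job name
)

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:auc",  # assumed; use the parent job's objective metric
    hyperparameter_ranges={
        "eta": ContinuousParameter(0.01, 0.5),
        "subsample": ContinuousParameter(0.5, 1.0),
    },
    max_jobs=20,
    max_parallel_jobs=2,
    warm_start_config=warm_start_config,
)

# Launch the warm start tuning job against the same training and validation data.
tuner.fit({
    "train": "s3://my-bucket/train",
    "validation": "s3://my-bucket/validation",
})
```

Because the new job inherits the parent job's evaluated hyperparameter configurations and results, the Bayesian search can concentrate on promising regions immediately instead of re-exploring the whole space, which is what makes this the fastest option.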