Usually, the larger a model is, the better it performs, provided it does not overfit. Training a large model without overfitting requires a large dataset. For specific tasks such as natural hazard detection, however, there may not be enough images, and training a large model on such a small dataset will cause overfitting. A workaround is transfer learning: we take a large model, usually one already trained on a large dataset for a different task, and train it again on our small dataset. To adapt the large model to our small dataset with fewer classes, modifications to its "classification" block are usually needed.

Figure: Transfer Learning. Image source: [1]
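The sketch below illustrates the idea in PyTorch, assuming torchvision is available; the choice of ResNet-50 and the four-class hazard dataset are illustrative assumptions, not the exact setup used here. It freezes the pretrained feature extractor and replaces the classification block so only the new head is trained on the small dataset.

```python
import torch
import torch.nn as nn
from torchvision import models

num_classes = 4  # assumed number of hazard classes in the small dataset

# Load a model pretrained on ImageNet (a large dataset, different task).
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)

# Freeze the feature-extraction layers so the small dataset only
# updates the new classification head, reducing the risk of overfitting.
for param in model.parameters():
    param.requires_grad = False

# Replace the "classification" block: the original fc layer maps to
# 1000 ImageNet classes, so swap it for one matching our class count.
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Only the new head's parameters are passed to the optimizer.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```

Unfreezing some of the later layers and fine-tuning them at a lower learning rate is a common variation once the new head has converged.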