Comparing Adam and SGD optimizers to train AlexNet for classifying GPR C-scans featuring ancient structures

2021 11th International Workshop on Advanced Ground Penetrating Radar (IWAGPR)

Abstract
In this study, the AlexNet architecture is implemented and trained to classify C-scans featuring ancient structural patterns. The performance of two popular optimizers is examined and compared, namely Stochastic Gradient Descent (SGD) with momentum and Adaptive Moment Estimation (Adam). The two optimizers were employed to train models on a GPR dataset collected from several archaeological sites. The results showed that although learning was more challenging to achieve with SGD, it eventually outperformed Adam once Batch Normalization and Dropout were applied and the batch size and learning rate were tuned. Furthermore, generalization was tested on entirely independent data, where SGD again performed better, scoring 95% classification accuracy against Adam's 90%. The obtained results highlight how important the choice of optimizer can be in the learning process, and that it is worth investigating when training CNN models on GPR data.
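The two update rules being compared can be sketched as follows. This is an illustrative toy on a quadratic objective, not the paper's AlexNet training code; the learning rates, momentum, and Adam hyperparameters shown are common defaults, not the values tuned in the study.

```python
import numpy as np

def sgd_momentum_step(w, grad, v, lr=0.01, beta=0.9):
    # SGD with momentum: a velocity term accumulates past gradients.
    v = beta * v + grad
    return w - lr * v, v

def adam_step(w, grad, m, s, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected estimates of the first and second moments
    # of the gradient give each parameter its own adaptive step size.
    m = b1 * m + (1 - b1) * grad
    s = b2 * s + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    s_hat = s / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(s_hat) + eps), m, s

# Toy objective f(w) = 0.5 * ||w||^2, whose gradient is simply w.
w_sgd = w_adam = np.array([1.0, -2.0])
v = m = s = np.zeros(2)
for t in range(1, 201):
    w_sgd, v = sgd_momentum_step(w_sgd, w_sgd, v)
    w_adam, m, s = adam_step(w_adam, w_adam, m, s, t)

print(np.linalg.norm(w_sgd), np.linalg.norm(w_adam))
```

On this toy problem both optimizers reduce the loss, but their trajectories differ markedly, which is the kind of behavioral gap the paper probes at the scale of AlexNet on GPR C-scans.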
Keywords
GPR,CNNs,AlexNet,Grad-CAM,ancient structures,archaeological prospections,SGD,Adam