Explainable Predictions for Brain Tumor Diagnosis Using InceptionV3 CNN Architecture

Punam Bedi, Ningyao Ningshen, S. Jansi Rani, Pushkar Gole

Lecture Notes in Networks and Systems (2023)

Abstract
A brain tumor is one of the deadliest diseases diagnosed in human beings. Doctors can identify a tumor with Magnetic Resonance Imaging (MRI), a technique that uses magnetic fields and computer-generated waves to create images of the brain. With advances in the domain of Computer Vision, various researchers have proposed state-of-the-art frameworks that aid doctors in diagnosing brain tumors quickly. Most of these research works exploited the effectiveness of Convolutional Neural Networks (CNNs) on image data. However, the major drawback of these studies is that they do not provide human-interpretable explanations for the models' predictions. In this research work, the InceptionV3 CNN architecture is used to detect brain tumors from MRI images. This paper also provides human-interpretable predictions for brain tumor diagnosis through the Local Interpretable Model-agnostic Explanations (LIME) framework, to enhance the trust of doctors in the predictions of the InceptionV3 CNN architecture. The MRI images of brain tumors have been gathered from Kaggle, a publicly available dataset catalog website. The InceptionV3 architecture is chosen because it is found to be the best among the various CNN architectures evaluated, attaining an accuracy of 99.81%.
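To illustrate the mechanism LIME relies on, the sketch below is a minimal, self-contained stand-in, not code from the paper: a stub scoring function replaces InceptionV3's tumor-probability output, and a fixed 2x2 patch grid replaces the superpixel segmentation LIME normally uses. The core steps are the same: perturb interpretable image regions, query the black-box model on each perturbation, weight samples by proximity to the original image, and fit a weighted linear surrogate whose coefficients are per-region importances.

```python
import numpy as np

rng = np.random.default_rng(0)

H = W = 8          # toy stand-in for an MRI slice
P = 4              # 2x2 grid of patches (interpretable components)
image = rng.random((H, W))

def classify(img):
    # Stub black-box model: responds only to the top-left patch,
    # standing in for a CNN's tumor probability (assumption, not InceptionV3).
    return img[:4, :4].mean()

def mask_patches(img, z):
    # Zero out patches where z[i] == 0 (patch "absent").
    out = img.copy()
    corners = [(0, 0), (0, 4), (4, 0), (4, 4)]
    for i, (r, c) in enumerate(corners):
        if z[i] == 0:
            out[r:r + 4, c:c + 4] = 0.0
    return out

# 1. Sample binary masks over patches and query the model on each.
n_samples = 200
Z = rng.integers(0, 2, size=(n_samples, P))
y = np.array([classify(mask_patches(image, z)) for z in Z])

# 2. Weight perturbed samples by proximity to the original image
#    (more patches kept -> higher weight), as LIME does.
weights = np.exp(-(P - Z.sum(axis=1)) / P)

# 3. Fit a weighted linear surrogate; its coefficients are the
#    per-patch importances for this single prediction.
Zb = np.hstack([Z, np.ones((n_samples, 1))])   # add bias column
Wsqrt = np.sqrt(weights)[:, None]
coef, *_ = np.linalg.lstsq(Wsqrt * Zb, Wsqrt[:, 0] * y, rcond=None)
importance = coef[:P]
print(importance.round(3))
```

Because the stub model depends only on the first patch, the surrogate assigns essentially all importance to patch 0; on a real CNN the highlighted superpixels play the same role for the clinician reading the explanation.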
Keywords
InceptionV3 CNN architecture, brain tumor diagnosis, explainable predictions