Explainable Edge AI Framework for IoD-Assisted Aerial Surveillance in Extreme Scenarios

IEEE Internet of Things Journal (2024)

Abstract
Drones are sophisticated machines that can hover over extreme locations, conduct aerial surveillance, collect surveillance data, and disseminate it to the distributed edge for processing and analysis. The distributed edge deploys advanced Artificial Intelligence (AI) models to detect any unwarranted activity or object in the surveillance data. However, these lightweight, low-power Unmanned Aerial Vehicles (UAVs) may experience faults due to the unprecedented workload of extreme surveillance domains. In this paper, we design an AI framework that detects safety concerns in drones deployed for aerial surveillance in extreme locations, based on real-time critical drone parameters. We also propose a MapReduce-based object recognition and classification module to efficiently process the large-scale images captured by drones. However, conventional AI systems behave as black boxes, leading to a lack of trust and transparency. Thus, we convert the traditional AI framework into an explainable edge AI framework using SHapley Additive exPlanations (SHAP), which opens up the black box. The experimental results show the effectiveness of the proposed framework in detecting drone safety concerns through explainable health-status tracking, while also ensuring an effective object detection mechanism.
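The framework's core explainability step applies SHAP to the drone health-status model. The Python sketch below is illustrative only: the feature names, synthetic telemetry, and the random-forest classifier are assumptions, not the paper's actual model or data. It shows how SHAP attributions could reveal which critical parameters drive a health prediction.

```python
# Illustrative sketch: hypothetical drone telemetry features and a generic
# classifier stand in for the paper's health-status model.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
features = ["battery_temp_C", "motor_vibration_g", "altitude_m", "wind_speed_mps"]

# Synthetic telemetry: a flight is labeled "at risk" when battery temperature
# and motor vibration are jointly high (a stand-in for real fault labels).
X = rng.normal(size=(500, len(features)))
y = ((X[:, 0] + X[:, 1]) > 1.0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Model-agnostic SHAP explainer over the probability of the "at risk" class.
explainer = shap.Explainer(lambda data: model.predict_proba(data)[:, 1], X[:100])
explanation = explainer(X[:50])

# Global view: mean |SHAP value| per parameter indicates which telemetry
# signal most influences the health-status prediction.
importance = np.abs(explanation.values).mean(axis=0)
for name, score in sorted(zip(features, importance), key=lambda p: -p[1]):
    print(f"{name:>18s}: {score:.3f}")
```

In the same spirit, per-flight SHAP values can be reported alongside each health alert so that operators see which parameter triggered the concern rather than a bare prediction.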
Keywords
Aerial surveillance, Distributed Edge Computing, Explainable AI, Unmanned Aerial Vehicles (UAVs)