Modified genetic algorithm and fine-tuned long short-term memory network for intrusion detection in the internet of things networks with edge capabilities

Applied Soft Computing (2024)

Abstract
The emergence of smart cities illustrates how new technologies such as the Internet of Things (IoT) have enabled the creation of extensive, interconnected, and intelligent ecosystems. The widespread deployment of IoT devices provides constant environmental feedback, allowing associated systems to adapt automatically, and has fundamentally transformed how contemporary society functions. However, securing emerging technologies such as the IoT has become a significant challenge because of added complexity, misconfigurations, and conflicts between modern and legacy systems, with a notable impact on the reliability and accessibility of existing infrastructure. Edge computing (EC) is a collaborative computing paradigm that moves data processing and analysis to the edge of the network, where the data is generated, rather than to a centralized cloud environment. IoT adoption continues to grow in everyday life and in manufacturing, with particular emphasis on critical infrastructure, and now spans diverse domains including the industrial, agricultural, healthcare, and logistics sectors. The security of IoT networks therefore affects personal safety, national security, and economic development. Nevertheless, the conventional intrusion detection techniques proposed in previous studies, which rely on centralized cloud-based systems, cannot meet IoT networks' requirements for data confidentiality, network capacity, and prompt responsiveness. Moreover, while integrating IoT applications into smart devices augments their functionality, it also introduces potential security vulnerabilities.
Furthermore, many contemporary IoT devices have limited security capabilities, leaving them vulnerable to sophisticated attacks and impeding the broad adoption of IoT technologies. Many deployed IoT devices also lack hardware security measures, so traditional intrusion detection systems (IDS) are insufficient to protect the IoT network ecosystem. To address these issues, this research proposes the IoT-Defender framework, which combines a Modified Genetic Algorithm (MGA) with a deep Long Short-Term Memory (LSTM) network to detect cyberattacks in IoT networks. This work is a pioneering attempt to employ the MGA for feature selection and the GA for fine-tuning the LSTM parameters within an EC framework. The LSTM parameters were fine-tuned by varying the number of hidden layers under the guidance of the GA fitness function. The MGA was customized to improve its selection of relevant features, optimizing the use of the limited resources on IoT devices and edge nodes. The fine-tuning process optimized hyperparameters, architecture, and training strategies to maximize the LSTM network's effectiveness in learning and detecting patterns in IoT network traffic. The synergy between the MGA and the LSTM yields a comprehensive and efficient IDS: the features selected by the MGA improve the LSTM's performance by supplying more relevant and discriminative inputs. To address class imbalance, we use the focal loss function, which assigns greater weights to minority classes and thereby improves the model's capacity to learn from them.
The performance of the IoT-Defender model was assessed on the BoT-IoT, UNSW-NB15, and N-BaIoT datasets using a Raspberry Pi IoT device. The results show that IoT-Defender outperforms competing methods, achieving, on BoT-IoT, an accuracy of 99.41%, a detection rate of 99.78%, a precision of 98.50%, a false alarm rate of 2.56%, a mean intersection over union (mIoU) of 0.68, and a training time of 81.3 s. The proposed model is lightweight and can be deployed on edge servers to detect cyber-attacks in real time, specifically in the context of IoT security.
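The class-imbalance remedy the abstract describes, the focal loss, can be sketched for the binary (benign vs. attack) case as follows. This is a minimal NumPy sketch; the `gamma` and `alpha` values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.75):
    """Binary focal loss (illustrative sketch).

    p: predicted probability of the positive (attack) class
    y: ground-truth labels (0 = benign, 1 = attack)

    The (1 - pt)**gamma factor down-weights easy, well-classified
    examples so that hard minority-class samples dominate the loss;
    alpha additionally re-weights the positive class. gamma=2.0 and
    alpha=0.75 are assumed values for illustration only.
    """
    p = np.clip(p, 1e-7, 1 - 1e-7)            # numerical stability
    pt = np.where(y == 1, p, 1 - p)           # prob. of the true class
    w = np.where(y == 1, alpha, 1 - alpha)    # class re-weighting
    return float(np.mean(-w * (1 - pt) ** gamma * np.log(pt)))

# A confidently correct prediction contributes almost nothing,
# while a misclassified minority sample keeps a large loss.
easy = focal_loss(np.array([0.95]), np.array([1]))
hard = focal_loss(np.array([0.05]), np.array([1]))
```

With `gamma = 0` and `alpha = 0.5` the expression reduces (up to a constant factor) to ordinary cross-entropy, which is why the focal loss is often described as a re-weighted cross-entropy.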
Keywords
Internet of things, Edge computing, Intrusion detection system, Long short-term memory, Genetic algorithm, Modified genetic algorithm, BoT-IoT, Focal loss function, Class imbalance, UNSW-NB15, N-BaIoT
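The genetic-algorithm-driven feature selection the abstract describes can be illustrated with a generic binary GA. This is a minimal sketch under assumed operators (truncation selection, one-point crossover, bit-flip mutation, elitism); it is not the paper's MGA, whose specific modifications are not detailed in the abstract, and `toy_fitness` is a stand-in for a real wrapper fitness evaluated on IDS training data.

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, gens=30,
                      cx_rate=0.8, mut_rate=0.05, seed=0):
    """Minimal binary GA for feature selection (illustrative sketch).

    A chromosome is a bit list; bit i == 1 means 'keep feature i'.
    Operators are assumptions: truncation selection over the top 10,
    one-point crossover, per-bit flip mutation, 2-member elitism.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                      # elitism
        while len(next_pop) < pop_size:
            a, b = rng.sample(scored[:10], 2)      # truncation selection
            if rng.random() < cx_rate:             # one-point crossover
                cut = rng.randrange(1, n_features)
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            child = [1 - g if rng.random() < mut_rate else g
                     for g in child]               # bit-flip mutation
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

def toy_fitness(mask):
    # Hypothetical stand-in: reward the 5 'informative' features,
    # penalize every extra feature to favor compact subsets.
    return sum(mask[:5]) - 0.2 * sum(mask[5:])

best = ga_feature_select(12, toy_fitness)
```

In a real IDS pipeline the fitness would score a candidate feature subset by training a lightweight classifier on it, trading detection accuracy against subset size so that the selected features fit the resource budget of IoT devices and edge nodes.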