Preventing Overfitting of LSTMs using Ant Colony Optimization

2021 10th International Congress on Advanced Applied Informatics (IIAI-AAI)(2021)

Abstract
Overfitting is a common problem for neural networks: a model fits the training data too closely and generalizes poorly to unseen data. Long Short-Term Memory (LSTM) is a type of neural network that can handle sequences of data, and its complex, expressive structure makes it especially prone to overfitting. Several methods exist to prevent overfitting in neural networks, such as dropout and dropconnect, but they fail in some cases. This article studies the use of ant colony optimization (ACO), a swarm-intelligence algorithm for combinatorial optimization problems, to optimize the structure of LSTMs and prevent overfitting. The proposed method optimizes the network structure with ACO while training the LSTM weights with backpropagation. It was applied to time-series prediction tasks on three real-world datasets, where it performed better than traditional methods.
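The abstract describes ACO searching over LSTM structures while backpropagation trains the weights. A minimal sketch of the ACO half of that loop, under assumed details not given in the abstract: each candidate connection carries "keep" and "drop" pheromone values, ants sample binary masks from them, and the best mask found is reinforced after evaporation. The `evaluate` function here is a hypothetical stand-in for training the masked LSTM and measuring validation loss.

```python
import random

def aco_select_structure(num_connections, num_ants=10, iterations=20,
                         evaporation=0.1, seed=0):
    """Sketch: ACO chooses a binary keep/drop mask over candidate LSTM
    connections. Details (pheromone scheme, deposit rule) are assumptions,
    not taken from the paper."""
    rng = random.Random(seed)
    pher_keep = [1.0] * num_connections
    pher_drop = [1.0] * num_connections

    def evaluate(mask):
        # Hypothetical fitness: a real run would train the masked LSTM
        # with backpropagation and return its validation loss. Here we
        # pretend only the first half of the connections are useful.
        useful = num_connections // 2
        return -(sum(mask[:useful]) - sum(mask[useful:]))

    best_mask, best_loss = None, float("inf")
    for _ in range(iterations):
        for _ in range(num_ants):
            # Each ant keeps connection i with probability proportional
            # to its "keep" pheromone relative to its "drop" pheromone.
            mask = [
                1 if rng.random() < pher_keep[i] / (pher_keep[i] + pher_drop[i])
                else 0
                for i in range(num_connections)
            ]
            loss = evaluate(mask)
            if loss < best_loss:
                best_mask, best_loss = mask, loss
        # Evaporate, then reinforce the choices of the best mask so far.
        pher_keep = [(1 - evaporation) * p for p in pher_keep]
        pher_drop = [(1 - evaporation) * p for p in pher_drop]
        for i, kept in enumerate(best_mask):
            if kept:
                pher_keep[i] += 1.0
            else:
                pher_drop[i] += 1.0
    return best_mask
```

Over the iterations, pheromone concentrates on connections that appear in low-loss masks, so the returned mask tends to keep the useful connections and drop the rest; in the paper's setting, the surviving structure is then trained normally with backpropagation.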
Key words
long short-term memory, ant colony optimization, neuro-evolution, time-series prediction, overfitting