Estimation of the optimal number of neurons in extreme learning machine using simulated annealing and the golden section

E. Gelvez-Almeida, M. Mora, Y. Huerfano-Maldonado, E. Salazar-Jurado, N. Martínez-Jeraldo, R. Lozada-Yavina, Y. Baldera-Moreno, L. Tobar

INTERNATIONAL CONFERENCE DAYS OF APPLIED MATHEMATICS, IX ICDAM (2023)

Abstract
Extreme learning machine (ELM) is a neural network algorithm widely accepted in the scientific community due to the simplicity of the model and its good results in classification and regression problems; digital image processing, medical diagnosis, and signal recognition are some applications in the field of physics addressed with these networks. To obtain good results, the algorithm must be executed with an adequate number of neurons in the hidden layer, and identifying that number is an open problem in the ELM field. A sequential search has a high computational cost, since the complexity of the calculations grows with the number of neurons. In this work, we use golden-section search and simulated annealing as heuristic methods to determine the appropriate number of neurons in the hidden layer of an ELM; the experiments use three real databases for the classification problem and one synthetic database for the regression problem. The results show that, on the highest-dimensional database, the search for the appropriate number of neurons is accelerated by up to 4.5× with simulated annealing and up to 95.7× with golden-section search compared with a sequential method.
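The two heuristics named in the abstract can be sketched as a search over the integer neuron count. The following is a minimal illustrative sketch, not the paper's implementation: the tanh activation, the training MSE as the objective, the search ranges, the neighbor step, and the linear cooling schedule are all assumptions. Golden-section search additionally assumes the error is roughly unimodal in the neuron count.

```python
import numpy as np

def elm_error(X, y, n_hidden, seed=0):
    """Train an ELM with n_hidden tanh neurons; return its training MSE."""
    rng = np.random.default_rng(seed)
    # ELM: hidden weights/biases are random and fixed; only the output layer is solved.
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ y          # least-squares output weights (pseudoinverse)
    return float(np.mean((H @ beta - y) ** 2))

def golden_section_neurons(X, y, lo=1, hi=200):
    """Golden-section search for the neuron count minimizing elm_error
    (assumes the error curve is roughly unimodal in the neuron count)."""
    phi = (np.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = round(b - phi * (b - a)), round(a + phi * (b - a))
    fc, fd = elm_error(X, y, c), elm_error(X, y, d)
    while b - a > 1:
        if fc < fd:                        # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = round(b - phi * (b - a))
            fc = elm_error(X, y, c)
        else:                              # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = round(a + phi * (b - a))
            fd = elm_error(X, y, d)
    return a if elm_error(X, y, a) <= elm_error(X, y, b) else b

def annealed_neurons(X, y, lo=1, hi=200, steps=40, t0=1.0, seed=0):
    """Simulated annealing over the neuron count (illustrative schedule)."""
    rng = np.random.default_rng(seed)
    n = int(rng.integers(lo, hi + 1))
    e = elm_error(X, y, n)
    best_n, best_e = n, e
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9    # linear cooling toward zero temperature
        cand = int(np.clip(n + rng.integers(-10, 11), lo, hi))
        ce = elm_error(X, y, cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if ce < e or rng.random() < np.exp((e - ce) / t):
            n, e = cand, ce
        if e < best_e:
            best_n, best_e = n, e
    return best_n
```

A sequential search trains one ELM per candidate count, i.e. O(hi − lo) trainings; golden-section needs only O(log(hi − lo)) trainings and annealing a fixed budget of `steps`, which is the source of the speedups the abstract reports.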
Keywords
extreme learning machine, optimal number, neurons, annealing