A New Triangle: Fractional Calculus, Renormalization Group, and Machine Learning

Volume 7: 17th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA), 2021

Abstract
The emergence of the systematic study of complexity as a science has resulted from the growing recognition that the fundamental assumptions upon which Newtonian physics is based are not satisfied throughout most of science; for example, time does not necessarily flow uniformly in one direction, nor is space homogeneous. Herein we discuss how the fractional calculus (FC), renormalization group (RG) theory, and machine learning (ML) have each developed independently in the study of distinct phenomena in which one or more of the underlying assumptions of the Newtonian formalism is violated. FC has been shown to help us better understand complex systems, improve the processing of complex signals, enhance the control of complex networks, increase optimization performance, and even extend the potential for creativity. RG allows one to investigate the changes of a dynamical system at different scales. For example, in quantum field theory, divergent parts of a calculation can lead to nonsensical infinite results; by applying RG, however, the divergences can be absorbed into a few measurable quantities, yielding finite results. ML is currently a fashionable research topic and will probably remain so for the foreseeable future. How a model can learn efficiently (optimally) is always essential, and the key to learnability is designing efficient optimization methods. Although extensive research has been carried out on the three topics separately, few studies have investigated the association triangle between FC, RG, and ML. To initiate the study of their interdependence, the authors discuss the critical connections between them (Fig. 1). In both FC and RG, scaling laws reveal the complexity of the phenomena discussed; the authors emphasize that the critical connection between FC and RG takes the form of inverse power laws (IPLs), whose index provides a measure of the level of complexity. For FC and ML, the critical connections lie in big data, where variability, optimization, and non-local models are described; the authors introduce derivative-free and gradient-based optimization methods and explain how FC could contribute to these areas of study. Finally, the association between RG and ML is explained, and mutual information, feature extraction, and locality are discussed. Many cross-sectional studies suggest a connection between RG and ML: the RG bears a superficial similarity to the structure of deep neural networks (DNNs), in which one marginalizes over hidden degrees of freedom. The authors conclude that the association triangle between FC, RG, and ML forms a stool on which the foundations of complexity science might comfortably sit for a wide range of future research topics.
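As a worked illustration of the FC-RG connection through inverse power laws (a standard fractional-calculus result quoted here only to make the IPL claim concrete, not an equation reproduced from the paper), consider the fractional relaxation equation written with a Caputo derivative of order 0 < \alpha < 1:

{}^{C}D_t^{\alpha} f(t) = -\lambda f(t), \qquad f(0) = 1,

whose solution is the Mittag-Leffler function

f(t) = E_{\alpha}(-\lambda t^{\alpha}), \qquad E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)},

with the asymptotic inverse power law

f(t) \sim \frac{t^{-\alpha}}{\lambda\,\Gamma(1-\alpha)} \quad \text{as } t \to \infty.

The IPL index is thus set by the fractional order \alpha, which is one sense in which the index can serve as a measure of complexity; ordinary exponential relaxation e^{-\lambda t} is recovered in the limit \alpha \to 1.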
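The abstract points to gradient-based optimization as one place where FC could contribute. The following is a minimal sketch of fractional-order gradient descent, assuming a simple quadratic objective f(x) = x^2 and the closed-form Caputo derivative of a power function; the objective, the order alpha, the step size, and the function names are illustrative assumptions, not the authors' method.

import math

def caputo_grad_quadratic(x, alpha):
    # Caputo fractional derivative of f(x) = x**2 of order alpha
    # (0 < alpha <= 1), lower terminal 0, evaluated at x > 0:
    # D^alpha x**2 = 2 / Gamma(3 - alpha) * x**(2 - alpha)
    return 2.0 / math.gamma(3.0 - alpha) * x ** (2.0 - alpha)

def fractional_gradient_descent(x0, alpha=0.7, lr=0.1, steps=50):
    # Ordinary gradient descent with the integer-order gradient 2*x
    # replaced by its Caputo fractional counterpart (illustrative only).
    x = x0
    for _ in range(steps):
        x = x - lr * caputo_grad_quadratic(x, alpha)
    return x

# Starting from x0 = 2.0 the iterate decays toward the minimizer x = 0;
# alpha = 1 recovers ordinary gradient descent because Gamma(2) = 1.
print(fractional_gradient_descent(2.0))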
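To make the stated RG-ML analogy (marginalizing over hidden degrees of freedom) concrete, here is a toy sketch, under assumed details that are not in the paper, in which one block-spin coarse-graining step on a one-dimensional spin chain is written in the same form as a pooling layer: each pass keeps a coarse feature of the level below and discards the fine-grained variables. The chain length, block size, and function names are hypothetical.

import random

def block_spin(spins, block=3):
    # One RG-style coarse-graining step: replace each block of spins by
    # its majority sign, marginalizing over the degrees of freedom
    # inside the block (loosely analogous to a pooling layer in a DNN).
    coarse = []
    for i in range(0, len(spins) - block + 1, block):
        coarse.append(1 if sum(spins[i:i + block]) > 0 else -1)
    return coarse

random.seed(0)
chain = [random.choice([-1, 1]) for _ in range(27)]
level1 = block_spin(chain)   # 27 spins -> 9 coarse spins
level2 = block_spin(level1)  # 9 coarse spins -> 3
print(len(chain), len(level1), len(level2))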