Hypercomplex neural network in time series forecasting of stock data

CoRR(2024)

Abstract
Three classes of architectures for time series prediction were tested. They differ in their input layers, which contain either convolutional, LSTM, or dense hypercomplex layers based on 4D algebras. The input consisted of four related Stock Market time series, and the prediction of one of them was expected. Hyperparameters related to each class of architecture were optimized in order to compare the best neural networks within each class. The results show that in most cases the architecture with a hypercomplex dense layer provides MAE accuracy similar to the other architectures, but with considerably fewer trainable parameters. Thanks to this, hypercomplex neural networks can be trained and can process data faster than the other tested architectures. Moreover, the order of the input time series has an impact on effectiveness.
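To illustrate where the parameter savings of a hypercomplex dense layer come from, the following is a minimal sketch, not the paper's actual implementation, of a quaternion-valued dense layer (the quaternions being one of the 4D algebras) written in PyTorch. The class name QuaternionDense, the weight initialization, and the mapping of the four stock series onto the four algebra components are illustrative assumptions; the paper's layer definition may differ.

```python
# Sketch of a quaternion dense layer (assumption: PyTorch; not the paper's code).
# A real dense layer mapping 4*in_f -> 4*out_f features has 16*in_f*out_f weights;
# sharing weights through the Hamilton product reduces this to 4*in_f*out_f.
import torch
import torch.nn as nn


class QuaternionDense(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        shape = (in_features, out_features)
        self.w_r = nn.Parameter(torch.randn(shape) * 0.05)
        self.w_i = nn.Parameter(torch.randn(shape) * 0.05)
        self.w_j = nn.Parameter(torch.randn(shape) * 0.05)
        self.w_k = nn.Parameter(torch.randn(shape) * 0.05)
        self.bias = nn.Parameter(torch.zeros(4 * out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 4 * in_features), components stacked as [r | i | j | k]
        r, i, j, k = torch.chunk(x, 4, dim=-1)
        # Hamilton product of the quaternion-valued input and weight
        out_r = r @ self.w_r - i @ self.w_i - j @ self.w_j - k @ self.w_k
        out_i = r @ self.w_i + i @ self.w_r + j @ self.w_k - k @ self.w_j
        out_j = r @ self.w_j - i @ self.w_k + j @ self.w_r + k @ self.w_i
        out_k = r @ self.w_k + i @ self.w_j - j @ self.w_i + k @ self.w_r
        return torch.cat([out_r, out_i, out_j, out_k], dim=-1) + self.bias


if __name__ == "__main__":
    # Hypothetical setup: four related stock series, a window of 32 steps each,
    # mapped onto the four quaternion components.
    layer = QuaternionDense(in_features=32, out_features=16)
    x = torch.randn(8, 4 * 32)              # batch of 8 input windows
    y = layer(x)                            # shape: (8, 64)
    n_hyper = sum(p.numel() for p in layer.parameters())
    n_real = (4 * 32) * (4 * 16) + 4 * 16   # equivalent real dense layer
    print(y.shape, n_hyper, n_real)         # roughly 4x fewer weights
```

Under these assumptions, the hypercomplex layer uses about a quarter of the weights of a real dense layer with the same input and output width, which is consistent with the abstract's claim of similar accuracy with considerably fewer trainable parameters.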