Robustness of the long short-term memory network in rainfall-runoff prediction improved by the water balance constraint

crossref(2024)

Abstract
While the water balance constraint is fundamental to catchment hydrological models, there is as yet no consensus on its role in the long short-term memory (LSTM) network. This paper concentrates on the part this constraint plays in the robustness of the LSTM network for rainfall-runoff prediction. Specifically, numerical experiments are devised to examine the robustness of the LSTM and its architecturally mass-conserving variant (MC-LSTM), and Explainable Artificial Intelligence (XAI) techniques are employed to interrogate how this constraint affects the robustness of the LSTM in learning rainfall-runoff relationships. Based on the Catchment Attributes and Meteorology for Large-sample Studies (CAMELS) dataset, the LSTM, MC-LSTM, and EXP-HYDRO models are trained under various amounts of training data and different seeds of parameter initialization over 531 catchments, leading to 95,580 (3×6×10×531) tests. The large-sample tests show that incorporating the water balance constraint into the LSTM improves robustness, though the improvement tends to diminish as the amount of training data increases. Under 9 years of training data, the constraint significantly enhances robustness against data sparsity in 37% (196 of 531) of the catchments and improves robustness against parameter initialization in 73% (386 of 531) of the catchments. In addition, it improves robustness in learning rainfall-runoff relationships, increasing the median contribution of precipitation from 45.8% to 47.3%. These results point to compensation effects between training data and process knowledge in the LSTM's performance. Overall, these in-depth investigations offer insights into the use of the LSTM for rainfall-runoff prediction.
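The experimental grid stated in the abstract (3 models × 6 training-data amounts × 10 initialization seeds × 531 catchments = 95,580 tests) can be sketched as follows. The model names and the factor counts come from the abstract; the specific training-data amounts listed here are illustrative placeholders, since the abstract confirms only the 9-year setting.

```python
# Sketch of the test grid implied by the abstract:
# 3 models x 6 training-data amounts x 10 seeds x 531 catchments.
from itertools import product

models = ["LSTM", "MC-LSTM", "EXP-HYDRO"]
training_years = [1, 3, 5, 7, 9, 11]  # hypothetical amounts; only "9 years" is stated
seeds = range(10)                     # seeds of parameter initialization
catchments = range(531)               # CAMELS catchments used in the study

experiments = list(product(models, training_years, seeds, catchments))
print(len(experiments))  # 3 * 6 * 10 * 531 = 95580
```

Each tuple in `experiments` identifies one train-and-evaluate run, which is how the abstract's total of 95,580 tests arises.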