Modeling of hydrological time series is essential for the sustainable development and management of lake water resources. This study aims to develop an efficient model for forecasting lake water level variations, exemplified by a case study of Poyang Lake (China). A random forests (RF) model was first applied and compared with artificial neural networks, support vector regression, and a linear model. Three scenarios were adopted to investigate the effect of time lag and of previous water levels as model inputs for real-time forecasting. Variable importance was then analyzed to evaluate the influence of each predictor on water level variations. Results indicated that the RF model exhibited the best performance for daily forecasting in terms of root mean square error (RMSE) and coefficient of determination (R2). Moreover, the highest accuracy was achieved using the discharge series at a 4-day lead time and the average water level over the previous week as model inputs, with an average RMSE of 0.25 m across five stations within the lake. In addition, the previous water level was the most effective predictor for water level forecasting, followed by discharge from the Yangtze River. Based on the performance of these soft computing methods, RF can be calibrated to provide information or simulation scenarios for water management and decision-making.
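The modeling setup described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the data are synthetic, and the feature construction (4-day-lead discharge plus the mean water level over the previous week) and the use of scikit-learn's `RandomForestRegressor` and `feature_importances_` are assumptions about how such a forecast and variable-importance analysis could be set up.

```python
# Hedged sketch of RF-based daily water level forecasting with lagged inputs.
# All series below are synthetic stand-ins for the Poyang Lake data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_days = 1000
# Synthetic discharge (m3/s) with a seasonal cycle, and a stage series (m)
# loosely driven by it; real inputs would be gauged observations.
discharge = 10_000 + 2_000 * np.sin(2 * np.pi * np.arange(n_days) / 365) \
    + rng.normal(0, 300, n_days)
level = 12 + 0.0004 * discharge + rng.normal(0, 0.1, n_days)

lead = 4    # forecast horizon: 4 days ahead
window = 7  # averaging window for previous water levels

X, y = [], []
for t in range(window, n_days - lead):
    X.append([discharge[t],                  # current discharge (predictor)
              level[t - window:t].mean()])   # mean level over previous week
    y.append(level[t + lead])                # target: level 4 days ahead
X, y = np.asarray(X), np.asarray(y)

# Chronological train/test split, as is usual for time series.
split = int(0.8 * len(X))
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X[:split], y[:split])

rmse = mean_squared_error(y[split:], rf.predict(X[split:])) ** 0.5
print(f"test RMSE: {rmse:.3f} m")
print("importances (discharge, prev-week level):", rf.feature_importances_)
```

The impurity-based `feature_importances_` vector gives a quick first look at which predictor dominates, analogous to the variable importance analysis the study reports; permutation importance would be a more robust alternative.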
- artificial neural networks
- lake water level
- Poyang Lake
- random forests
- support vector regression
- variable importance analysis
- First received 17 December 2015.
- Accepted in revised form 13 June 2016.
- © 2016 The Authors
This is an Open Access article distributed under the terms of the Creative Commons Attribution Licence (CC BY 4.0), which permits copying, adaptation and redistribution, provided the original work is properly cited (http://creativecommons.org/licenses/by/4.0/).