Abstract
Rain gauges continue to be key sources of rainfall data despite progress made in precipitation measurement using radar and satellite technology. Some work has been done on assessing the optimum rain gauge network density required for hydrological modelling, but no consensus has emerged. This paper contributes to the identification of the optimum rain gauge network density using scaling laws and bias-corrected 1 km × 1 km gridded radar rainfall records covering an area of 28,371 km² in south-east Queensland, Australia, that hosts 315 rain gauges. Varying numbers of radar pixels (surrogate rain gauges) were repeatedly sampled using a unique stratified sampling technique. For each set of sampled rainfall data, a two-dimensional correlogram was developed from the normal scores obtained through quantile-quantile transformation, for use in ordinary kriging, a stochastic interpolation method. Leave-one-out cross-validation was carried out, and the simulated quantiles were evaluated using the performance statistics root-mean-square error and mean absolute bias, as well as their rates of change. A break in the scaling of the plots of these performance statistics against the number of rain gauges was used to infer the optimum rain gauge network density. The optimum rain gauge network density varied from 14 km²/gauge to 38 km²/gauge, with an average of 25 km²/gauge.
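As an illustrative sketch of the evaluation loop described above, the following minimal example runs leave-one-out cross-validation of ordinary kriging on normal scores and reports RMSE and mean absolute bias. The gauge locations, rainfall values, and exponential covariance parameters here are hypothetical placeholders; the paper itself fits a two-dimensional correlogram from bias-corrected radar rainfall.

```python
import numpy as np
from scipy.stats import norm
from scipy.spatial.distance import cdist

rng = np.random.default_rng(42)
n = 40
coords = rng.uniform(0, 100, size=(n, 2))   # hypothetical gauge coordinates (km)
rain = rng.gamma(2.0, 5.0, size=n)          # hypothetical rainfall totals (mm)

# Quantile-quantile transform: empirical rainfall quantiles -> standard normal scores
p = (np.argsort(np.argsort(rain)) + 0.5) / n
z = norm.ppf(p)

def ok_predict(xy_obs, z_obs, xy_tgt, corr_range=30.0, sill=1.0):
    """Ordinary kriging of normal scores with an assumed exponential covariance."""
    C = sill * np.exp(-cdist(xy_obs, xy_obs) / corr_range)
    c0 = sill * np.exp(-cdist(xy_obs, xy_tgt[None, :]) / corr_range).ravel()
    m = len(z_obs)
    # Augmented system enforcing the unbiasedness constraint (weights sum to 1)
    A = np.ones((m + 1, m + 1))
    A[:m, :m] = C
    A[m, m] = 0.0
    b = np.append(c0, 1.0)
    w = np.linalg.solve(A, b)
    return w[:m] @ z_obs

# Leave-one-out cross-validation in normal-score space,
# then back-transform each estimate to rainfall via the empirical quantile map
order = np.argsort(z)
z_sorted, rain_sorted = z[order], rain[order]
pred = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    zhat = ok_predict(coords[mask], z[mask], coords[i])
    pred[i] = np.interp(zhat, z_sorted, rain_sorted)

rmse = np.sqrt(np.mean((pred - rain) ** 2))
mab = np.mean(np.abs(pred - rain))
print(f"RMSE = {rmse:.2f} mm, MAB = {mab:.2f} mm")
```

In the study these two statistics, and their rates of change, are recomputed for each sampled network size; plotting them against gauge count and locating the scaling break is what identifies the optimum density.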