Abstract
Commercial microwave links have great potential to serve as rain sensors. However, using commercial microwave links to monitor rainfall depends heavily on the availability of the links’ attenuation measurements. The cellular operators that provide the majority of these measurements typically rely on standard Network Management Systems (NMS), which log only a quantized version of the minimum and maximum attenuation values, usually over 15-min intervals. The non-linear min/max transformation, combined with the quantizer, applied to the channel attenuation measurements must therefore be accounted for during rain estimation. In this paper, we examine actual NMS-produced attenuation measurements taken from two commercial microwave links during multiple rain events. Using observations from two rain gauges and a weather radar, we empirically demonstrate that the NMS output contains a bias that interferes with the rain-estimation process. We show that detecting and compensating for this bias can considerably improve the microwave links’ rain-estimation accuracy.
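To make the measurement pipeline described above concrete, the following minimal Python sketch simulates NMS-style logging: per 15-min interval, only the quantized minimum and maximum of the channel attenuation samples are kept. The sampling rate, quantization step, and floor-style quantizer are illustrative assumptions, not the configuration of any specific operator's NMS.

```python
import numpy as np

def nms_min_max_log(attenuation_db, samples_per_interval=90, q_step_db=1.0):
    """Simulate NMS-style min/max logging of channel attenuation.

    attenuation_db       -- 1-D array of attenuation samples (dB)
    samples_per_interval -- samples per 15-min logging interval
                            (assumed, e.g. 10-s sampling -> 90 samples)
    q_step_db            -- quantization step of the logged values (assumed)
    Returns quantized per-interval minima and maxima.
    """
    n = (len(attenuation_db) // samples_per_interval) * samples_per_interval
    blocks = attenuation_db[:n].reshape(-1, samples_per_interval)
    # Floor-style quantizer assumed for illustration; a real NMS may round differently.
    a_min = np.floor(blocks.min(axis=1) / q_step_db) * q_step_db
    a_max = np.floor(blocks.max(axis=1) / q_step_db) * q_step_db
    return a_min, a_max
```

Such a forward model of the min/max transformation and quantizer is what a rain-estimation procedure would need to invert or compensate for when working with NMS-logged data.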