I. Introduction
Satellite communication systems increasingly use the Ka-band (30/20 GHz) to deliver broadband services over very small aperture terminal networks, in either traditional fixed satellite service or high throughput satellite [1] configurations. Despite the attractiveness of the Ka-band, the limitations imposed by atmospheric impairments at these frequencies are well known, in particular those caused by precipitation. To counteract these propagation effects, Ka-band satellite systems employ fade mitigation techniques that can improve their performance. Such techniques consist of adapting link parameters, such as transmission power, modulation, and coding, in real time; the control loop must therefore track the dynamic behavior of the propagation channel. Implementing these techniques makes characterizing the variability of fade dynamics more important than in the past. For example, identifying which fade duration parameters are most prone to high interannual variability could help in designing fade mitigation techniques, minimizing the effects of outages and improving their ability to adapt to different conditions.
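The real-time adaptation described above can be illustrated with a minimal sketch of one common fade mitigation technique, adaptive coding and modulation (ACM): as a rain fade deepens and the estimated link SNR drops, the control loop steps down to a more robust, less spectrally efficient mode. The MODCOD names, SNR thresholds, and margin below are purely illustrative assumptions, not values from any standard or from this paper.

```python
# Illustrative sketch of a threshold-based ACM fade mitigation loop.
# The MODCOD table and SNR thresholds are hypothetical example values.

MODCODS = [
    # (name, minimum required SNR in dB, spectral efficiency in bit/s/Hz)
    ("QPSK 1/2",    1.0, 1.0),
    ("8PSK 2/3",    6.6, 2.0),
    ("16APSK 3/4", 10.2, 3.0),
    ("32APSK 4/5", 13.6, 4.0),
]

def select_modcod(snr_db, margin_db=0.5):
    """Pick the most efficient MODCOD whose SNR threshold (plus a
    safety margin) is still met by the current estimated link SNR."""
    best = MODCODS[0]  # fall back to the most robust mode
    for modcod in MODCODS:
        if snr_db >= modcod[1] + margin_db:
            best = modcod
    return best

# As a fade deepens, the loop selects progressively more robust modes.
for snr in (14.5, 9.0, 2.0):
    name, _, eff = select_modcod(snr)
    print(f"SNR {snr:5.1f} dB -> {name} ({eff} bit/s/Hz)")
```

A real system would add hysteresis and filtering of the SNR estimate to avoid oscillating between modes during rapid fade events, which is precisely why the fade dynamics discussed in this paper matter for control-loop design.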