To maintain high efficiency for the antennas (not taking into account any losses in the Earth's atmosphere), it is necessary that G > 2.4. At fixed values of the parameters A and L, a decrease in the size of the transmitting antenna is possible only by increasing the microwave radiation frequency, i.e., by decreasing the wavelength. It is obvious that increasing A is possible only if one can take into account all losses generated by the (increased) interactions between the electromagnetic radiation and the Earth's atmosphere.

The interaction of microwave radiation with the atmosphere is a complex, multistage process. It includes attenuation of the signal due to absorption by oxygen and water vapor, scattering by rain, mist, and clouds, deflection of the microwave beam over the Earth's surface due to refraction in the ionosphere, rotation of the wave's polarization plane in the Earth's magnetic field (Faraday rotation), and nonlinear interactions between the radiation and the ionosphere [2]. While at frequencies of 2.5 - 3 GHz the energy absorption is only about 0.05 dB, this absorption increases sharply with frequency up to 30 GHz (Figure 2). However, in the frequency range of 35 - 38 GHz, corresponding to millimeter wavelengths, a so-called "radio window" exists in the Earth's atmosphere, where absorption drops sharply to 0.4 dB.

From an analysis of all the listed phenomena, it follows that, besides absorption, the main cause of microwave energy loss is scattering of the beam by clouds, rain, or mist; the contribution of the other processes is smaller and can be neglected. Taking into account the fact that the Earth's surface (as seen by a geostationary satellite) is covered by clouds approximately 20% of the time and by rain approximately 6% of the time, the corresponding losses in microwave beam energy at frequencies of 2.5 - 3 GHz are estimated at 2% - 6% (Figure 3).
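The attenuation figures above are quoted in decibels, while the loss estimates are given as percentages of beam energy. The conversion between the two follows from the standard definition of the decibel; the sketch below (the function name and example values are illustrative, not from the source) shows how a dB attenuation maps to a fractional power loss:

```python
def db_loss_fraction(attenuation_db: float) -> float:
    """Fraction of beam power lost for a given attenuation in dB.

    Uses the standard definition: P_out / P_in = 10 ** (-dB / 10).
    """
    return 1.0 - 10.0 ** (-attenuation_db / 10.0)

# Clear-air absorption of ~0.05 dB at 2.5 - 3 GHz (value from the text)
print(f"0.05 dB -> {db_loss_fraction(0.05) * 100:.2f}% power lost")

# Absorption of ~0.4 dB in the 35 - 38 GHz radio window (value from the text)
print(f"0.40 dB -> {db_loss_fraction(0.40) * 100:.2f}% power lost")
```

This makes the scale of the numbers concrete: a 0.05 dB attenuation corresponds to only about a 1% power loss, so the quoted 2% - 6% total losses are dominated by scattering from clouds and rain rather than clear-air absorption.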
RkJQdWJsaXNoZXIy MTU5NjU0Mg==