Assessing the Resolution Adaptability of the Zhang-McFarlane Cumulus Parameterization with Spatial and Temporal Averaging
One of the biggest challenges in global climate modeling is representing clouds. The model grid size, typically about 100 kilometers, is too coarse to capture individual cumulus clouds. Researchers at the U.S. Department of Energy's Pacific Northwest National Laboratory led a study showing that the dependence of simulated clouds on model grid size, whether large or small, can be suppressed by averaging the environment in which the clouds form over a suitable span of space and time.
The averaging method developed in this study reduces the sensitivity of clouds and precipitation to grid size in climate simulations, making simulations more reliable. Researchers can apply this procedure to a wide range of parameterizations (simplified representations) of cumulus clouds.
Clouds are represented in global climate models by making various assumptions about their interactions with the large-scale environment in which they form. One of those assumptions, that clouds are much smaller than the model grid size, breaks down in models with very fine grids, on the order of 20 kilometers or smaller. This can lead to "double counting," in which the transport of moisture and heat is represented twice: once by the cloud parameterization and again by the atmospheric processes, such as large-scale convective systems, that the model grid explicitly resolves. Researchers showed that this problem can be substantially mitigated by averaging the cloud environment over roughly 100 kilometers in space and approximately a 10-minute period in time when applying the cloud parameterization. This method greatly reduces the resolution dependence of simulated precipitation.
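The averaging idea described above can be illustrated with a minimal sketch. This is not the study's actual model code; it assumes a simple 2D environmental field (e.g., moisture) on a regular grid, and the function names, arguments, and defaults (`spatial_average`, `temporal_average`, `target_km=100.0`) are hypothetical choices for illustration only.

```python
import numpy as np

def spatial_average(field, dx_km, target_km=100.0):
    """Block-average a 2D environmental field so each block spans ~target_km.

    On a fine grid (e.g., dx_km = 20), several columns are pooled together so
    the parameterization sees an environment representative of ~100 km, as in
    the averaging approach described in the text.
    """
    n = max(1, int(round(target_km / dx_km)))  # grid cells per averaging block
    ny, nx = field.shape
    # Trim so the dimensions divide evenly, then average over n x n blocks.
    ny2, nx2 = (ny // n) * n, (nx // n) * n
    blocks = field[:ny2, :nx2].reshape(ny2 // n, n, nx2 // n, n)
    coarse = blocks.mean(axis=(1, 3))
    # Broadcast the block means back onto the fine grid, so every fine-grid
    # column within a block feeds the same averaged environment to the scheme.
    return np.repeat(np.repeat(coarse, n, axis=0), n, axis=1)

def temporal_average(history):
    """Running mean over stored time levels (e.g., states spanning ~10 minutes)."""
    return np.mean(history, axis=0)
```

In practice, a host model would call the cumulus parameterization with `temporal_average` of recent `spatial_average` fields rather than the instantaneous grid-cell state, which is what suppresses the grid-size sensitivity.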