The California megaflood: Bias correction and model choices matter
Recurring roughly once every 200 years, the California megaflood is a 30-day cluster of storms whose statewide precipitation total informs infrastructure design and inundation planning. Uncertainty in its response to a warming climate therefore shapes how resources are directed toward event preparedness. Here, we use two dynamically downscaled ensembles of Global Climate Models (GCMs): 30 simulations from Phase 6 of the Coupled Model Intercomparison Project (CMIP6) spanning 1980-2100, and 10 from the Community Earth System Model’s second Large Ensemble (CESM2-LE) spanning 1850-2100, to quantify how GCM lineage and bias correction choices affect the projected future of California’s most infamous weather disaster. A simple bias correction (BC) of the GCM boundary conditions is necessary to reasonably capture the observed hydrometeorological statistics of the wettest month of the year (~200 mm of statewide precipitation): simulations without BC are 30% too wet and yield a 200-year storm intensity of 562 mm, whereas the more skillful bias-corrected ensemble predicts a lower megaflood intensity of 473 mm. We find that the future likelihood of this megaflood varies substantially with the model dataset. Within CESM2-LE, whose boundary conditions are bias corrected before downscaling, events of historical megaflood intensity (493 mm) occur 2.6 times more frequently by the mid-to-late 21st century, while megafloods become up to 4 times more likely in the wider CMIP6 simulations. Without BC, the downscaled CMIP6 simulations predict a smaller increase in future megaflood frequency (a factor of 3). By extending a large GCM ensemble back to 1850, encompassing dozens of intense storm clusters, we show that megaflood frequency by the mid-to-late 21st century dwarfs any simulated historical-era maximum.
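The abstract does not specify the form of the "simple bias correction" applied to the GCM boundary conditions; a common simple choice is empirical quantile mapping against an observational or reanalysis reference. The Python sketch below illustrates that idea for a single variable; the function and variable names are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def quantile_map(model_hist, obs_ref, model_series):
    """Empirical quantile mapping of one boundary-condition variable.

    model_hist   : model values over the calibration (historical) period
    obs_ref      : reference values over the same period
    model_series : model values to bias correct (historical or future)
    """
    # Rank each value to be corrected within the model's historical distribution
    sorted_hist = np.sort(model_hist)
    ranks = np.searchsorted(sorted_hist, model_series) / len(sorted_hist)
    ranks = np.clip(ranks, 0.0, 1.0)
    # Replace each value with the reference value at the same quantile
    return np.quantile(obs_ref, ranks)
```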
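The 200-year storm intensities quoted above (562 mm without BC, 473 mm with) are return levels of 30-day statewide precipitation. A standard way to estimate such a level, shown here as an assumption rather than the paper's exact method, is to fit a generalized extreme value (GEV) distribution to annual maxima:

```python
import numpy as np
from scipy import stats

def gev_return_level(annual_max_mm, period_years=200):
    """T-year return level from a GEV fit to annual maxima of
    30-day statewide precipitation (mm)."""
    shape, loc, scale = stats.genextreme.fit(annual_max_mm)
    # The T-year level is exceeded with probability 1/T in any given year
    return stats.genextreme.ppf(1.0 - 1.0 / period_years,
                                shape, loc=loc, scale=scale)
```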
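Likewise, the frequency changes (2.6x in CESM2-LE, up to 4x in CMIP6) can be read as ratios of exceedance rates of the historical megaflood intensity, pooled across ensemble members. A minimal sketch, assuming the 493 mm CESM2-LE threshold as the default:

```python
import numpy as np

def frequency_ratio(hist_max_mm, future_max_mm, threshold_mm=493.0):
    """Ratio of future to historical exceedance frequency of the
    historical megaflood intensity (493 mm in CESM2-LE)."""
    hist_rate = np.mean(np.asarray(hist_max_mm) >= threshold_mm)
    future_rate = np.mean(np.asarray(future_max_mm) >= threshold_mm)
    return future_rate / hist_rate
```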