Closing Remarks
This document describes the Climate Change Projections created to help Verisk clients better understand the impact of climate change on wildfire risk in the Western U.S. The general approach comprises three steps: building data-based relationships between climate parameters and burned area for each of the 49 EPA Level III ecoregions in the 13 westernmost states of the U.S.; calculating climate change targets for changes in burned area based on projected changes to ecoregion-specific climate variables; and generating a climate-conditioned catalog of events by resampling from the existing 10K catalog so that the burned area targets in each ecoregion are met.
Historical wildfire data (FOD and MTBS) from the period 1984-2020, together with environmental conditions, were used to quantify ecoregion-specific relationships between vapor pressure deficit (VPD), precedent annual precipitation, and area burned in a given year. Generalized Linear Models (GLMs) were built using VPD, precipitation, or both, according to which achieved the best fit. Wildfire activity in ecoregions with expansive forested area is best explained by VPD alone, while in ecoregions dominated by grass and/or shrubs it is best explained by precipitation alone. In forested ecoregions, burned area is highly sensitive to VPD. An important component of the GLMs accounted for how previous burns affect the potential for, and characteristics of, future burns. Two variables, tau and gamma, quantify this feedback: tau represents how many years the vegetation takes to effectively recover, and gamma represents the reduction in the amount of area that can subsequently burn, either through fuel removal or by impeding the development of larger fires.
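The fire-fuel feedback described above can be sketched in code. The log-link GLM form, the coefficient values, and the way the feedback discounts burnable area below are illustrative assumptions for exposition, not the model actually fitted by Verisk:

```python
# Illustrative sketch of an ecoregion burned-area GLM with a fire-fuel
# feedback: tau = years for vegetation to effectively recover,
# gamma = fractional reduction in burnable area from recent burns.
# Coefficients, VPD inputs, and the feedback form are hypothetical.

import math

TAU = 15      # recovery time in years (value assumed in the study)
GAMMA = 0.5   # illustrative reduction factor for recently burned area

def glm_mean_burned_area(vpd, b0=-3.0, b1=1.0):
    """Log-link GLM: expected burned fraction as a function of VPD.
    b0 and b1 are hypothetical coefficients."""
    return math.exp(b0 + b1 * vpd)

def feedback_adjusted_series(vpd_series):
    """Area burned within the last TAU years reduces the area
    available to burn in the current year by a factor GAMMA."""
    burned = []
    for t, vpd in enumerate(vpd_series):
        recent = sum(burned[max(0, t - TAU):t])     # burns still recovering
        available = max(0.0, 1.0 - GAMMA * recent)  # fraction still burnable
        burned.append(glm_mean_burned_area(vpd) * available)
    return burned

# Hypothetical annual VPD anomalies; the high-VPD year (0.9) burns most,
# and later years burn less than the raw GLM mean because of the feedback.
series = feedback_adjusted_series([0.4, 0.5, 0.9, 0.6, 0.5])
```

Note how the last year (VPD 0.5) burns less than the second year (also VPD 0.5) purely because more recently burned area is still recovering, which is the qualitative behavior tau and gamma are meant to capture.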
The latest climate model output was examined to understand how VPD and precedent precipitation may change in the different ecoregions through mid-century under four climate change scenarios: SSP1-2.6, SSP2-4.5, SSP3-7.0, and SSP5-8.5. Climate output from six CMIP6 models was used. The GCM projections were applied to the ecoregion-specific GLMs to define targets, which were then used to resample the 10,000-year catalog from the Verisk Wildfire Model for the United States.
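The target calculation can be illustrated as follows. Under a log-link GLM, a projected change in VPD multiplies expected burned area by a simple exponential factor; the coefficient and the per-GCM VPD changes below are hypothetical, with the median taken across models as described in the text:

```python
# Illustrative burned-area "target" for one ecoregion: under a log-link
# GLM, a VPD change dVPD scales expected burned area by exp(b1 * dVPD).
# The coefficient and per-GCM VPD changes are hypothetical values.

from statistics import median

B1_VPD = 1.0  # hypothetical GLM coefficient on VPD for this ecoregion

# Hypothetical mid-century VPD changes (kPa) from six CMIP6 GCMs
# under one SSP scenario:
dvpd_by_gcm = [0.12, 0.20, 0.15, 0.25, 0.10, 0.18]

def burned_area_scaling(dvpd, b1=B1_VPD):
    """Multiplicative change in expected burned area for a VPD change."""
    # exp(b1 * dvpd), written without importing math for clarity
    return 2.718281828459045 ** (b1 * dvpd)

# Median view across the six GCMs, as described in the text.
median_dvpd = median(dvpd_by_gcm)
target_scaling = burned_area_scaling(median_dvpd)
```

With these made-up numbers the median VPD change is 0.165 kPa, giving a burned-area scaling factor of roughly 1.18, i.e., an 18% increase for that ecoregion; the real targets come from the fitted GLM coefficients and actual GCM output.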
Results show that the largest changes in projected burned area will likely occur in the forested regions of the Northwest (Washington and Oregon) and the Intermountain region (Montana, Utah, Colorado, and northern California). The burned area projections are consistent with those from other studies.
Loss changes at 2030 show little variation across the climate scenarios, with overall industry average annual losses increasing by approximately 25%. At 2050 there is more variation across the scenarios; under SSP2-4.5, average annual loss is expected to increase by 60% overall. Geographically, the variation in losses is greater. In California, which currently contributes two-thirds of the industry annual loss, increases of approximately 20% and 50% are anticipated at 2030 and 2050, respectively, under SSP2-4.5, with larger changes in Montana, Oregon, Washington, and Colorado and smaller changes in Arizona, Texas, and Oklahoma. In future years, California will remain the largest contributor to countrywide industry loss, owing largely to its exposure and already high risk. However, Arizona and Texas, today's second- and third-highest loss-producing states, will likely be replaced by Colorado and Washington, respectively. Additional details, including county-level changes, are presented earlier in this document.
Despite the findings, there is considerable uncertainty. First and foremost is uncertainty about how the climate will evolve. The IPCC scenarios are not forecasts, and it is therefore not appropriate to assign probabilities to them, at least not preferentially; from that standpoint, they can be treated as equally plausible. Although SSP5-8.5 is the most pessimistic scenario, it is the one that we as a global society have been tracking over the last 15 years. The Wildfire Climate Projections for the U.S. address this uncertainty to some degree by considering both a low (SSP1-2.6) and a high (SSP5-8.5) scenario.
Verisk scientists used output from six different GCMs because individual models rarely agree at a regional (state) level; the best strategy was therefore to take a median view (although one way to quantify the GCM uncertainty would be to create targets from each model separately). The six were chosen based on their ability to reproduce the current climate and do not necessarily span the possible range of future climates. The GCM-based uncertainty and the resampling-based uncertainty discussed earlier are two kinds of uncertainty; a few others are worth mentioning, along with some further limitations of this study.
One other type of uncertainty stems from the historical data informing the GLMs developed for each ecoregion. Although burned area data were available for the period 1984-2020, there were not 37 independent data points for each ecoregion-specific generalized linear model (GLM). As part of the GLM development, Verisk scientists considered fire-fuel feedbacks in terms of how long a burned area remains unavailable to burn. A value of tau = 15 years was assumed, a reasonable median value supported by published studies as well as empirical evidence. Thus, each “data point” was really a 15-year time series, and only 24 independent pieces of data could be used.
It is possible that a few additional data points could change the VPD-burned area relationships, as evidenced by recent activity. The only way to have incorporated them would have been to use a smaller value of tau, although that would have been somewhat inconsistent with published values. Without conducting what-if experiments it is hard to quantify this uncertainty, but some informal checks indicate that the effect is relatively small, even for ecoregions in California with very active years in 2017, 2018, and 2020. Additional confidence that the relationships are accurate comes from the fact that the 2050 burned area targets for key ecoregions agree between these results and those in NRC (2011).
As the climate changes, vegetation may well acclimate, adapt, or change location, e.g., shift its period of leaf display or migrate northward. While many studies have addressed climate change-induced migration of species, and GCMs even attempt to model it, it is hard to say whether or how migration will keep pace with climate change itself. Also, the plant community that succeeds a fire could differ from the pre-fire community and produce different amounts or types of fuels for future fires. Such a change could affect how vegetation recovers from fire (shifting the tau variable) as well as the basic GLM relationships used. This uncertainty is one reason these Climate Change Projections do not go beyond 2050: the underlying fuel types could be different in the latter half of this century.
Another type of uncertainty is associated with changes in the conditions that enhance or inhibit wildfire. The underlying premise of the VPD-burned area relationships is that the climate provides the environment (e.g., fuel supply) and that fires will ignite at the same frequency and from the same causes as they do now (e.g., lightning, power line faults, human carelessness). That could certainly change. This study did not address how lightning frequency, intensity, or geographic and altitudinal distribution may change (that topic alone is worthy of a separate study). Power lines could also become more resilient to high winds and to ice and snow accretion, for example.
Two other types of uncertainty relate to the catalog used for resampling. Most Verisk clients currently license the 10,000-year catalog rather than the 100,000-year catalog from the Verisk Wildfire Model for the United States, so creating a set of 10K files by resampling the 100K catalog would include many events that clients could not process or use. However, the 10K catalog limits the choice of events: Verisk's 100K catalog contains larger events that could occur more frequently in the future. This choice was a concession in favor of client utility. Related to the resampling approach in general, there is some uncertainty in occurrence losses at higher return periods because events of a kind that cannot occur in the current climate are excluded. While these limitations can affect even the AAL, the impact is unlikely to be more than a few percent of the estimated change in AAL.
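The idea of resampling an existing catalog to meet a burned-area target can be sketched as follows. The exponential-tilting weight scheme here is an illustrative stand-in, not Verisk's actual resampling method, and the catalog values are hypothetical:

```python
# Minimal sketch: reweight and resample an existing event catalog so the
# mean burned area meets a climate-conditioned target. Exponential
# tilting toward larger events is an illustrative choice; the catalog
# values and the +30% target are hypothetical.

import math
import random

catalog = [10, 25, 5, 40, 15, 30, 8, 50, 12, 20]   # burned area per catalog year
target_mean = 1.3 * sum(catalog) / len(catalog)    # e.g., a +30% target

def tilted_mean(values, theta):
    """Weighted mean with exponential tilting toward larger events."""
    vmax = max(values)
    w = [math.exp(theta * v / vmax) for v in values]
    return sum(wi * v for wi, v in zip(w, values)) / sum(w)

# Increase the tilt until the weighted catalog meets the target.
theta = 0.0
while tilted_mean(catalog, theta) < target_mean:
    theta += 0.01

# Draw a climate-conditioned catalog using the tilted weights. Note that
# only events already present in the catalog can appear: events that do
# not occur in the current-climate catalog can never be sampled, which
# is the limitation discussed above.
random.seed(0)
vmax = max(catalog)
weights = [math.exp(theta * v / vmax) for v in catalog]
conditioned = random.choices(catalog, weights=weights, k=1000)
```

Because resampling can only reweight what already exists, the tail of the conditioned catalog is bounded by the largest events in the source catalog, which is why occurrence losses at high return periods carry extra uncertainty.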
Also, while not sources of uncertainty in themselves, some factors were held constant, primarily to isolate the impact of climate change. For example, exposure was not changed. On one hand, the study may therefore underestimate future loss, as exposure is unlikely to remain constant. On the other hand, damage functions were not modified, and the existing industry losses in the 10,000-year catalog were used without adjustment for future changes in building codes or improved resiliency. Verisk fully expects that new construction will be hardened against embers (the main cause of house ignition) and that mitigation measures such as defensible space will be more widely adopted. Future damages for ‘the same’ fire might therefore be lower, partly offsetting increased exposure. Without more wildfire-resistant construction and the implementation of defensible space, however, the future looks like a time of significantly greater risk. The insurance industry can proactively support policy on better construction codes and enforcement that will reduce future risk.
In closing, despite the uncertainties and limitations, the Climate Change Projections for the Verisk Wildfire Model for the U.S. provide very useful guidance on how climate change will likely affect wildfire risk in the Western U.S. While some of the projected loss changes are large, at least from a hazard perspective they agree with prior results obtained by other researchers using different methods and different data. And, especially given the potential damage, it is better to have quantitative information now on how changes in the hazard might change losses, giving us all time to react and protect against the projected losses.