Ocean temperature differences fuel U.S. wildfires

Difference in water temperature between Pacific and Atlantic oceans found to affect risk of drought and wildfire in southwestern North America.

Rodney Mestas and Zack Coleman

July 28, 2017

An international team of climate researchers from the U.S., South Korea and the U.K. has developed a new wildfire and drought prediction model for southwestern North America. Extending far beyond current seasonal forecasts, the study, published in the journal Scientific Reports, could benefit regional economies through a variety of applications in agriculture, water management and forestry.

Over the past 15 years, California and neighboring regions have experienced heightened drought conditions and an increase in the number of wildfires, with considerable effects on people's livelihoods, agriculture and terrestrial ecosystems.

This new research shows that in addition to a discernible contribution from natural forces and human-induced global warming, the large-scale difference between Atlantic and Pacific ocean temperatures plays a fundamental role in causing droughts and enhancing wildfire risk.

"Our results document that a combination of processes is at work. Through an ensemble modeling approach, we were able to show that without anthropogenic effects, the droughts in the southwestern U.S. would have been less severe," said co-author Axel Timmermann, director of the newly founded IBS Center for Climate Physics within the Institute for Basics Science (IBS), as well as a distinguished professor at Pusan National University in South Korea. "By prescribing the effects of manmade climate change and observed global ocean temperatures, our model can reproduce the observed shifts in weather patterns and wildfire occurrences."

The new findings show that a warm Atlantic and a relatively cold Pacific enhance the risk for drought and wildfires in the southwestern U.S.
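The study itself relies on coupled climate model simulations, but the basic quantity, an Atlantic-minus-Pacific temperature contrast, can be illustrated with a simple index. The Python sketch below builds a toy version from synthetic sea-surface temperatures; the basin boxes, grid and data are assumptions for illustration, not the definitions used by the authors.

```python
import numpy as np

# Illustrative only: synthetic monthly SST fields on a 1-degree grid.
# The basin boxes below are rough stand-ins, not the regions used in the study.
rng = np.random.default_rng(0)
lat = np.arange(-89.5, 90.0, 1.0)     # grid-cell centers, degrees north
lon = np.arange(0.5, 360.0, 1.0)      # degrees east
n_months = 120
sst = 15.0 + rng.normal(0.0, 0.5, size=(n_months, lat.size, lon.size))

# Cosine-of-latitude weights approximate grid-cell area on a regular lat/lon grid.
weights = np.cos(np.deg2rad(lat))[:, None] * np.ones(lon.size)

def box_mean(field, lat_min, lat_max, lon_min, lon_max):
    """Area-weighted mean SST over a lat/lon box, for each time step."""
    lat_mask = (lat >= lat_min) & (lat <= lat_max)
    lon_mask = (lon >= lon_min) & (lon <= lon_max)
    sub = field[:, lat_mask][:, :, lon_mask]
    w = weights[lat_mask][:, lon_mask]
    return (sub * w).sum(axis=(1, 2)) / w.sum()

# Rough North Atlantic and Pacific boxes (hypothetical choices for illustration).
atlantic = box_mean(sst, 0, 60, 280, 350)
pacific = box_mean(sst, -10, 50, 150, 250)

# Atlantic-minus-Pacific temperature difference, expressed as anomalies:
# positive values stand for a relatively warm Atlantic and cool Pacific.
index = atlantic - pacific
index_anom = index - index.mean()
print(index_anom[:12])
```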

"According to our study, the Atlantic/Pacific temperature difference shows pronounced variations on time scales of more than five years. Like swings of a very slow pendulum, this implies that there is predictability in the large-scale atmosphere/ocean system, which we expect will have a substantial societal benefit," explained Yoshimitsu Chikamoto, lead author of the study and assistant professor at the University of Utah in Logan.

The new drought and wildfire predictability system developed by the researchers expands beyond the typical time scale of seasonal climate forecast models -- used in El Niño predictions, for instance. It was tested with lead times of 10-23 months for wildfires and 10-45 months for droughts.

"Of course, we cannot predict individual rainstorms in California and their local impacts months or seasons ahead, but we can use our climate computer model to determine whether, on average, the next year will have drier or wetter soils or more or less wildfires. Our yearly forecasts are far better than chance," noted Lowell Stott, co-author of the study from the University of Southern California in Los Angeles.

Bringing together observed and simulated measurements of ocean temperatures, atmospheric pressure, soil water and wildfire occurrences, the researchers have a powerful tool in their hands that they are willing to test in other global regions.

"Using the same climate model configuration, we will also study the soil water and fire risk predictability in other parts of our world, such as the Mediterranean, Australia or parts of Asia," Timmermann concluded. "Our team is looking forward to developing new applications with stakeholder groups that can benefit from better soil water forecasts or assessments in future fire risk."
