The need for such tools — especially those that work quickly and simply — has grown with the sheer intensity of wildfires in recent years, driven in part by climate change and by the spread of residential development. California suffered a devastating fire season in 2018, with nearly 1.7 million acres burned across the state.
The Mendocino Complex fire, which started in July 2018 in Northern California, burned nearly 460,000 acres to become the largest wildfire on record in the state. The Camp Fire, also in the northern part of the state, became the state’s deadliest when it tore across more than 150,000 acres in November, killing 85 people and destroying nearly 19,000 structures.
Martha Witter, a fire ecologist with the National Park Service in Southern California, said it was crucial to reduce the lag time between when a fire starts and when firefighters can inform the public about potential evacuations.
She said that need is especially great in Southern California, where dried-out vegetation and accelerating development along the “wildland-urban interface” combine to make the region ripe for devastating fires.
“In Southern California, we have perfect conditions for a fire almost every single year,” Ms. Witter said. “So as these areas get developed, it just increases the number of targets.”
Max Moritz, a wildfire specialist affiliated with the University of California, Santa Barbara, who has spent his career studying fire management, said improvements in predictive modeling are crucial for fighting fires, especially as the fires become more intense. But he noted that predictive technologies will not change the underlying factors — urban development and a warming planet — that are making fires more destructive.
“We need better data and better models, but we also need better preparation,” Mr. Moritz said. “We also have to make headway on all the other fronts, if we really want resilient communities in the face of climate change.”