Hurricane Ian left an extraordinarily wide path of destruction across much of South Florida. That was evident in ground reports, and it also appears in satellite data. Using a new method, our team of spatial and environmental analysts was able to quickly provide a rare overview of damage across the state.
Using pre-storm satellite imagery, near-real-time imagery from four satellite sensors and artificial intelligence, we created a disaster monitoring system that can map damage at 30-meter resolution and continuously update that data.
It’s a preview of what faster, more targeted disaster monitoring might look like in the future – and something that could eventually be rolled out nationwide.
How artificial intelligence spots damage
Satellites are already being used to identify areas at high risk of floods, forest fires, landslides and other disasters, and to locate damage after these disasters. But most satellite disaster management approaches rely on visual assessment of the latest images, one neighborhood at a time.
Our technique automatically compares pre-storm images with current satellite images to quickly spot anomalies over large areas. These anomalies can be sand or water where sand or water shouldn’t be, or badly damaged roofs that no longer match their pre-storm appearance. Each area with a significant mismatch is highlighted in yellow.
Five days after Ian struck Florida, the map showed yellow alert polygons across South Florida. We found that it could detect damaged areas with around 84% accuracy.
A natural disaster like a hurricane or tornado often leaves behind large areas of spectral change on the land surface – changes in the way light reflects off whatever is on it, such as houses, ground or water. Our algorithm compares the reflectance predicted by models built from pre-storm images with the reflectance actually observed after the storm.
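The core of this kind of change detection can be sketched in a few lines. The example below is a minimal illustration, not the authors' actual algorithm: it assumes synthetic three-band reflectance stacks standing in for pre- and post-storm imagery, computes a per-pixel spectral change magnitude, and flags pixels whose change exceeds an arbitrary threshold.

```python
import numpy as np

# Hypothetical 3-band reflectance stacks (bands, rows, cols), values in [0, 1].
# In practice these would come from Landsat or Sentinel-2 surface reflectance tiles.
rng = np.random.default_rng(42)
pre_storm = rng.uniform(0.0, 0.3, size=(3, 100, 100))
post_storm = pre_storm.copy()
post_storm[:, 40:60, 40:60] += 0.25  # simulate brightening from newly exposed sand

# Per-pixel change magnitude: Euclidean distance across bands between
# the expected pre-storm reflectance and the observed post-storm reflectance.
change_magnitude = np.sqrt(((post_storm - pre_storm) ** 2).sum(axis=0))

# Pixels whose spectral change exceeds a threshold are flagged as anomalies;
# contiguous flagged regions would become the yellow polygons on the map.
threshold = 0.2
anomaly_mask = change_magnitude > threshold
n_flagged = int(anomaly_mask.sum())
```

Here only the simulated 20-by-20 "damaged" block changes between the two stacks, so only those pixels end up in the anomaly mask; the threshold value would need tuning against verified damage reports in any real application.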
The system detects both changes in the physical properties of natural areas, such as changes in moisture or brightness, and the overall intensity of the change. An increase in brightness is often linked to sand or bare earth exposed by hurricane damage.
Using a machine learning model, we can turn these comparisons into disturbance probabilities, which measure how likely it is that a natural disaster altered the land surface. This approach allows us to automate disaster mapping and provide comprehensive coverage of an entire state as soon as satellite data is released.
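One simple way to produce such a disturbance probability is to train a classifier on per-pixel change features labeled with verified damaged and undamaged locations. The sketch below is an assumed setup with synthetic features (change magnitude and brightness increase) and a logistic regression model, not the authors' published method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-pixel change features: [change magnitude, brightness increase],
# labeled 0 (undamaged) or 1 (damaged) from manually verified sites.
rng = np.random.default_rng(0)
n = 500
undamaged = rng.normal(loc=[0.05, 0.0], scale=0.03, size=(n, 2))
damaged = rng.normal(loc=[0.40, 0.20], scale=0.05, size=(n, 2))
X = np.vstack([undamaged, damaged])
y = np.concatenate([np.zeros(n), np.ones(n)])

model = LogisticRegression().fit(X, y)

# Disturbance probability for a new pixel showing a large, bright change.
prob = model.predict_proba([[0.45, 0.25]])[0, 1]
```

Because the two synthetic classes are well separated, a pixel with a large, bright change receives a probability close to 1; applied statewide, such per-pixel probabilities can be thresholded and mapped as soon as new satellite data arrives.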
The system uses data from four satellites: Landsat 8 and Landsat 9, both operated by NASA and the US Geological Survey, and Sentinel-2A and Sentinel-2B, launched as part of the European Commission’s Copernicus programme.
Real-time, nationwide monitoring
Extreme storms accompanied by destructive flooding have been documented with increasing frequency over large parts of the globe in recent years.
While disaster response teams can rely on aircraft and drone surveillance to locate damage in small areas, it is much harder to get the big picture of a widespread disaster like a hurricane or other tropical cyclone, and time is of the essence. Our system offers a quick way to get that big picture using free government-produced imagery. A current downside is timing: these images are often released to the public only a few days after a disaster.
We are currently working to develop near-real-time monitoring of the entire contiguous United States, to quickly provide the most up-to-date ground information for the next natural disaster.
Zhe Zhu, Assistant Professor of Natural Resources and Environment, University of Connecticut and Su Ye, postdoctoral researcher in environment and remote sensing, University of Connecticut
This article is republished from The Conversation under a Creative Commons license. Read the original article.