Artificial Intelligence: AI And Satellite Data Help Spot Areas Where People May Be Trapped After A Hurricane
New satellite mapping with AI can quickly pinpoint hurricane damage across an entire state to spot where people may be trapped
Researchers at the University of Connecticut have developed a new method that uses satellite images from before the storm and real-time images from four satellite sensors, together with artificial intelligence, to create a disaster monitoring system. The system can spot damage at a resolution of 30 meters and updates continuously as new imagery arrives. (The Conversation)
How it works
Though conventional disaster monitoring also relies on satellite imagery to spot the hardest-hit areas, it is usually localized and must be analyzed visually – a slow and laborious process.
“Our technique automatically compares pre-storm images with current satellite images to spot anomalies quickly over large areas,” write Zhe Zhu, Assistant Professor of Natural Resources and the Environment, University of Connecticut, and Su Ye, postdoctoral researcher in environment and remote sensing, University of Connecticut. “Those anomalies might be sand or water where that sand or water shouldn’t be, or heavily damaged roofs that don’t match their pre-storm appearance.”
The AI algorithm compares the “reflectance” (how light reflects off whatever is there, such as houses, ground or water) in models built from pre-storm images with the reflectance observed after the storm. An increase in brightness often indicates exposed sand or bare land resulting from hurricane damage.
This data helps measure the impact of a natural disaster on land surfaces. Each area with a significant before-and-after anomaly is flagged in yellow.
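The core idea – comparing pre-storm reflectance against post-storm observations and flagging large deviations – can be illustrated with a minimal sketch. This is a hypothetical simplification for illustration only, not the researchers' actual model: it uses a simple z-score threshold on per-pixel reflectance differences, whereas the real system builds time-series models from multiple sensors.

```python
import numpy as np

def flag_anomalies(pre_reflectance, post_reflectance, z_thresh=3.0):
    """Flag pixels whose post-storm reflectance deviates strongly from
    the pre-storm baseline. A hypothetical stand-in for the article's
    model-based change detection: here the 'model' is just the
    pre-storm image itself, and anomalies are large z-scored changes."""
    diff = post_reflectance - pre_reflectance
    z = (diff - diff.mean()) / diff.std()
    return np.abs(z) > z_thresh

# Toy scene on a 30 m grid: uniform pre-storm reflectance, plus one
# block that brightens after the storm (e.g. newly exposed sand).
pre = np.full((100, 100), 0.20)
post = pre.copy()
post[40:50, 40:50] = 0.60   # brightness jump in a damaged block

mask = flag_anomalies(pre, post)
print(mask.sum())  # count of flagged pixels (the damaged block)
```

In a real workflow, the flagged mask would then be converted to polygons (the yellow areas the article describes) and overlaid on a map for responders.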
“This approach allows us to automate disaster mapping and provide full coverage of an entire state as soon as the satellite data is released,” claim the researchers. “It provides a fast approach using free government-produced images to see the big picture.”
Their system identified patches of damage with about 84% accuracy five days after Hurricane Ian struck Florida, highlighting damaged areas as yellow polygons across South Florida.
The team from the University of Connecticut is now developing near real-time monitoring of the whole conterminous United States to quickly provide the most up-to-date land information for the next natural disaster.
Image credit: Flickr