Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11851/12148
Title: Multi-Class Certainty Mapped Network for High Precision Segmentation of High-Altitude Imagery
Authors: Oğuz, T.; Akgün, T.
Keywords: Damage Assessment; Deep Learning; High-Altitude Imagery; Natural Disaster; Segmentation; Uncertainty Aware
Publisher: SPIE
Abstract: Satellites and high-altitude unmanned aerial vehicles are the leading platforms for electro-optical remote sensing in both civilian and military applications. Since the early 2000s, high-altitude electro-optical remote sensing platforms have been actively used for real-time and offline damage assessment following natural disasters such as earthquakes, floods, and landslides. High-accuracy, multi-class automated object segmentation is one of the key processing blocks that makes such applications practical. Given the typical distances between target areas and high-altitude sensing platforms (tens to thousands of kilometers), as well as the critical nature of the resulting assessments, the accuracy of segmentation maps is of key interest. In this work, we present the Multi-Class Certainty Mapped Network (MCCM-Net), which uses multi-class per-pixel uncertainty to enhance segmentation performance. MCCM-Net explicitly models multi-class uncertainty as the entropy of the class probability distribution. Pixel-level uncertainty is then used to iteratively enhance segmentation maps. Our experiments on publicly available benchmark datasets show that MCCM-Net provides state-of-the-art multi-class pixel-level segmentation performance. © 2025 SPIE.
Description: Taiwan Space Agency (TASA); The Society of Photo-Optical Instrumentation Engineers (SPIE)
URI: https://doi.org/10.1117/12.3041825 ; https://hdl.handle.net/20.500.11851/12148
ISBN: 9781510682689
ISSN: 0277-786X
Appears in Collections: Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
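The abstract states that MCCM-Net models multi-class per-pixel uncertainty as the entropy of the class probability distribution. The paper's actual network and iterative refinement scheme are not reproduced here; the following is only a minimal, illustrative sketch of that entropy measure for a single pixel's class probabilities, with the function name and clipping constant chosen for illustration:

```python
import math

def pixel_entropy(probs, eps=1e-12):
    """Shannon entropy (in nats) of one pixel's class probability
    distribution -- an illustrative sketch of the per-pixel uncertainty
    measure described in the abstract, not the paper's implementation.

    probs: per-class probabilities for one pixel (non-negative, sums to 1).
    eps:   clip value to avoid log(0) for zero-probability classes.
    """
    return -sum(max(p, eps) * math.log(max(p, eps)) for p in probs)

# A fully confident pixel has (near-)zero entropy; a uniform
# distribution over C classes attains the maximum, log(C).
confident = pixel_entropy([1.0, 0.0, 0.0])
uncertain = pixel_entropy([1 / 3, 1 / 3, 1 / 3])  # = log(3) ≈ 1.0986
```

Such a map, computed at every pixel, would flag regions where the segmentation is least reliable, which is the signal the abstract says is fed back to iteratively refine the segmentation maps.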