COASTAL HABITAT MAPPING WITH UAV MULTI-SENSOR DATA: AN EXPERIMENT AMONG DCNN-BASED APPROACHES
Keywords: deep learning, semantic segmentation, UAV, multi-sensor data, data fusion, environmental monitoring
Abstract. With the recent abundance of high-resolution multi-sensor UAV data and the rapid development of deep learning models, efficient automatic mapping using deep neural networks has become a common approach. However, with ever-expanding inventories of both data and deep neural network models, choosing an appropriate combination can be difficult. Most models expect conventional RGB input, but this can be extended to incorporate multi-sensor data. In this study, we re-implement and modify three deep neural network models of varying complexity, namely UNET, DeepLabv3+ and the Dense Dilated Convolutions Merging Network (DDCM), to use both RGB and near-infrared (NIR) data from a multi-sensor UAV dataset over a Norwegian coastal area. The dataset has been carefully annotated for coastal habitats by marine experts. We find that the NIR channel significantly increases UNET performance but has mixed effects on DeepLabv3+ and DDCM; the latter two achieve their best performance with RGB data alone. The class-wise evaluation shows that the NIR channel greatly improves UNET performance for the green algae, red algae, vegetation and rock classes. However, the purpose of the study is not merely to compare the models or to achieve the best performance, but to gain more insight into the compatibility between various models and data types. As the effort to acquire and annotate more data is ongoing, we aim to include additional data in the coming year.
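A minimal sketch of the kind of modification described above, extending an RGB-only network to accept a fourth NIR input channel. This is an illustrative assumption, not the authors' actual implementation: the function names are hypothetical, arrays are plain NumPy, and the mean-of-RGB-filters initialization is one common heuristic for the new channel, not necessarily the one used in the paper.

```python
import numpy as np

def stack_rgb_nir(rgb, nir):
    """Stack an (H, W, 3) RGB image and an (H, W) NIR band into an
    (H, W, 4) multi-sensor input. Hypothetical helper for illustration."""
    return np.concatenate([rgb, nir[..., None]], axis=-1)

def expand_first_conv(weight):
    """Expand a first-layer conv weight of shape (out_ch, 3, k, k) to
    (out_ch, 4, k, k). The new NIR channel is initialized with the mean
    of the pretrained RGB filters (one common heuristic; an assumption
    here, not taken from the paper)."""
    nir_filters = weight.mean(axis=1, keepdims=True)  # (out_ch, 1, k, k)
    return np.concatenate([weight, nir_filters], axis=1)

# Dummy data standing in for a UAV image tile and a pretrained first layer.
rgb = np.random.rand(64, 64, 3).astype(np.float32)
nir = np.random.rand(64, 64).astype(np.float32)
x = stack_rgb_nir(rgb, nir)           # shape (64, 64, 4)

w_rgb = np.random.randn(16, 3, 3, 3).astype(np.float32)
w_rgbn = expand_first_conv(w_rgb)     # shape (16, 4, 3, 3)
```

The rest of the network is unchanged; only the first convolution needs a wider input, which is why the choice of backbone (UNET vs. DeepLabv3+ vs. DDCM) can react differently to the extra channel.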