ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume V-2-2020
https://doi.org/10.5194/isprs-annals-V-2-2020-467-2020
03 Aug 2020

ADAPTABLE AUTOREGRESSIVE MOVING AVERAGE FILTER TRIGGERING CONVOLUTIONAL NEURAL NETWORKS FOR CHOREOGRAPHIC MODELING

I. Rallis, N. Bakalos, N. Doulamis, and A. Doulamis

Keywords: Deep learning, Dynamic Scene Analysis, Intangible Cultural Heritage, Choreographic Modeling

Abstract. Choreographic modeling, that is, the identification of key choreographic primitives, is a significant element of Intangible Cultural Heritage (ICH) performing-art modeling. Recently, deep learning architectures such as LSTMs and CNNs have been utilized for choreographic identification and modeling. However, such approaches are sensitive to capturing errors and fail to model the dynamic characteristics of a dance, since they assume stationarity between the input and output data. To address these limitations, in this paper we introduce an AutoRegressive Moving Average (ARMA) filter into a conventional CNN model; that is, the classification output is fed back to the input layer, improving overall classification accuracy. In addition, an adaptive implementation algorithm is introduced, exploiting a first-order Taylor series expansion, to update the network response so that it fits the dynamic characteristics of the dance. In this way, the network parameters (e.g., weights) are dynamically modified, further improving classification accuracy. Experimental results on real-life dance sequences indicate that the proposed approach outperforms conventional deep learning mechanisms.
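
The abstract describes a CNN classifier whose previous output is fed back to the input, forming an ARMA-style recurrence over the dance sequence. The sketch below is a minimal illustration of that feedback idea only, not the authors' implementation: the layer sizes, joint count, class count, and all names (FeedbackCNN, n_joints, n_classes) are illustrative assumptions.

# Minimal PyTorch sketch of a per-frame CNN whose previous softmax output is
# concatenated with the current skeleton features (ARMA-like feedback).
# All dimensions and names are assumptions for illustration, not the paper's model.
import torch
import torch.nn as nn

class FeedbackCNN(nn.Module):
    def __init__(self, n_joints=25, n_classes=8):
        super().__init__()
        # 1-D convolution over the joint coordinates of a single frame
        self.conv = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=3, padding=1),   # (x, y, z) channels
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # The classifier sees CNN features plus the previous output (feedback term)
        self.fc = nn.Linear(32 + n_classes, n_classes)
        self.n_classes = n_classes

    def forward(self, frames):
        # frames: (T, 3, n_joints) -- one skeleton per time step
        prev_out = torch.zeros(1, self.n_classes)          # y_{t-1}, initialised to zero
        outputs = []
        for t in range(frames.shape[0]):
            feat = self.conv(frames[t:t+1]).squeeze(-1)    # (1, 32) frame features
            logits = self.fc(torch.cat([feat, prev_out], dim=1))
            prev_out = torch.softmax(logits, dim=1).detach()   # feed y_t back at t+1
            outputs.append(logits)
        return torch.cat(outputs, dim=0)                   # (T, n_classes)

# Usage: classify a sequence of 100 frames with 25 joints each
model = FeedbackCNN()
seq = torch.randn(100, 3, 25)
scores = model(seq)

The adaptive weight update via a first-order Taylor series expansion described in the abstract is not shown here; the sketch only illustrates how feeding the classification output back to the input layer introduces the autoregressive term.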