ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume II-3/W5
https://doi.org/10.5194/isprsannals-II-3-W5-187-2015
19 Aug 2015

ASSESSMENT OF THE THEMATIC ACCURACY OF LAND COVER MAPS

J. Höhle

Keywords: Land Cover Map, Urban, Classification, Machine Learning, Assessment, Accuracy, Confidence Interval

Abstract. Several land cover maps are generated from aerial imagery and assessed by different approaches. The test site is an urban area in Europe for which six classes (‘building’, ‘hedge and bush’, ‘grass’, ‘road and parking lot’, ‘tree’, ‘wall and car port’) had to be derived. Two classification methods were applied (‘Decision Tree’ and ‘Support Vector Machine’) using only two attributes (height above ground and normalized difference vegetation index), both of which are derived from the images. The assessment of the thematic accuracy used a stratified sampling design and was based on accuracy measures such as user’s and producer’s accuracy and the kappa coefficient. In addition, confidence intervals were computed for several accuracy measures. The achieved accuracies and confidence intervals are thoroughly analysed, and recommendations are derived from the experience gained. Reliable reference values are obtained using stereovision, false-colour image pairs, and positioning to the checkpoints with 3D coordinates. The influence of the training areas on the results is studied. Cross-validation was tested with a few reference points in order to derive approximate accuracy measures. The two classification methods perform equally well for five classes. Trees are classified with much higher accuracy and a smaller confidence interval by the decision tree method. Buildings are classified by both methods with an accuracy of 99% (95% CI: 95%-100%) using independent 3D checkpoints. The average width of the confidence interval over the six classes was 14% of the user’s accuracy.
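
As a minimal sketch of the classification step described in the abstract, the following Python fragment trains the two classifiers on the two image-derived attributes, height above ground (nDSM) and NDVI. The use of scikit-learn, the feature values, and the training labels are assumptions made for illustration; the paper does not state which software or parameter settings were used.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training samples: one row per pixel/segment with the two
# attributes named in the abstract.
#   column 0: height above ground (nDSM, metres)
#   column 1: normalized difference vegetation index (NDVI)
X = np.array([
    [8.2, 0.05],   # high, non-vegetated  -> 'building'
    [0.1, 0.65],   # low, vegetated       -> 'grass'
    [9.5, 0.72],   # high, vegetated      -> 'tree'
    [0.0, 0.02],   # low, non-vegetated   -> 'road and parking lot'
])
y = np.array(['building', 'grass', 'tree', 'road and parking lot'])

dt = DecisionTreeClassifier(random_state=0).fit(X, y)
svm = SVC(kernel='rbf', gamma='scale').fit(X, y)

sample = np.array([[10.3, 0.68]])               # tall and vegetated
print(dt.predict(sample), svm.predict(sample))  # both expected: 'tree'
```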
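The accuracy measures and confidence intervals named in the abstract can be computed from a confusion matrix of mapped versus reference labels. The sketch below is an illustration rather than the paper's actual procedure: it derives user's and producer's accuracy, overall accuracy, and the kappa coefficient, and attaches a 95% Wilson score interval to a per-class proportion. The Wilson interval and the row/column convention of the matrix are assumptions, since the abstract does not name the interval formula used.

```python
import numpy as np

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * np.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

def accuracy_measures(cm):
    """User's/producer's accuracy per class, overall accuracy, and kappa.

    cm: square confusion matrix; rows = mapped class, columns = reference
    (this row/column convention is an assumption; conventions vary).
    """
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    diag = np.diag(cm)
    users = diag / cm.sum(axis=1)       # correctness of the mapped labels
    producers = diag / cm.sum(axis=0)   # completeness w.r.t. the reference
    overall = diag.sum() / n
    p_e = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2  # chance agreement
    kappa = (overall - p_e) / (1.0 - p_e)
    return users, producers, overall, kappa

# Hypothetical check: 99 of 100 building checkpoints classified correctly
# gives a Wilson interval of roughly 95%-100%, consistent in magnitude with
# the building result quoted in the abstract.
lo, hi = wilson_ci(k=99, n=100)
print(f"{lo:.3f} - {hi:.3f}")   # ~0.946 - 0.998
```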