EXPLAIN IT TO ME – FACING REMOTE SENSING CHALLENGES IN THE BIO- AND GEOSCIENCES WITH EXPLAINABLE MACHINE LEARNING
- 1 Institute of Geodesy and Geoinformation, University of Bonn, Germany
- 2 Institute of Computer Science, University of Osnabrueck, Germany
- 3 Institute for Numerical Simulation, University of Bonn, Germany
- 4 Department of Electrical and Computer Engineering, University of Massachusetts Amherst, USA
- 5 Fraunhofer Center for Machine Learning and Fraunhofer SCAI, Sankt Augustin, Germany
Keywords: Machine Learning, Explainability, Interpretability, Geosciences, Biosciences
Abstract. For some time now, machine learning methods have been indispensable in many application areas. Especially with the recent development of efficient neural networks, these methods are increasingly used in the sciences to obtain scientific outcomes from observational or simulated data. Besides high accuracy, a desired goal is to learn explainable models. In order to reach this goal and obtain explanations, knowledge from the respective domain is necessary, which can be integrated into the model or applied post-hoc. We discuss explainable machine learning approaches which are used to tackle common challenges in the bio- and geosciences, such as limited amounts of labeled data or the provision of reliable and scientifically consistent results. We show that recent advances in machine learning to enhance transparency, interpretability, and explainability are helpful in overcoming these challenges.