AUTOMATIC EXTRACTION OF SOLAR AND SENSOR IMAGING GEOMETRY FROM UAV-BORNE PUSH-BROOM HYPERSPECTRAL CAMERA
1 Geospatial Institute, Saint Louis University, 3694 West Pine Mall, St. Louis, MO 63108, USA
2 Department of Earth and Atmospheric Sciences, Saint Louis University, St. Louis, MO 63108, USA
3 Donald Danforth Plant Science Center, Saint Louis, MO 63132, USA
Keywords: unmanned aerial vehicle (UAV), plant phenotyping, remote sensing, zenith angle, azimuth angle
Abstract. Calculating solar-sensor zenith and azimuth angles for hyperspectral images collected by UAVs is important for bidirectional reflectance distribution function (BRDF) correction and radiative transfer modeling-based applications in remote sensing. These corrections are especially necessary for high-throughput phenotyping and precision agriculture tasks. This study demonstrates an automated Python framework that calculates the solar-sensor zenith and azimuth angles for a push-broom hyperspectral camera mounted on a UAV. First, the hyperspectral images were radiometrically and geometrically corrected. Second, the high-precision Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU) data for the flight path were extracted, and the corresponding UAV position for each pixel was identified. Finally, the angles were calculated using spherical trigonometry and linear algebra. The results show that the solar zenith angle (SZA) and solar azimuth angle (SAA) calculated by our method provided higher-precision angular values than other available tools. The viewing zenith angle (VZA) was lower near the flight path and higher near the edges of the images. The viewing azimuth angle (VAA) pattern showed higher values to the left and lower values to the right of the flight line. The methods described in this study are easily reproducible for other study areas and applications.
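The angle computations summarized in the abstract can be sketched in Python. The functions below are illustrative, not the authors' implementation: `solar_angles` estimates SZA and SAA from latitude, longitude, and UTC time using common spherical-trigonometry approximations (Cooper's declination formula and a simple equation-of-time fit), and `viewing_angles` derives VZA and VAA from the UAV-to-pixel view vector in a local east-north-up frame. All function names and the ENU coordinate convention are assumptions for this sketch.

```python
import math
from datetime import datetime, timezone

def solar_angles(lat_deg, lon_deg, when_utc):
    """Approximate solar zenith and azimuth (degrees, azimuth clockwise
    from north) via standard spherical-trigonometry formulas.
    A sketch only; production code would use a vetted ephemeris library."""
    doy = when_utc.timetuple().tm_yday
    hour = when_utc.hour + when_utc.minute / 60 + when_utc.second / 3600
    # Solar declination (Cooper's approximation, degrees)
    decl = 23.45 * math.sin(math.radians(360 * (284 + doy) / 365))
    # Equation of time (minutes), simple sinusoidal fit
    b = math.radians(360 * (doy - 81) / 364)
    eot = 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)
    # Local solar time and hour angle (degrees)
    solar_time = hour + (4 * lon_deg + eot) / 60
    hra = math.radians(15 * (solar_time - 12))
    lat = math.radians(lat_deg)
    d = math.radians(decl)
    cos_zen = (math.sin(lat) * math.sin(d)
               + math.cos(lat) * math.cos(d) * math.cos(hra))
    sza = math.degrees(math.acos(max(-1.0, min(1.0, cos_zen))))
    # Azimuth from south (westward positive), then rotated to north-clockwise
    az_south = math.degrees(math.atan2(
        math.sin(hra),
        math.cos(hra) * math.sin(lat) - math.tan(d) * math.cos(lat)))
    saa = (az_south + 180) % 360
    return sza, saa

def viewing_angles(uav_enu, pixel_enu):
    """Viewing zenith/azimuth (degrees) of a ground pixel as seen by the
    sensor, from the pixel-to-UAV vector in a local east-north-up frame."""
    de = uav_enu[0] - pixel_enu[0]   # east component
    dn = uav_enu[1] - pixel_enu[1]   # north component
    du = uav_enu[2] - pixel_enu[2]   # up component
    r = math.sqrt(de * de + dn * dn + du * du)
    vza = math.degrees(math.acos(du / r))
    vaa = math.degrees(math.atan2(de, dn)) % 360  # clockwise from north
    return vza, vaa
```

For example, near the equator at local solar noon on an equinox, `solar_angles` returns a zenith angle close to zero, and a pixel directly beneath the UAV yields a VZA of zero, matching the abstract's observation that VZA is lowest near the flight path.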