Earth radiances measured as hyperspectral data contain useful information on atmospheric constituents and aerosol properties. The Geostationary Environment Monitoring Spectrometer (GEMS) is an environmental sensor that measures such hyperspectral data in the ultraviolet and visible spectral range over the Asia-Pacific region. After completion of the in-orbit test of GEMS in October 2020, bad pixels were identified as one of the remaining calibration issues, producing obvious spatial gaps in the measured radiances as well as in the retrieved properties. To address the root cause of the issue, this study reproduces the defective spectra with machine learning models based on an artificial neural network (ANN) and multivariate linear regression (Linear). The models are trained with defect-free GEMS measurements after dimensionality reduction with principal component analysis (PCA). Results show that the PCA-Linear model has small reproduction errors of 0.5 %–5 % for narrower spectral gaps and is less vulnerable to outliers. The PCA-ANN model, on the other hand, better emulates strongly non-linear relations, with an error of about 5 % except at the shorter wavelengths around 300 nm. It is demonstrated that dominant spectral patterns can be successfully reproduced by the models within the radiometric calibration accuracy of GEMS, although a limitation remains for finer spectral features. When the reproduced spectra are applied to cloud and ozone retrievals, cloud centroid pressure shows an error of around 1 %, while total ozone column density shows relatively higher variance. As an initial step toward reproducing spectral patterns for bad pixels, this study demonstrates the potential and limitations of machine learning methods for improving hyperspectral measurements from geostationary orbit.
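The PCA-Linear approach described above can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the spectra are synthetic stand-ins (not GEMS data), the channel grid, gap location, and component count are hypothetical, and the linear step is implemented as a least-squares fit of PCA scores from the working channels, a detail the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic stand-in for defect-free hyperspectral spectra (illustrative only) ---
n_train, n_chan = 500, 100
wav = np.linspace(300.0, 500.0, n_chan)          # nominal UV-visible grid (nm), assumed
basis = np.stack([np.sin(wav / s) for s in (20.0, 35.0, 60.0)])
coeffs = rng.normal(size=(n_train, 3))
spectra = coeffs @ basis + 0.01 * rng.normal(size=(n_train, n_chan))

# --- PCA via SVD on mean-centred training spectra ---
mean = spectra.mean(axis=0)
_, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
k = 5                                            # retained components (hypothetical)
pcs = vt[:k]                                     # (k, n_chan) principal components

# --- Simulate a bad-pixel gap and fill it from the working channels ---
bad = slice(40, 45)                              # hypothetical defective channels
good = np.ones(n_chan, dtype=bool)
good[bad] = False

test = rng.normal(size=3) @ basis                # a new, gap-free "truth" spectrum
# Least-squares fit of the PCA scores using only the working channels,
# then reconstruct the full spectrum, including the gap (the linear step).
scores, *_ = np.linalg.lstsq(pcs[:, good].T, (test - mean)[good], rcond=None)
filled = mean + scores @ pcs

err = np.abs(filled[bad] - test[bad]).max()
print(f"max absolute gap-fill error in the gap: {err:.4f}")
```

Because the synthetic spectra are dominated by a few smooth patterns, the retained components span them well and the gap is filled with a small residual; for real measurements the same scheme recovers only the variability captured by the leading components, consistent with the limitation on finer spectral features noted above.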
Funding information:
This research has been supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (grant no. 2018R1A6A1A08025520).
© 2023 Yeeun Lee et al.