TY - JOUR
T1 - On a nonlinear extension of the principal fitted component model
AU - Song, Jun
AU - Kim, Kyongwon
AU - Yoo, Jae Keun
N1 - Funding Information:
We express our sincere gratitude to the Editor, the Associate Editor, and the two referees for their helpful comments and suggestions, which led to significant improvements in this article. The work of Jun Song was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2022R1C1C1003647, 2022M3J6A1063595). The work of Kyongwon Kim was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1F1A1046976). The work of Jae Keun Yoo was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean Ministry of Education (NRF-2021R1F1A1059844).
Publisher Copyright:
© 2023 Elsevier B.V.
PY - 2023/6
Y1 - 2023/6
N2 - We propose a nonlinear sufficient dimension reduction method, the kernel principal fitted component model, using the kernel method under a reproducing kernel Hilbert space. The kernel principal fitted component model is a nonlinear extension of the principal fitted component model, and it is founded on the theory of mapping a low-dimensional input space to a higher-dimensional feature space so that well-developed linear methods can be applied to nonlinear datasets. We show that our method coincides with generalized sliced inverse regression under some mild assumptions and that the dimension reduction subspace extracted by the kernel principal fitted component model is contained in the central class. In numerical experiments, we demonstrate that the kernel principal fitted component model with the Gaussian kernel extracts linear and nonlinear features well for models from both forward and inverse regression settings. By applying our method to an ovarian cancer microarray dataset, we show that the kernel principal fitted component model provides competitive prediction accuracy and computational efficiency in the high-dimensional classification problem.
AB - We propose a nonlinear sufficient dimension reduction method, the kernel principal fitted component model, using the kernel method under a reproducing kernel Hilbert space. The kernel principal fitted component model is a nonlinear extension of the principal fitted component model, and it is founded on the theory of mapping a low-dimensional input space to a higher-dimensional feature space so that well-developed linear methods can be applied to nonlinear datasets. We show that our method coincides with generalized sliced inverse regression under some mild assumptions and that the dimension reduction subspace extracted by the kernel principal fitted component model is contained in the central class. In numerical experiments, we demonstrate that the kernel principal fitted component model with the Gaussian kernel extracts linear and nonlinear features well for models from both forward and inverse regression settings. By applying our method to an ovarian cancer microarray dataset, we show that the kernel principal fitted component model provides competitive prediction accuracy and computational efficiency in the high-dimensional classification problem.
KW - Principal component model
KW - Principal fitted component model
KW - Reproducing kernel Hilbert space
KW - Sufficient dimension reduction
UR - http://www.scopus.com/inward/record.url?scp=85147799645&partnerID=8YFLogxK
U2 - 10.1016/j.csda.2023.107707
DO - 10.1016/j.csda.2023.107707
M3 - Article
AN - SCOPUS:85147799645
SN - 0167-9473
VL - 182
JO - Computational Statistics &amp; Data Analysis
JF - Computational Statistics &amp; Data Analysis
M1 - 107707
ER -