On a nonlinear extension of the principal fitted component model

Research output: Contribution to journal › Article › peer-review

Abstract

We propose a nonlinear sufficient dimension reduction method, called the kernel principal fitted component model, built with the kernel method in a reproducing kernel Hilbert space. The kernel principal fitted component model is a nonlinear extension of the principal fitted component model; it rests on the idea of mapping the low-dimensional input space into a higher-dimensional feature space so that well-developed linear methods can be applied to nonlinear datasets. We show that our method coincides with generalized sliced inverse regression under some mild assumptions, and that the dimension reduction subspace extracted by the kernel principal fitted component model is contained in the central class. In numerical experiments, we show that the kernel principal fitted component model with the Gaussian kernel extracts linear and nonlinear features well for models from both forward and inverse regression settings. By applying our method to an ovarian cancer microarray dataset, we demonstrate that the kernel principal fitted component model provides competitive prediction accuracy and computational efficiency in a high-dimensional classification problem.
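The core idea the abstract describes, mapping inputs into a high-dimensional feature space so that a linear method becomes nonlinear in the original space, can be sketched with ordinary kernel PCA using a Gaussian kernel. This is an illustrative sketch of the kernel trick, not the authors' kernel principal fitted component estimator; the helper names, the `gamma` value, and the toy data are all assumptions for demonstration.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    # Gaussian (RBF) kernel Gram matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    # Linear PCA performed in the RKHS via the Gram matrix amounts to
    # nonlinear feature extraction in the original input space.
    n = X.shape[0]
    K = rbf_gram(X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ K @ H                           # center features in the RKHS
    vals, vecs = np.linalg.eigh(Kc)          # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Sample scores on the leading kernel principal components
    return vecs * np.sqrt(np.maximum(vals, 0.0))

# Toy nonlinear data: two concentric rings that linear PCA cannot untangle
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 200)
r = np.repeat([1.0, 3.0], 100)
X = np.column_stack([r * np.cos(t), r * np.sin(t)])
X += 0.05 * rng.normal(size=X.shape)
Z = kernel_pca(X, n_components=2, gamma=1.0)
```

The fitted-component model in the paper additionally uses the response to fit the reduction (an inverse regression setting), whereas plain kernel PCA above is unsupervised; the sketch only shows the feature-space mapping that both share.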

Original language: English
Article number: 107707
Journal: Computational Statistics and Data Analysis
Volume: 182
DOIs
State: Published - Jun 2023

Bibliographical note

Publisher Copyright:
© 2023 Elsevier B.V.

Keywords

  • Principal component model
  • Principal fitted component model
  • Reproducing kernel Hilbert space
  • Sufficient dimension reduction
