## Abstract

The principal graphical model is introduced by incorporating ideas from linear sufficient dimension reduction (SDR) methods, such as sliced inverse regression and sliced average variance estimation, into a nonparametric graphical model. Nonparametric graphical models are widely used to investigate undirected graphs; however, when the number of nodes is large, they suffer from the ‘curse of dimensionality’ because they contain intrinsically high-dimensional kernels. Parametric graphical models, such as the Gaussian or copula Gaussian graphical models, are well known for their intuitive structure and interpretability, but they hinge on strong parametric model assumptions. The principal graphical model applies well-known linear SDR techniques to nonparametric graphical models to enhance performance in high-dimensional networks, avoid model assumptions, and maintain interpretability. We use components of linear SDR as modules and apply them to each of the p(p − 1)/2 pairs of variables in the network to evaluate conditional independence. In numerical experiments, our methods achieve competitive accuracy in both low- and high-dimensional settings. Applied to the DREAM 4 challenge gene network dataset, they perform well in high-dimensional settings with a limited number of observations.
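The abstract uses linear SDR estimators such as sliced inverse regression (SIR) as building blocks. As background, the following is a minimal sketch of the standard SIR estimator (not the paper's pairwise procedure): slice the response, average the standardized predictors within each slice, and take the leading eigenvectors of the between-slice covariance. The function name `sir_directions` and all parameter choices are illustrative, not from the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_components=1):
    """Estimate SDR directions by sliced inverse regression (textbook SIR sketch)."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Whiten the predictors: Z = (X - mu) Sigma^{-1/2}
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice observations by the order of y and form the between-slice
    # covariance of the slice means, M = sum_h w_h m_h m_h^T
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, V = np.linalg.eigh(M)
    top = V[:, np.argsort(w)[::-1][:n_components]]
    B = Sigma_inv_sqrt @ top
    return B / np.linalg.norm(B, axis=0)
```

On a single-index model y = X·β + ε, the first estimated column should align closely with the true direction β (up to sign).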

| Original language | English |
|---|---|
| Article number | 107344 |
| Journal | Computational Statistics and Data Analysis |
| Volume | 166 |
| DOIs | |
| State | Published - Feb 2022 |

### Bibliographical note

Publisher Copyright: © 2021 Elsevier B.V.

## Keywords

- Conjoined conditional covariance operator
- Reproducing kernel Hilbert space
- Sliced average variance estimation
- Sliced inverse regression
- Statistical graphical model