65th ISI World Statistics Congress

A forward sparse sufficient dimension reduction in binary classification via penalized gradient learning

Conference

65th ISI World Statistics Congress

Format: SIPS Abstract - WSC 2025

Keywords: dimension-reduction, high-dimensional

Session: IPS 1098 - Statistical inference and estimation in high-dimensional data

Tuesday 7 October 9:20 a.m. - 10:30 a.m. (Europe/Amsterdam)

Abstract

Dimension reduction plays an essential role in high-dimensional classification. Among existing approaches, variable selection and sufficient dimension reduction (SDR) represent two primary paradigms, each with complementary advantages and limitations. Variable selection promotes interpretability but may overlook complex low-dimensional structures, while SDR captures such structures but often lacks sparsity and interpretability. In this paper, we propose a novel sparse SDR method, sparse-wOPG, which integrates the benefits of both strategies. Building on the wOPG framework, which performs SDR by learning the gradients of weighted large-margin classifiers in a reproducing kernel Hilbert space, we incorporate group-wise penalization via an adaptive group lasso penalty. The resulting estimator preserves the low-dimensional structure while achieving sparsity. We implement an efficient optimization procedure under the Majorization-Minimization (MM) framework and establish the selection consistency of the proposed estimator. Numerical experiments on simulated and real datasets demonstrate the promising performance of sparse-wOPG.
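
To make the MM optimization step more concrete, the following is a minimal, hypothetical Python sketch of the standard majorization of an adaptive group lasso penalty, applied here to a simple linear least-squares surrogate rather than to gradients of weighted large-margin classifiers in an RKHS. The function name, data setup, and surrogate loss are illustrative assumptions only, not the sparse-wOPG algorithm itself.

```python
# Illustrative sketch (not the authors' implementation): each MM step majorizes
# the group penalty sum_k tau_k * ||beta_k|| at the current iterate beta^(t) by
# sum_k tau_k * (||beta_k||^2 + ||beta_k^(t)||^2) / (2 * ||beta_k^(t)||),
# so every iteration reduces to a ridge-type linear system with a closed form.

import numpy as np

def mm_adaptive_group_lasso(X, y, groups, tau, n_iter=100, eps=1e-8):
    """X: (n, p) design, y: (n,) response, groups: list of index arrays,
    tau: per-group adaptive weights. Returns the estimated coefficients."""
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # unpenalized starting value
    for _ in range(n_iter):
        # Diagonal weights of the quadratic majorizer: d_j = tau_k / ||beta_k||
        d = np.zeros(p)
        for k, idx in enumerate(groups):
            norm_k = max(np.linalg.norm(beta[idx]), eps)
            d[idx] = tau[k] / norm_k
        # Minimize the surrogate: (1/2n)||y - X beta||^2 + (1/2) beta' diag(d) beta
        beta = np.linalg.solve(X.T @ X / n + np.diag(d), X.T @ y / n)
    return beta

# Toy usage: three groups of two variables each; only the first group is active.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
y = X[:, 0] - 2 * X[:, 1] + 0.1 * rng.standard_normal(200)
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]
beta_hat = mm_adaptive_group_lasso(X, y, groups, tau=np.ones(3))
print(np.round(beta_hat, 3))  # coefficients of groups 2 and 3 shrink toward zero
```

In the setting described in the abstract, the same majorization idea would presumably be applied to the adaptive group lasso penalty on the gradient components of the weighted large-margin classifier, so that each iteration again reduces to a tractable weighted problem; the sketch above is only a generic illustration of that mechanism.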