Constrained Neural Networks and Their Applications to Nonparametric Regression and Adversarial Learning
Conference
Regional Statistics Conference 2026
Format: IPS Abstract - Malta 2026
Session: IPS 1271 - Advances in Deep Learning for Statistical Inference and Generative Modeling
Wednesday 3 June 2:30 p.m. - 4:10 p.m. (Europe/Malta)
Abstract
In this paper, we study the approximation capacity of ReLU-activated neural networks with constraints on size and Lipschitz constant. We establish approximation results for Lipschitz-size-constrained neural networks approximating Hölder smooth functions defined on Euclidean spaces, as well as on low-dimensional Riemannian manifolds. We then apply these networks to multivariate nonparametric regression models. We also study a novel saturated low-dimensional structure, under which we derive generalization error bounds for Lipschitz-size-constrained neural networks in the nonparametric regression problem. We prove that the resulting nonparametric least-squares estimator based on Lipschitz-size-constrained neural networks circumvents the curse of dimensionality when the data has a saturated low-dimensional structure. Moreover, we show that our estimator attains the minimax optimal convergence rate in nonparametric regression, either under an exact low-dimensional manifold assumption or when the data is supported on the d-dimensional cube [0,1]^d. As another important application, we apply Lipschitz-size-constrained neural networks to an adversarial learning problem and establish a faster convergence rate of the generalization error than existing state-of-the-art results.
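To make the constraint class concrete, the sketch below shows one standard way to enforce a Lipschitz bound on a size-constrained ReLU network: rescaling each weight matrix so the product of layer spectral norms (an upper bound on the network's Lipschitz constant, since ReLU is 1-Lipschitz) stays below a target. This is a minimal illustration, not the construction analyzed in the paper; the layer widths, the even split of the Lipschitz budget across layers, and the helper name `project_lipschitz` are assumptions made for this example.

```python
import numpy as np

def project_lipschitz(weights, lip_bound):
    """Rescale each weight matrix so the product of spectral norms,
    an upper bound on the ReLU network's Lipschitz constant, is
    at most lip_bound. (Illustrative projection, not the paper's.)"""
    per_layer = lip_bound ** (1.0 / len(weights))  # split budget evenly
    projected = []
    for W in weights:
        s = np.linalg.norm(W, ord=2)               # spectral norm of layer
        scale = min(1.0, per_layer / s) if s > 0 else 1.0
        projected.append(W * scale)
    return projected

def relu_net(x, weights, biases):
    """Forward pass of a fully connected ReLU network.
    Biases do not affect the Lipschitz constant."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(W @ h + b, 0.0)              # ReLU activation
    return weights[-1] @ h + biases[-1]             # linear output layer

rng = np.random.default_rng(0)
dims = [8, 32, 32, 1]                               # size constraint: fixed depth/width
weights = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(3)]
biases = [rng.standard_normal(dims[i + 1]) for i in range(3)]

weights = project_lipschitz(weights, lip_bound=2.0) # enforce Lipschitz constraint
y = relu_net(rng.standard_normal(8), weights, biases)
```

In a least-squares fit, such a projection would typically be applied after each gradient step, so the estimator is always selected from the Lipschitz-size-constrained class over which the generalization bounds are stated.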