Adaptive Nonparametric Function-on-Scalar Regression with Deep Learning
Conference
65th ISI World Statistics Congress
Format: CPS Abstract - WSC 2025
Keywords: deep neural networks, deep learning, functional data analysis, nonparametric
Abstract
In recent decades, advances in measurement technology have enabled the repeated collection of data from a single subject over time or across different locations. Analyzing such data as realizations of a random process is known as Functional Data Analysis (FDA).
FDA provides a unified approach to analyzing data observed at different times or with varying numbers of observations per subject. Many traditional multivariate analysis methods have been extended to functional data. This talk focuses on Function-on-Scalar (FOS) regression, in which the predictors are scalar and the response is functional.
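To make the FOS setup concrete, the following minimal sketch (illustrative only; the architecture, layer sizes, and grid are assumptions, not the proposed method) shows a neural network mapping scalar predictors to a functional response discretized on a common time grid:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative FOS setting: n subjects, p scalar predictors,
# response curves observed on a common grid of T time points.
n, p, T = 50, 3, 20
X = rng.normal(size=(n, p))          # scalar predictors
t_grid = np.linspace(0.0, 1.0, T)    # observation grid for the response


def fos_mlp(X, T, hidden=32, rng=rng):
    """One-hidden-layer network mapping scalar predictors to a
    discretized functional response (one curve per subject)."""
    p = X.shape[1]
    W1 = rng.normal(size=(p, hidden)) / np.sqrt(p)
    W2 = rng.normal(size=(hidden, T)) / np.sqrt(hidden)
    H = np.maximum(X @ W1, 0.0)      # ReLU hidden layer
    return H @ W2                    # shape (n, T): each row is a curve


Y_hat = fos_mlp(X, T)
print(Y_hat.shape)  # (50, 20)
```

Each row of `Y_hat` is one subject's fitted curve evaluated on `t_grid`; training such a network against observed curves is what distinguishes neural FOS regression from linear FOS models.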
Many existing studies focus on linear FOS regression models (e.g., Faraway, 1997; Ramsay and Silverman, 2005). While these models are simple and easy to interpret, they lack the flexibility to capture more complex structures. Research on nonlinear FOS regression models, on the other hand, has been limited. For example, Scheipl et al. (2015) proposed the Functional Additive Mixed Model (FAMM), which can capture more complex structures than linear FOS regression models. However, if the additive model is misspecified, estimation accuracy may decline. Moreover, relaxing the model assumptions can lead to an exponential increase in the number of parameters, reducing estimation efficiency.
To address these issues, Luo et al. (2023) proposed a more flexible method based on neural networks and demonstrated the universal approximation property of their model. This method does not impose specific assumptions (such as additivity) on the true function and can be applied even when the predictor dimension is relatively high. However, two issues remain: (1) it imposes the strong assumption that the true function is continuous with respect to the predictors, and (2) the model is described through integrals, making it impractical to implement exactly. Furthermore, although Luo et al. (2023) derived a convergence rate for their estimator, there is a critical flaw in the proof, and the correct convergence rate has not been established.
In this study, to address these issues, we propose a novel FOS regression method based on deep neural networks and establish the theoretical properties of the proposed estimator.
In our theoretical analysis, we assume that the true function belongs to an anisotropic Besov space. Anisotropic Besov spaces consist of functions with direction-dependent smoothness and include well-known function spaces such as Hölder spaces. We therefore derive our theoretical results in a more flexible setting than previous studies.
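For readers unfamiliar with the anisotropic setting, direction-dependent smoothness is commonly summarized by an effective smoothness exponent (the exact definitions and rates below follow the standard anisotropic Besov literature, e.g., Suzuki and Nitanda, 2021, and are not specific to this work):

```latex
% Smoothness vector \alpha = (\alpha_1, \dots, \alpha_d) for a
% d-dimensional predictor; \alpha_i is the smoothness in direction i.
\tilde{\alpha} = \left( \sum_{i=1}^{d} \frac{1}{\alpha_i} \right)^{-1}
```

In such settings, nonparametric estimation rates typically scale as $n^{-2\tilde{\alpha}/(2\tilde{\alpha}+1)}$ up to logarithmic factors, so high smoothness in most directions yields a large $\tilde{\alpha}$ and mitigates the curse of dimensionality.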
In addition, our theoretical results indicate that the proposed estimator can potentially avoid the curse of dimensionality when the true function possesses high anisotropic smoothness. Through numerical experiments, we demonstrate the performance of the proposed estimator in comparison with existing estimators.
Figures/Tables
NN2
