Adaptive variable selection in nonparametric sparse additive models
We consider the problem of recovering an unknown multivariate signal f observed in a d-dimensional Gaussian white noise model of intensity ε. We assume that f belongs to a class of smooth functions in L2([0, 1]d) and has an additive sparse structure determined by the parameter s, the number of non-zero univariate components contributing to f. We are interested in the case when d = dε → ∞ as ε → 0 and the parameter s stays “small” relative to d. Under these assumptions, the recovery problem at hand becomes that of determining which sparse additive components are non-zero. Attempting to reconstruct most, but not all, non-zero components of f, we arrive at the problem of almost full variable selection in high-dimensional regression. For two different choices of a class of smooth functions, we establish conditions under which almost full variable selection is possible, and provide a procedure that achieves this goal. Our procedure is the best possible (in the asymptotically minimax sense) for selecting most non-zero components of f. Moreover, it is adaptive in the parameter s. In addition, we complement these findings by obtaining an adaptive exact selector for the class of infinitely smooth functions. Our theoretical results are illustrated with numerical experiments.
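As a rough illustration of the selection task (a toy sketch, not the paper's actual procedure), the white noise model can be viewed in sequence space: each candidate component j contributes a vector of noisy coefficients, and a component is declared non-zero when its empirical energy exceeds a noise-calibrated threshold. All numerical choices below (d, s, noise level, coefficient decay, threshold constant) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: d candidate components, s of them truly non-zero,
# n observed coefficients per component, noise intensity eps.
d, s, n, eps = 200, 5, 64, 0.1

# True signal: s active components with polynomially decaying coefficients,
# mimicking smooth (Sobolev-type) univariate functions.
theta = np.zeros((d, n))
active = rng.choice(d, size=s, replace=False)
theta[active] = 1.0 / np.arange(1, n + 1)

# Sequence-space observations: true coefficients plus Gaussian noise.
y = theta + eps * rng.normal(size=(d, n))

# Keep components whose empirical energy exceeds the mean noise energy
# (eps^2 * n) by about three standard deviations of a chi-square_n variable.
energy = (y ** 2).sum(axis=1)
threshold = eps ** 2 * (n + 3.0 * np.sqrt(2.0 * n))
selected = np.flatnonzero(energy > threshold)

print("active:  ", sorted(active))
print("selected:", sorted(selected))
```

With these settings the active components carry energy well above the noise floor, so thresholding recovers essentially the whole active set with few false positives; the paper's contribution is to make such selection adaptive in s and optimal in the asymptotically minimax sense.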
Keywords: Adaptive variable selection; exact and almost full selectors; high-dimensional nonparametric regression; sparse additive signals
Journal: Electronic Journal of Statistics
Butucea, C. (Cristina), & Stepanova, N. (2017). Adaptive variable selection in nonparametric sparse additive models. Electronic Journal of Statistics, 11(1), 2321–2357. doi:10.1214/17-EJS1275