Nonparametric inference in generalized functional linear models

Zuofeng Shang, Guang Cheng

Research output: Contribution to journal › Article › peer-review


Abstract

We propose a roughness regularization approach to nonparametric inference for generalized functional linear models. Within a reproducing kernel Hilbert space framework, we construct asymptotically valid confidence intervals for the regression mean, prediction intervals for a future response, and various procedures for hypothesis testing. In particular, one procedure for testing the global behavior of the slope function is adaptive to the smoothness of the slope function and to the structure of the predictors. As a by-product, a new type of Wilks phenomenon [Ann. Math. Stat. 9 (1938) 60-62; Ann. Statist. 29 (2001) 153-193] is discovered when testing the functional linear models. Despite their generality, our inference procedures are easy to implement. Numerical examples are provided to demonstrate the empirical advantages over competing methods. A collection of technical tools, such as integro-differential equation techniques [Trans. Amer. Math. Soc. 29 (1927) 755-800; Trans. Amer. Math. Soc. 30 (1928) 453-471; Trans. Amer. Math. Soc. 32 (1930) 860-868], Stein's method [Ann. Statist. 41 (2013) 2786-2819; Stein, Approximate Computation of Expectations (1986) IMS] and a functional Bahadur representation [Ann. Statist. 41 (2013) 2608-2638], is employed in this paper.
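To fix ideas, the roughness-regularized estimation the abstract refers to can be sketched as a penalized likelihood criterion over a reproducing kernel Hilbert space. The notation below (intercept $\beta_0$, slope function $\beta$, log-likelihood $\ell$, penalty $J$) is assumed for illustration and follows common RKHS conventions rather than the paper's exact definitions:

    % A minimal sketch of a roughness-penalized likelihood criterion for a
    % generalized functional linear model. The symbols (\ell, \beta_0, J,
    % \mathcal{H}) are illustrative assumptions, not the paper's notation.
    \[
      \widehat{\beta}_{n,\lambda}
      \;=\;
      \arg\max_{\beta \in \mathcal{H}}
      \left\{
        \frac{1}{n} \sum_{i=1}^{n}
          \ell\!\Bigl( Y_i,\; \beta_0 + \int_0^1 X_i(t)\,\beta(t)\,dt \Bigr)
        \;-\;
        \frac{\lambda}{2}\, J(\beta,\beta)
      \right\},
    \]
    % where \mathcal{H} is a reproducing kernel Hilbert space of candidate
    % slope functions, \ell is the (quasi-)log-likelihood of the generalized
    % linear family, J(\beta,\beta) is a roughness penalty such as
    % \int (\beta''(t))^2 dt, and \lambda > 0 trades off fit against smoothness.

Under this kind of criterion, confidence intervals, prediction intervals and test statistics are built from the penalized estimator $\widehat{\beta}_{n,\lambda}$, with the smoothing parameter $\lambda$ governing the bias-variance trade-off.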

Original language: English (US)
Pages (from-to): 1742-1773
Number of pages: 32
Journal: Annals of Statistics
Volume: 43
Issue number: 4
DOIs
State: Published - Aug 1 2015
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Keywords

  • Generalized functional linear models
  • Minimax adaptive test
  • Nonparametric inference
  • Reproducing kernel Hilbert space
  • Roughness regularization
