Non-asymptotic Analysis for Nonparametric Testing

Yun Yang, Zuofeng Shang, Guang Cheng

Research output: Contribution to journal › Conference article › peer-review

Abstract

We develop a non-asymptotic framework for hypothesis testing in nonparametric regression where the true regression function belongs to a Sobolev space. Our statistical guarantees are exact in the sense that Type I and II errors are controlled for any finite sample size. Meanwhile, one proposed test is shown to achieve minimax rate optimality in the asymptotic sense. An important consequence of this non-asymptotic theory is a new and practically useful formula for selecting the optimal smoothing parameter in the test statistic. Extensions of our results to general reproducing kernel Hilbert spaces and non-Gaussian error regression are also discussed.
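As a rough illustration of the testing problem described above, the sketch below forms a kernel-ridge-regression-based test of H0: f = 0 in the model y_i = f(x_i) + eps_i: the statistic is the squared empirical norm of the regularized fit, and the rejection threshold is calibrated by Monte Carlo under the null. The kernel, the regularization level `lam`, and the calibration scheme are illustrative assumptions only; the paper's contribution is the non-asymptotic control of both error types and its own rule for choosing the smoothing parameter, neither of which is reproduced here.

```python
# Minimal sketch (assumed construction, not the paper's exact procedure) of a
# kernel-ridge-regression-based test of H0: f = 0 in y_i = f(x_i) + eps_i.
import numpy as np

def sobolev_kernel(s, t):
    """First-order Sobolev (spline) kernel on [0, 1]: K(s, t) = 1 + min(s, t)."""
    return 1.0 + np.minimum(s[:, None], t[None, :])

def krr_test_statistic(x, y, lam):
    """Squared empirical norm of the KRR fit at the design points.

    f_hat_lambda(x_i) is given by K (K + n*lam*I)^{-1} y, so the statistic
    is (1/n) * sum_i f_hat_lambda(x_i)^2.
    """
    n = len(x)
    K = sobolev_kernel(x, x)
    fitted = K @ np.linalg.solve(K + n * lam * np.eye(n), y)
    return np.mean(fitted ** 2)

def calibrate_threshold(x, lam, sigma, alpha=0.05, n_mc=2000, rng=None):
    """Monte Carlo (1 - alpha) quantile of the statistic under H0 (pure noise).

    Assumes the noise level sigma is known, purely for simplicity of the sketch.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    null_stats = [krr_test_statistic(x, sigma * rng.standard_normal(n), lam)
                  for _ in range(n_mc)]
    return np.quantile(null_stats, 1 - alpha)

# Toy usage: data generated from a nonzero regression function should be rejected.
rng = np.random.default_rng(0)
n, sigma = 200, 0.5
x = np.sort(rng.uniform(size=n))
y = 0.4 * np.sin(4 * np.pi * x) + sigma * rng.standard_normal(n)
lam = n ** (-4.0 / 5.0)   # hypothetical choice; the paper derives its own selection rule
thresh = calibrate_threshold(x, lam, sigma)
stat = krr_test_statistic(x, y, lam)
print(f"statistic = {stat:.4f}, threshold = {thresh:.4f}, reject = {stat > thresh}")
```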

Original language: English (US)
Pages (from-to): 3709-3755
Number of pages: 47
Journal: Proceedings of Machine Learning Research
Volume: 125
State: Published - 2020
Event: 33rd Conference on Learning Theory, COLT 2020 - Virtual, Online, Austria
Duration: Jul 9, 2020 - Jul 12, 2020

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

Keywords

  • Kernel ridge regression
  • large deviation bound
  • minimax rate optimality
  • non-asymptotic inference
  • nonparametric testing
  • smoothing spline
