New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation

Chunming Zhang, Yuan Jiang, Zuofeng Shang

Research output: Contribution to journal › Article › peer-review


Abstract

In statistical learning, regression and classification concern different types of output variables, and predictive accuracy is quantified by different loss functions. This article explores new aspects of Bregman divergence (BD), a notion that unifies nearly all of the commonly used loss functions in regression and classification. The authors investigate the duality between BD and its generating function. They further establish, under the framework of BD, asymptotic consistency and normality of parametric and nonparametric regression estimators, derive the lower bound of their asymptotic covariance matrices, and demonstrate the role that parametric and nonparametric regression estimation plays in the performance of classification procedures and related machine learning techniques. These theoretical results and new numerical evidence show that the choice of loss function affects estimation procedures but has an asymptotically negligible impact on classification performance. Applications of BD to statistical model building and selection with non-Gaussian responses are also illustrated.
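For reference, a minimal sketch of the unifying notion the abstract describes, written in the standard convex-generator convention (the paper itself works with a generating function, denoted q there, whose sign conventions may differ): a Bregman divergence is indexed by a strictly convex, differentiable function \(\phi\), and familiar regression and classification losses correspond to particular choices of \(\phi\),

\[
  D_{\phi}(y,\mu) \;=\; \phi(y) - \phi(\mu) - (y-\mu)\,\phi'(\mu).
\]

For example, \(\phi(u)=u^2\) recovers the squared-error loss \((y-\mu)^2\); \(\phi(u)=u\log u\) recovers the Kullback–Leibler (deviance) loss \(y\log(y/\mu) - y + \mu\); and \(\phi(u)=u\log u + (1-u)\log(1-u)\) recovers, for \(y\in\{0,1\}\), the Bernoulli negative log-likelihood used in classification.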

Original language: English (US)
Pages (from-to): 119-139
Number of pages: 21
Journal: Canadian Journal of Statistics
Volume: 37
Issue number: 1
State: Published - Mar 2009
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Keywords

  • Asymptotic normality
  • Bayes optimal rule
  • Consistency
  • Local polynomial regression
  • Loss function
  • Prediction error
