Consistent model selection criteria for quadratically supported risks

Yongdai Kim, Jong June Jeon

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

In this paper, we study asymptotic properties of model selection criteria for high-dimensional regression models where the number of covariates is much larger than the sample size. In particular, we consider a class of loss functions called the class of quadratically supported risks, which is large enough to include the quadratic loss, Huber loss, quantile loss and logistic loss. We provide sufficient conditions for selection consistency of model selection criteria that are applicable to the class of quadratically supported risks. Our results extend most previous sufficient conditions for model selection consistency. In addition, sufficient conditions for path-consistency of the Lasso and nonconvex penalized estimators are presented. Here, path-consistency means that the probability that the solution path includes the true model converges to 1. Path-consistency makes it practically feasible to apply consistent model selection criteria to high-dimensional data. A data-adaptive model selection procedure is proposed which is selection consistent and performs well for finite samples. Results of simulation studies as well as real data analysis are presented to compare the finite sample performance of the proposed data-adaptive model selection criterion with other competitors.
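
The sketch below illustrates, under stated assumptions, how path-consistency is exploited in practice: compute a Lasso solution path and score each model on the path with a consistent, information-criterion-style penalty. This is not the paper's exact criterion; the quadratic loss, the refitting step, and the penalty constant log(n)·log(log(p)) are illustrative choices, and the helper relies on scikit-learn's lasso_path.

```python
# Minimal sketch (assumptions noted above): select a model by minimizing a
# GIC-style criterion over the supports appearing on the Lasso solution path.
import numpy as np
from sklearn.linear_model import lasso_path

rng = np.random.default_rng(0)
n, p, s = 100, 500, 3                      # n << p, sparse true model of size s
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0
y = X @ beta + rng.standard_normal(n)

# Lasso solution path: coefs has shape (p, number of penalty levels).
alphas, coefs, _ = lasso_path(X, y)

best_crit, best_support = np.inf, None
for k in range(len(alphas)):
    support = np.flatnonzero(coefs[:, k])
    if support.size == 0 or support.size >= n:
        continue
    # Refit by least squares on the selected covariates and compute the
    # empirical quadratic risk of the refitted model.
    b, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
    rss = np.sum((y - X[:, support] @ b) ** 2)
    # Criterion: n * log(empirical risk) + lambda_n * model size, with
    # lambda_n = log(n) * log(log(p)) as one common high-dimensional choice
    # (illustrative, not the data-adaptive penalty proposed in the paper).
    crit = n * np.log(rss / n) + np.log(n) * np.log(np.log(p)) * support.size
    if crit < best_crit:
        best_crit, best_support = crit, support

print("selected covariates:", best_support)
```

Because path-consistency says the true model lies on the solution path with probability tending to one, restricting the criterion search to the path keeps the procedure computationally feasible in high dimensions without sacrificing selection consistency.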

Original language: English
Pages (from-to): 2467-2496
Number of pages: 30
Journal: Annals of Statistics
Volume: 44
Issue number: 6
DOIs
State: Published - Dec 2016

Keywords

  • Generalized information criteria
  • High dimension
  • Model selection
  • Quadratically supported risks
  • Selection consistency
