Oracle Inequalities for Convex Loss Functions with Nonlinear Targets
| |
Authors: | Mehmet Caner; Anders Bredahl Kock
| |
Affiliation: | 1. Department of Economics, Ohio State University, Columbus, Ohio, USA; 2. Translational Data Analytics, Ohio State University, Columbus, Ohio, USA; 3. Department of Statistics, Ohio State University, Columbus, Ohio, USA; 4. Department of Economics, Aarhus University and CREATES, Aarhus, Denmark
| |
Abstract: | This article considers penalized empirical loss minimization of convex loss functions with unknown target functions. Using the elastic net penalty, of which the Least Absolute Shrinkage and Selection Operator (Lasso) is a special case, we establish a finite-sample oracle inequality that bounds the loss of our estimator from above with high probability. If the unknown target is linear, this inequality also provides an upper bound on the estimation error of the estimated parameter vector. Next, we use the non-asymptotic results to show that the excess loss of our estimator is asymptotically of the same order as that of the oracle. If the target is linear, we give sufficient conditions for consistency of the estimated parameter vector. We briefly discuss how a thresholded version of our estimator can be used to perform consistent variable selection. We give two examples of loss functions covered by our framework.
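For readers unfamiliar with the setup, the elastic-net-penalized empirical loss criterion described in the abstract can be sketched in standard notation (the symbols below are generic conventions, not taken from the article itself): given a convex loss ℓ, observations (y_i, x_i), and penalty weights λ₁, λ₂ ≥ 0,

```latex
% Generic elastic net criterion (notation assumed, not from the article):
% \ell is a convex loss, (y_i, x_i) the data, \lambda_1, \lambda_2 \ge 0 tuning parameters.
\hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p}
  \;\frac{1}{n}\sum_{i=1}^{n} \ell\!\left(y_i,\, x_i^{\top}\beta\right)
  \;+\; \lambda_1 \lVert \beta \rVert_1
  \;+\; \lambda_2 \lVert \beta \rVert_2^2 .
```

Setting λ₂ = 0 recovers the Lasso as the special case mentioned in the abstract; the ℓ₁ term induces sparsity while the ℓ₂ term stabilizes the estimator under correlated regressors.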
| |
Keywords: | Convex loss function; Elastic net; Empirical loss minimization; Lasso; Nonparametric estimation; Oracle inequality; Variable selection