Lasso-like methods have become quite common in applied statistics, but the Dantzig selector remains unpopular despite its strong theoretical properties (e.g. minimax optimality). Why hasn't it caught on?
The $\ell_\infty$ loss term is VERY sensitive to outliers.
Most (all?) of the theory for the Dantzig selector assumes normal (Gaussian) errors. Under that error distribution, there isn't much difference between $\ell_2$ loss and $\ell_\infty$ loss. With real data, however, we would like to be less sensitive to outliers.
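For concreteness, the Dantzig selector solves $\min_\beta \|\beta\|_1$ subject to $\|X^T(y - X\beta)\|_\infty \le \lambda$, which can be written as a linear program. Below is a minimal sketch using `scipy.optimize.linprog` (the function name and variable layout are my own, not from any standard package); note that setting $\lambda = 0$ forces $X^T X\beta = X^T y$, i.e. it recovers ordinary least squares when $X$ has full column rank:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    """Solve min ||beta||_1 s.t. ||X^T (y - X beta)||_inf <= lam as an LP.

    Variables z = [beta, u] with |beta_i| <= u_i; objective is sum(u).
    (Illustrative sketch, not a production implementation.)
    """
    n, p = X.shape
    G = X.T @ X
    c = np.concatenate([np.zeros(p), np.ones(p)])  # minimize sum of u
    I, Z = np.eye(p), np.zeros((p, p))
    # Rows encode:  beta - u <= 0;  -beta - u <= 0;
    #               G beta <= X^T y + lam;  -G beta <= -X^T y + lam
    A_ub = np.block([[I, -I], [-I, -I], [G, Z], [-G, Z]])
    b_ub = np.concatenate([np.zeros(2 * p),
                           X.T @ y + lam,
                           -(X.T @ y) + lam])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (2 * p), method="highs")
    return res.x[:p]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, 0.0, -2.0]) + 0.1 * rng.normal(size=50)
beta_dz = dantzig_selector(X, y, lam=0.0)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.max(np.abs(beta_dz - beta_ols)))  # essentially zero at lam = 0
```

The key point for this answer: the $\ell_\infty$ norm is applied to the correlation vector $X^T(y - X\beta)$, not to the raw residuals, which is what the comment below pushes back on.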
I don't think this is true. The $\ell_\infty$ loss in Dantzig is not applied to individual errors, but to the vector of correlations between the residual error and each of the explanatory variables. I don't see how this would make it more sensitive to outliers than an $\ell_2$ error term on the residual error. In fact, note that in the extreme case when you force the $\ell_\infty$ bounds in Dantzig to be zero, you obtain exactly least-squares linear regression. – david Nov 03 '19 at 17:14
The flare package for R implements the Dantzig selector. I have no experience with it, though. – COOLSerdash May 16 '13 at 21:32