Topic: Lasso: model selection consistency and Blasso algorithm
Speaker: Prof. Bin Yu (University of California, Berkeley)
Time: 2006-06-29, 2:00 - 3:30 PM
Venue: Room 1114, Science Building No. 1
To deal with high-dimensional data in the IT age,
statistical methods have to take computation into account
at the design stage. Machine learning procedures
such as Support Vector Machines and Boosting algorithms
are such schemes. Moreover, they produce models that are sparse
in some sense and regularized, with impressive empirical
performance on real data sets. Sparsity and regularization are
helpful for prediction, and sparsity additionally aids model
interpretation.
Lasso, L1-penalized least-squares (L2) regression, is a method
that originated in the statistics community and likewise yields
sparsity and regularization; its objective is written out below.
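Concretely, for a response y and design matrix X, the Lasso
estimate minimizes the squared-error (L2) loss plus an L1 penalty
(standard notation, stated here for reference rather than taken
from the talk itself):

\[ \hat{\beta}(\lambda) = \arg\min_{\beta} \; \| y - X\beta \|_2^2 + \lambda \| \beta \|_1 . \]

The L1 penalty is what drives some coordinates of \hat{\beta}
exactly to zero, hence the sparsity.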
In this talk, we connect Lasso solutions to model selection by
showing Lasso's model selection consistency under an
"irrepresentable condition" on the design matrix (sketched below),
and we introduce the Blasso algorithm, which gives an approximate
Lasso path in a unified manner for a general convex loss function
with a convex penalty.
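For reference, the strong irrepresentable condition can be sketched
as follows; the block notation is this summary's assumption, not
the talk's own. Write C = X^T X / n in blocks, with C_{11} the
covariance block of the truly relevant predictors (those with
nonzero coefficients \beta_{(1)}) and C_{21} the cross block with
the irrelevant ones. The condition asks that, elementwise for some
\eta > 0,

\[ \left| C_{21} C_{11}^{-1} \, \mathrm{sign}(\beta_{(1)}) \right| \le (1 - \eta) \, \mathbf{1} , \]

i.e., the irrelevant predictors must not be too strongly
correlated with the relevant ones.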
Blasso can also be viewed as a boosting algorithm with a new
backward step; a minimal code sketch follows the abstract.
Simulation and real-data results will also be presented to shed
light on the irrepresentable condition and on the effectiveness of
the Blasso algorithm. (This is joint work with Peng Zhao.)