
Thursday, October 20, 2011 - 1:00pm

Philippe Rigollet

Princeton University

Location

Drexel University

Korman Center, Room 245

We consider a general, not necessarily linear, regression problem with Gaussian noise and study an aggregation problem that consists in finding a linear combination of approximating functions which is at the same time sparse and has small mean squared error (MSE). We introduce a new estimation procedure, called Exponential Screening (ES), that exhibits remarkable adaptation properties: it adapts to the linear combination that optimally balances MSE and sparsity, whether the latter is measured by the number of non-zero entries in the combination or by its global weight. The power of this adaptation result is illustrated by showing that ES solves optimally and simultaneously all the aggregation problems in Gaussian regression considered previously. Tight minimax lower bounds establish the optimal rates of sparse estimation and show that the ES procedure attains them. A numerical implementation of ES, which results in a stochastic greedy algorithm, is discussed and compared to state-of-the-art procedures for sparse estimation. Finally, several extensions of the principle of sparsity pattern aggregation to other structurally sparse estimation problems will be presented.
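To illustrate the flavor of sparsity pattern aggregation described above, the following is a minimal sketch, not the speaker's exact ES procedure: it enumerates small sparsity patterns, fits least squares on each, and combines the fits with exponential weights that penalize larger patterns. The function name `exp_screen_aggregate`, the temperature `2*sigma**2`, and the size-based prior are assumptions made for this sketch only.

```python
# Illustrative sketch: exponential-weights aggregation over sparsity patterns.
# Assumptions (not from the abstract): design matrix X, response y, and noise
# level sigma are given; patterns are all subsets of size <= max_size; each
# pattern is weighted by exp(-RSS / (2*sigma^2)) times a prior decaying in size.
from itertools import combinations

import numpy as np


def exp_screen_aggregate(X, y, sigma, max_size=2):
    """Aggregate least-squares fits over small sparsity patterns
    with exponential weights (illustrative only)."""
    n, M = X.shape
    log_weights = []
    coefs = []
    for k in range(max_size + 1):
        for pattern in combinations(range(M), k):
            theta = np.zeros(M)
            if k > 0:
                theta_sub, *_ = np.linalg.lstsq(X[:, pattern], y, rcond=None)
                theta[list(pattern)] = theta_sub
            rss = np.sum((y - X @ theta) ** 2)
            # Prior penalizing larger patterns (an assumption for this sketch).
            log_prior = -k * np.log(max(M, 2))
            log_weights.append(-rss / (2 * sigma**2) + log_prior)
            coefs.append(theta)
    log_weights = np.array(log_weights)
    weights = np.exp(log_weights - log_weights.max())  # stabilize before normalizing
    weights /= weights.sum()
    return np.sum(weights[:, None] * np.array(coefs), axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, M, sigma = 100, 8, 0.5
    X = rng.normal(size=(n, M))
    theta_true = np.zeros(M)
    theta_true[:2] = [1.5, -2.0]          # sparse ground truth
    y = X @ theta_true + sigma * rng.normal(size=n)
    print(exp_screen_aggregate(X, y, sigma))
```

A full implementation cannot enumerate all patterns when the dictionary is large; the abstract notes that the practical version of ES is instead a stochastic greedy algorithm, which this brute-force sketch does not attempt to reproduce.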