
Summer School on Modern Statistics

Date         Monday, June 25   Tuesday, June 26   Wednesday, June 27   Thursday, June 28   Friday, June 29
10:00-11:30  Tony Cai          Tony Cai           Tony Cai             Tony Cai            Ming Yuan
14:00-15:40  Ming Yuan         Tony Cai           Ming Yuan            Ming Yuan           Ming Yuan

Address

Room 601, Pao Yue-Kong Library, Shanghai Jiao Tong University

Statistics I: Lectures on High Dimensional Statistical Inference

Tony Cai, The Wharton School, University of Pennsylvania

These lectures will focus on the recovery of high dimensional sparse signals in three settings:

1. Wavelet thresholding for nonparametric regression;
2. Detection and estimation of sparse signals under a random mixture model;
3. Compressed sensing and high-dimensional linear regression.
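To fix notation for the first topic, here is a minimal sketch of the standard setting (textbook notation, not necessarily the lectures' own): in the Gaussian sequence model one observes

\[
y_i = \theta_i + \epsilon z_i, \qquad z_i \overset{iid}{\sim} N(0,1), \qquad i = 1, \dots, n,
\]

and the Donoho-Johnstone soft-thresholding estimator with the universal threshold is

\[
\hat\theta_i = \operatorname{sgn}(y_i)\,\big(|y_i| - \lambda\big)_{+}, \qquad \lambda = \epsilon \sqrt{2 \log n}.
\]

Applied coordinatewise to empirical wavelet coefficients, this rule is adaptively near-minimax over a wide range of function classes.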

These and other related problems have also attracted much recent interest in other fields, including applied mathematics and electrical engineering. We begin with important results in nonparametric function estimation in the framework of the infinite dimensional Gaussian sequence model. Minimaxity, adaptive minimaxity, and oracle inequalities are covered in the context of the sequence model. In particular, we discuss Pinsker's results on linear minimaxity for estimation over an ellipsoid and the wavelet thresholding theory developed by Donoho and Johnstone. We then consider detection and estimation of sparse signals under a random mixture model. Finally, we present recent results on compressed sensing and high dimensional linear regression, covering in detail the constrained ℓ1 minimization methods and giving a unified and elementary analysis of sparse signal recovery in three settings: noiseless, bounded noise, and Gaussian noise. The lectures conclude with a brief discussion of the latest results on optimal estimation of large covariance matrices.
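As a concrete illustration of the noiseless setting, the following sketch solves the basis pursuit program min ||β||_1 subject to Xβ = y by recasting it as a linear program (the dimensions, design, and sparsity level are arbitrary choices for the demonstration, not values from the lectures):

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p, s = 50, 200, 5                            # samples, dimension, sparsity (illustrative)
X = rng.standard_normal((n, p)) / np.sqrt(n)    # Gaussian random design
beta = np.zeros(p)
beta[:s] = 3.0                                  # a 5-sparse signal
y = X @ beta                                    # noiseless measurements

# Basis pursuit as an LP: write beta = u - v with u, v >= 0,
# then minimize sum(u) + sum(v) subject to X(u - v) = y.
c = np.ones(2 * p)
A_eq = np.hstack([X, -X])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * p), method="highs")
beta_hat = res.x[:p] - res.x[p:]
print("max recovery error:", np.max(np.abs(beta_hat - beta)))

With a Gaussian design and 50 measurements of a 5-sparse signal in dimension 200, exact recovery is the typical outcome, which is precisely the phenomenon the noiseless theory explains.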

References

[1] Bickel, P. J., Ritov, Y. and Tsybakov, A. B. (2009). Simultaneous analysis of Lasso and Dantzig selector. The Annals of Statistics 37, 1705-1732.
[2] Cai, T. (1999). Adaptive wavelet estimation: a block thresholding and oracle inequality approach. The Annals of Statistics 27, 898-924.
[3] Cai, T. (2008). On information pooling, adaptability and superefficiency in nonparametric function estimation. Journal of Multivariate Analysis 99, 412-436.
[4] Cai, T., Wang, L. and Xu, G. (2010). Shifting inequality and recovery of sparse signals. IEEE Transactions on Signal Processing 58, 1300-1308.
[5] Candès, E. J. and Tao, T. (2007). The Dantzig selector: statistical estimation when p is much larger than n (with discussion). The Annals of Statistics 35, 2313-2351.
[6] Johnstone, I. M. (1999). Gaussian Estimation: Sequence and Wavelet Models. Monograph. Available at http://www-stat.stanford.edu/~imj/GE12-27-11.pdf.
[7] Tsybakov, A. B. (2009). Introduction to Nonparametric Estimation. Springer.

Statistics II: Statistical Learning Theory

Ming Yuan, Georgia Institute of Technology

Machine learning has seen an explosion in popularity over the past couple of decades, drawing the interest of researchers from approximation theory, computer science, functional analysis, numerical analysis, probability, and statistics, among many other areas. Although machine learning originally focused on developing algorithms and analyzing their computational properties, it has become clear in recent years that many problems in machine learning sit comfortably in a statistical framework. Statistical analysis can reveal new insights into these problems and lead to improved algorithms. In these lectures, I will give an overview of some of the commonly used technical tools and review some of the recent advances in statistical learning theory.
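One canonical example of such a tool (an illustration of the genre, not a claim about the lectures' exact syllabus) is the Rademacher complexity bound for empirical risk minimization: for a loss taking values in [0, 1] and an i.i.d. sample of size n, with probability at least 1 - δ,

\[
\sup_{f \in \mathcal{F}} \Big( R(f) - \hat{R}_n(f) \Big) \;\le\; 2\,\mathfrak{R}_n(\ell \circ \mathcal{F}) + \sqrt{\frac{\log(1/\delta)}{2n}},
\]

where R is the population risk, \hat{R}_n the empirical risk, and \mathfrak{R}_n(\ell \circ \mathcal{F}) the Rademacher complexity of the loss class. Results of this type convert a capacity measure of the function class into a uniform generalization guarantee.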