Statistical optimization has attracted considerable interest recently. It refers to settings in which hidden, local convexity can be uncovered with high probability in nonconvex problems, making polynomial-time algorithms possible. It relies on careful analysis of the geometry near global optima.
In this talk, I will explore this direction by focusing on sparse regression problems in high dimensions. A computational framework named iterative local adaptive majorize-minimization (I-LAMM) is proposed to simultaneously control algorithmic complexity and statistical error. I-LAMM effectively turns the nonconvex penalized regression problem into a sequence of convex programs by exploiting the local strong convexity of the problem when the solution set is restricted to an l1 cone. Computationally, we establish a phase transition phenomenon: the algorithm enjoys a linear rate of convergence after a sub-linear burn-in. Statistically, it produces solutions with optimal statistical error. Extensions to various models, such as robust regression models and matrix models, will be discussed.
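The abstract describes solving a nonconvex penalized regression by a sequence of convex programs. A minimal sketch of that idea, not the speaker's actual I-LAMM algorithm: majorize a nonconvex SCAD penalty by a weighted l1 norm (local linear approximation) and solve each convex subproblem by proximal gradient descent. All function names and parameter values below are illustrative assumptions.

```python
import numpy as np

def scad_deriv(beta, lam, a=3.7):
    # Derivative of the SCAD penalty; supplies the weights for the
    # weighted-l1 majorization of the nonconvex penalty (assumed choice).
    t = np.abs(beta)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1))

def soft_threshold(z, w):
    # Proximal operator of the weighted l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - w, 0.0)

def mm_sparse_regression(X, y, lam, n_outer=10, n_inner=500, tol=1e-8):
    """Sketch: outer loop majorizes the nonconvex penalty, inner loop
    solves the resulting convex weighted lasso by proximal gradient."""
    n, p = X.shape
    beta = np.zeros(p)
    # Lipschitz constant of the least-squares gradient (fixed step size).
    L = np.linalg.norm(X, 2) ** 2 / n
    for _ in range(n_outer):
        w = scad_deriv(beta, lam)  # weights from current iterate
        for _ in range(n_inner):   # convex subproblem
            grad = X.T @ (X @ beta - y) / n
            beta_new = soft_threshold(beta - grad / L, w / L)
            if np.max(np.abs(beta_new - beta)) < tol:
                beta = beta_new
                break
            beta = beta_new
    return beta
```

The outer majorization step is what makes each subproblem convex; the talk's contribution concerns how to control the number of inner iterations (the computational phase transition) while retaining optimal statistical error.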
June 30th, 2017
10:00 - 11:45
Qiang Sun, University of Toronto
Qiang is currently an assistant professor in the Department of Statistical Sciences at the University of Toronto and holds a visiting appointment in the Department of Operations Research and Financial Engineering at Princeton University. He earned his doctoral degree in Biostatistics from the University of North Carolina at Chapel Hill. His research interests span a broad spectrum, including hypothesis-driven imaging genetics, statistical optimization for big data, nonasymptotic inference, and robustness in high dimensions. He publishes in both statistical and scientific journals, such as JASA, AoS, JRSSB and EST.
Room 102, School of Information Management & Engineering, Shanghai University of Finance & Economics