Sporadic informal observations over several decades (and most recently in Lewis and Overton, 2013) suggest that quasi-Newton methods designed for smooth optimization can also work surprisingly well on nonsmooth functions.
This talk explores this phenomenon from several perspectives. First, we compare experimentally the two most popular quasi-Newton updates, BFGS and SR1, in the nonsmooth setting. Second, we study how repeated BFGS updating at a single fixed point can serve as a separation oracle for the subdifferential. Finally, we show how Powell's original 1976 BFGS convergence proof for smooth convex functions in fact extends to some nonsmooth settings.
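For readers unfamiliar with the update the abstract refers to: below is a minimal sketch (not part of the talk materials) of the standard BFGS update to the inverse Hessian approximation, written with NumPy. The function name and variables are illustrative only.

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k is the step taken; y = g_{k+1} - g_k is the
    change in gradient. The update is well defined whenever the
    curvature condition y^T s > 0 holds.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # H_{k+1} = V H V^T + rho s s^T; by construction H_{k+1} y = s
    # (the secant equation), since V^T y = 0.
    return V @ H @ V.T + rho * np.outer(s, s)
```

A quick check of the secant equation, the defining property of the update:

```python
H0 = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([0.8, 0.3])
H1 = bfgs_update(H0, s, y)
np.allclose(H1 @ y, s)  # True
```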
December 5, 2017
14:00–16:00
Jiayi Guo, Cornell University
Jiayi Guo is a Ph.D. student in the School of Operations Research and Information Engineering at Cornell University, under the supervision of Professor Adrian Lewis in the same department. He expects to graduate in May 2018. Broadly conceived, his research area is optimization. Currently, his work explores variations of iterative methods for solving continuous optimization problems on nonsmooth functions. In general, Jiayi is interested in the interplay between optimization, simulation, and numerical analysis.
Before coming to Cornell, he received dual B.S. degrees in Mathematics and Computer Science (2012) from the University of Illinois at Urbana-Champaign.
Room 308, School of Information Management & Engineering, Shanghai University of Finance & Economics