This talk concerns a stochastic zeroth-order optimization (S-ZOO) problem. The objective is to minimize the expectation of a cost function whose gradient is not directly accessible. For this problem, traditional optimization algorithms typically yield query complexities that grow polynomially with the dimensionality (the number of decision variables). Consequently, these methods may not perform well on the massive-dimensional problems arising in many modern applications, such as the training of AI models. In view of this limitation, we propose a sparsity-inducing stochastic gradient-free (SI-SGF) algorithm, which provably achieves a dimension-free (up to a logarithmic term) query complexity. Such insensitivity to dimensionality growth is proven, for the first time, to be achievable even when neither gradient sparsity nor gradient compressibility holds. Our numerical results demonstrate consistency between the theoretical predictions and the empirical performance, especially in training traffic-monitoring AI models.
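To make the setting concrete, the sketch below shows a generic two-point zeroth-order gradient estimator driving gradient descent on a simple smooth test objective. This is only a standard illustration of the S-ZOO setup (the algorithm queries function values, never gradients); it is not the SI-SGF algorithm from the talk, whose sparsity-inducing construction is what yields the dimension-free complexity.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, rng=None):
    """Estimate grad f(x) from two function-value queries along a random
    Gaussian direction u: g = (f(x + mu*u) - f(x)) / mu * u.
    This is a classical zeroth-order estimator, not the talk's SI-SGF."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

def zo_descent(f, x0, step=0.02, iters=500, seed=0):
    """Plain descent using only function queries (2 per iteration)."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        x = x - step * zo_gradient(f, x, rng=rng)
    return x

if __name__ == "__main__":
    d = 20
    f = lambda x: 0.5 * np.sum(x ** 2)  # smooth quadratic test objective
    x_final = zo_descent(f, np.ones(d))
    print(f(x_final))  # should be close to the minimum value 0
```

Note the dimension dependence this illustrates: for stability, the step size here must shrink roughly like 1/d, which is exactly the kind of polynomial dimension dependence that the talk's SI-SGF algorithm is designed to avoid.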
Dr. Hongcheng Liu is an Assistant Professor of Industrial and Systems Engineering at the University of Florida (UF). His research interests lie in stochastic optimization, high-dimensional statistical learning, machine learning, and their engineering applications. He received his Ph.D. in 2015 from the Pennsylvania State University. Prior to joining UF, he worked as a postdoctoral researcher in Radiation Oncology at Stanford University between 2015 and 2017.