BEIJING INSTITUTE OF BIG DATA RESEARCH

Upcoming Talk | Machine Learning and Data Science PhD Student Forum Series (Session 11)

Machine Learning Lab PhD Student Forum Series (Session 11): The Gradient Complexity of First-Order Methods for Smooth Minimax Optimization

Speaker: Yuze Han (PKU)

Time: 2021-08-04, 15:10-16:10

Venue: Large Conference Room, 1st Floor, Jingyuan Courtyard 6, Peking University & Tencent Meeting ID 914 9325 5784

Abstract: The minimax optimization problem arises ubiquitously in machine learning, with examples including matrix games, Generative Adversarial Networks, robust optimization, and reinforcement learning. If the objective function is convex-concave, Sion's minimax theorem guarantees the existence of a saddle point under certain regularity conditions, so the primal-dual gap serves as a standard optimality criterion. In the nonconvex-concave case, a saddle point need not exist, and the stationarity of the primal function is used as the optimality measure instead. Many algorithms have been proposed to solve minimax problems. First-order methods have been particularly popular because many modern applications are of huge scale and these methods enjoy dimension-free convergence rates. Some of them are further designed for problems with special structure, such as bilinear problems, discrete zero-sum games, and finite-sum problems. In this talk, we will give an overview of recent theoretical results on gradient complexity bounds of deterministic and stochastic first-order methods for smooth minimax optimization, with particular attention to the special case where the problem has a finite-sum structure.
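To make the two optimality measures above concrete, they can be written out as follows; the notation here is illustrative and not taken from the announcement. For a smooth objective f, the problem and the convex-concave optimality criterion are

\min_{x \in \mathcal{X}} \max_{y \in \mathcal{Y}} f(x, y),
\qquad
\operatorname{gap}(\hat{x}, \hat{y}) = \max_{y \in \mathcal{Y}} f(\hat{x}, y) - \min_{x \in \mathcal{X}} f(x, \hat{y}),

where the gap is nonnegative and vanishes exactly at saddle points. In the nonconvex-concave case one instead works with the primal function and its stationarity,

\phi(x) = \max_{y \in \mathcal{Y}} f(x, y),
\qquad
\text{optimality measure: } \|\nabla \phi(x)\|.

As one concrete illustration of why the choice of first-order method matters in this setting, the following minimal Python sketch compares simultaneous gradient descent-ascent (GDA) with the extragradient method on a toy one-dimensional bilinear game f(x, y) = x * y; the objective, step size, and iteration count are illustrative assumptions, not details from the talk.

# Toy bilinear game: min_x max_y f(x, y) = x * y, with saddle point (0, 0).
# A 1-D instance of the "bilinear problems" mentioned in the abstract.

def grad_x(x, y):
    return y  # df/dx for f(x, y) = x * y

def grad_y(x, y):
    return x  # df/dy for f(x, y) = x * y

def gda(x, y, eta=0.1, steps=2000):
    # Simultaneous gradient descent-ascent: descend in x, ascend in y.
    # On bilinear games its iterates spiral away from the saddle point.
    for _ in range(steps):
        x, y = x - eta * grad_x(x, y), y + eta * grad_y(x, y)
    return x, y

def extragradient(x, y, eta=0.1, steps=2000):
    # Extragradient (Korpelevich-style): take a look-ahead step, then
    # update with the gradients evaluated at the look-ahead point.
    for _ in range(steps):
        xh = x - eta * grad_x(x, y)
        yh = y + eta * grad_y(x, y)
        x, y = x - eta * grad_x(xh, yh), y + eta * grad_y(xh, yh)
    return x, y

print("GDA:          ", gda(1.0, 1.0))            # blows up (order 1e4)
print("extragradient:", extragradient(1.0, 1.0))  # approaches (0, 0)

On this game each GDA step expands the distance to the saddle point by a factor of sqrt(1 + eta^2), while the extragradient look-ahead yields a contraction for any step size eta < 1; this qualitative gap between superficially similar first-order methods is part of what motivates the complexity results the talk surveys.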