Machine Learning and Data Science PhD Student Forum Series (Session 65): Algorithms and Applications of Stochastic Bilevel Optimization with Finite-Time Convergence Guarantees

Posted: 2024-01-04

Speaker(s): 谢楚焓 (Peking University)

Time:2024-01-04 16:00-17:00

Venue: Tencent Meeting 551-1675-5419

Abstract:
Stochastic bilevel optimization (SBO) is a mathematical optimization framework for problems involving two levels of decision-making. The upper-level problem optimizes over a set of decision variables while incorporating the lower-level problem into its objective or constraints; the lower-level problem is typically a constrained optimization problem whose solution depends on the decision variables chosen at the upper level.
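For concreteness, the general SBO problem is commonly written in the following form (the notation below is a generic sketch and is not necessarily that of the talk):

\[
\min_{x \in \mathbb{R}^{d_x}} \; \Phi(x) := \mathbb{E}_{\xi}\!\left[ f\bigl(x, y^{*}(x); \xi\bigr) \right]
\quad \text{s.t.} \quad
y^{*}(x) \in \operatorname*{arg\,min}_{y \in \mathbb{R}^{d_y}} \; \mathbb{E}_{\zeta}\!\left[ g(x, y; \zeta) \right],
\]

where \(f\) and \(g\) are the upper- and lower-level objectives and \(\xi, \zeta\) denote random samples drawn by the stochastic oracles.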

In recent years, a wide range of machine learning tasks have been shown to fit into the SBO framework, such as meta-learning, distributionally robust optimization, and actor-critic methods. In this talk, we will first introduce these examples and then focus on iterative algorithms that solve the general SBO problem, together with their finite-time convergence guarantees. We show that with advanced variance-reduction techniques, the convergence rate for SBO can match the optimal rate of traditional single-level problems. We will also discuss the relationship between SBO and other optimization schemes and methods, including two-timescale stochastic approximation (TTSA) and stochastic compositional optimization (SCO), as well as potential research directions.
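As a concrete illustration of the kind of iterative algorithm in question, here is a minimal Python sketch of a double-loop stochastic bilevel method on a toy quadratic instance: the inner loop runs SGD on the lower-level problem, and the outer loop moves along a hypergradient estimate obtained with a truncated Neumann series. The toy problem, function names, and step sizes are all illustrative assumptions and are not the specific algorithms analyzed in the talk.

import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic bilevel instance (purely illustrative):
#   lower level: g(x, y) = 0.5 * ||y - A x||^2   =>  y*(x) = A x
#   upper level: f(x, y) = 0.5 * ||y - b||^2     =>  Phi(x) = 0.5 * ||A x - b||^2
d_x, d_y = 5, 5
A = rng.standard_normal((d_y, d_x)) / np.sqrt(d_x)
b = rng.standard_normal(d_y)
noise = 0.01  # std of additive noise, mimicking stochastic gradient oracles

def grad_y_g(x, y):
    # Stochastic gradient of the lower-level objective g with respect to y.
    return (y - A @ x) + noise * rng.standard_normal(d_y)

def grad_y_f(x, y):
    # Stochastic gradient of the upper-level objective f with respect to y.
    return (y - b) + noise * rng.standard_normal(d_y)

def hessian_yy_g(x, y):
    # Lower-level Hessian with respect to y (identity for this quadratic toy).
    return np.eye(d_y)

def jacobian_xy_g(x, y):
    # Mixed derivative d/dx of grad_y g, shape (d_x, d_y): here it equals -A^T.
    return -A.T

def hypergradient(x, y, neumann_steps=10, eta=0.5):
    # Estimate grad Phi(x) = grad_x f - grad_xy g [grad_yy g]^{-1} grad_y f,
    # approximating the Hessian-inverse-vector product by the truncated Neumann series
    # H^{-1} v ~= eta * sum_{k=0}^{K} (I - eta H)^k v.
    v = grad_y_f(x, y)
    H = hessian_yy_g(x, y)
    p, s = v.copy(), v.copy()
    for _ in range(neumann_steps):
        p = p - eta * (H @ p)
        s = s + p
    hinv_v = eta * s
    # grad_x f = 0 for this toy f, so only the implicit-function term remains.
    return -jacobian_xy_g(x, y) @ hinv_v

# Double-loop stochastic bilevel method.
x, y = np.zeros(d_x), np.zeros(d_y)
alpha, beta, inner_steps = 0.1, 0.5, 20  # upper step size, lower step size, inner SGD steps

for t in range(200):
    # Inner loop: approximate y*(x) with a few SGD steps on the lower-level problem.
    for _ in range(inner_steps):
        y = y - beta * grad_y_g(x, y)
    # Outer step: move x along the estimated hypergradient.
    x = x - alpha * hypergradient(x, y)

print("final upper-level objective Phi(x):", 0.5 * np.linalg.norm(A @ x - b) ** 2)

On this toy instance the hypergradient reduces to A^T(Ax - b), so the scheme behaves like noisy gradient descent on Phi; the variance-reduction techniques discussed in the talk replace the plain stochastic gradients above with recursive or momentum-based estimators to obtain faster rates.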

About the forum: This online forum is organized by Professor Zhihua Zhang's machine learning lab and is held once every two weeks (except on public holidays). Each session invites a PhD student to give a relatively systematic and in-depth introduction to a frontier topic; topics include, but are not limited to, machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.