Machine Learning and Data Science PhD Student Forum (Session 64) — An Introduction to Gradient-free Methods in Convex Optimization

Posted: 2023-12-21

Speaker(s): 陈坤 (Peking University)

Time:2023-12-21 16:00-17:00

Venue: Tencent Meeting 551-1675-5419

Abstract:
Gradient-free (zeroth-order) methods for convex optimization were developed in a wide range of works over the last decade, driven mainly by applications in reinforcement learning and statistics, such as the convex bandit problem. Recently, several generic approaches based on optimal first-order methods were proposed, which allow us to obtain black-box zeroth-order algorithms for optimization problems. These algorithms also behave well in terms of oracle complexity, iteration complexity, and the level of admissible noise.

In this talk, we will introduce the major approach for several kinds of zeroth-order optimization problems, which builds on first-order methods by approximating the gradient from function evaluations alone. We will then briefly discuss some specific optimization problems, such as the convex bandit problem, as well as some techniques from first-order methods that carry over to the gradient-free setting.
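To make the idea of "first-order methods with an approximated gradient" concrete, here is a minimal sketch of one standard instance: a two-point Gaussian-smoothing gradient estimator plugged into plain gradient descent. This is an illustrative example, not the speaker's specific algorithm; the function names, step size, and sample count are assumptions chosen for the demo.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, n_samples=20):
    """Two-point zeroth-order gradient estimator.

    Approximates grad f(x) using only function values:
        g ~ (1/n) * sum_i [(f(x + mu*u_i) - f(x)) / mu] * u_i,
    with random directions u_i ~ N(0, I). The smoothing radius mu
    and sample count are illustrative defaults.
    """
    rng = np.random.default_rng(0)  # fixed seed for a reproducible sketch
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape[0])
        g += (f(x + mu * u) - f(x)) / mu * u
    return g / n_samples

def zo_gradient_descent(f, x0, lr=0.1, steps=200):
    """Plug the zeroth-order estimator into ordinary gradient descent."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * zo_gradient(f, x)
    return x

# Minimize the smooth convex quadratic f(x) = ||x - 1||^2,
# whose unique minimizer is the all-ones vector.
f = lambda x: np.sum((x - 1.0) ** 2)
x_star = zo_gradient_descent(f, np.zeros(3))
```

The oracle cost of this scheme is the point of the complexity analysis mentioned above: each iteration spends `n_samples + 1` function evaluations in place of one gradient evaluation, and the variance of the estimator (not just its bias) governs how small a step size the first-order machinery can tolerate.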

About the forum: This online forum is organized by Professor Zhihua Zhang's machine learning lab and is held once every two weeks (except during public holidays). Each session invites a PhD student to give a systematic and in-depth introduction to a frontier topic, including but not limited to machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.