PhD Student Forum on Machine Learning and Data Science (Session 69) — Nearly-Linear Gradient-Based Algorithms for Decomposable Convex Optimization

Posted: 2024-04-11

Speaker(s): 忻宇辰 (Peking University)

Time: 2024-04-11 16:00-17:00

Venue: Tencent Meeting 446-9399-1513

Abstract:
Many fundamental problems in machine learning can be abstractly cast as decomposable convex optimization, i.e., minimizing a sum of convex functions. One common approach samples a single summand at each iteration to make progress, but its convergence rate depends on the condition number of the problem.
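To make the sampling approach concrete, here is a minimal illustrative sketch (not the algorithm presented in the talk): stochastic gradient descent on a toy decomposable objective f(x) = (1/n) Σ_i (x - b_i)², where each iteration samples one term uniformly and steps along that term's gradient. All names and parameters below are illustrative choices.

```python
import random

def sgd_decomposable(terms, lr=0.05, iters=2000, seed=0):
    """Minimize f(x) = (1/n) * sum_i (x - b_i)^2 by sampling one
    summand per iteration (plain SGD). The gradient of the sampled
    term is 2 * (x - b_i); the minimizer of the full sum is mean(b).
    (Illustrative sketch only; `lr`, `iters`, `seed` are assumed
    hyperparameters, not from the talk.)"""
    rng = random.Random(seed)
    x = 0.0
    for t in range(1, iters + 1):
        b = rng.choice(terms)          # sample one term uniformly
        grad = 2.0 * (x - b)           # gradient of the sampled term
        x -= (lr / t ** 0.5) * grad    # decaying step size
    return x

b = [1.0, 2.0, 3.0, 4.0]
x_hat = sgd_decomposable(b)            # approaches mean(b) = 2.5
```

The number of iterations such a scheme needs to reach a given accuracy typically scales with the condition number of f, which is exactly the dependence the talk's algorithm avoids.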

In this talk, we will introduce an algorithm for decomposable convex optimization, based on a recent work by Dong, Jiang, Lee, Padmanabhan, and Ye (NeurIPS 2022). This algorithm solves the problem with a nearly-linear number of gradient computations and makes no assumptions on the condition number.

About the forum: This online forum is organized by Professor 张志华's machine learning lab and is held biweekly (except during public holidays). Each session invites a PhD student to give a systematic, in-depth introduction to a frontier topic. Topics include, but are not limited to, machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.