
2021 Seminar Series of the School and Institute (No. 140): Assistant Professor Jin Zhang, Southern University of Science and Technology

Posted: 2021-10-22

Title: Towards Gradient-based Bilevel Optimization in Machine Learning

Speaker: Jin Zhang, Assistant Professor, Southern University of Science and Technology

Time: October 28, 2021, 10:40–11:20 AM

Venue: Tencent Meeting, ID: 959 559 128, password: 9999

Campus contact: Xinxin Li, xinxinli@jlu.edu.cn


Abstract: Recently, bilevel optimization (BLO) techniques have received extensive attention from the machine learning community. In this talk, we discuss some recent advances in the applications of BLO. First, we study a gradient-based bilevel optimization method for learning tasks with a convex lower level. In particular, by formulating bilevel models from the optimistic viewpoint and aggregating hierarchical objective information, we establish Bi-level Descent Aggregation (BDA), a flexible and modularized algorithmic framework for bilevel programming. Second, we turn to the many BLO models arising in complex practical tasks whose lower-level (follower) problems are non-convex in nature. In particular, we propose a new algorithmic framework, named Initialization Auxiliary and Pessimistic Trajectory Truncated Gradient Method (IAPTT-GM), to partially address the lower-level non-convexity. By introducing an auxiliary variable as initialization to guide the optimization dynamics and designing a pessimistic trajectory-truncation operation, we construct a reliable approximation to the original BLO in the absence of the lower-level convexity hypothesis. Extensive experiments justify our theoretical results and demonstrate the superiority of the proposed BDA and IAPTT-GM on different tasks, including hyperparameter optimization and meta-learning.
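To make the gradient-based idea concrete, here is a minimal, self-contained sketch (not the BDA or IAPTT-GM algorithms from the talk) of the common unrolled-differentiation approach: the lower-level solution is approximated by K gradient steps, the chain rule is propagated through that trajectory to obtain a hypergradient, and the upper-level variable is updated with it. The toy objectives, step sizes, and iteration counts are all illustrative assumptions.

```python
# Toy bilevel problem (illustrative only):
#   upper level:  min_x F(x, y*(x)) = 0.5*(y*(x) - 3)^2 + 0.1*x^2
#   lower level:  y*(x) = argmin_y g(x, y) = 0.5*(y - x)^2   (so y*(x) = x)
# We approximate y*(x) by K unrolled gradient-descent steps on g and
# differentiate through the trajectory by hand to get a hypergradient in x.

def hypergradient(x, K=50, alpha=0.5):
    y, dy_dx = 0.0, 0.0            # lower-level iterate and its sensitivity to x
    for _ in range(K):
        # gradient step on g:  y <- y - alpha * dg/dy = y - alpha * (y - x)
        y = y - alpha * (y - x)
        # differentiate the same update w.r.t. x (chain rule through the step)
        dy_dx = (1 - alpha) * dy_dx + alpha
    # total derivative: dF/dx = (dF/dy) * (dy/dx) + direct partial dF/dx
    grad = (y - 3.0) * dy_dx + 0.2 * x
    return grad, y

x = 0.0
for _ in range(200):               # upper-level gradient descent on x
    g, y = hypergradient(x)
    x -= 0.1 * g

# Since y*(x) = x, F reduces to 0.5*(x-3)^2 + 0.1*x^2, minimized at x = 2.5.
print(round(x, 3), round(y, 3))    # → 2.5 2.5
```

Because the lower level here is strongly convex, the unrolled trajectory converges to the true solution and the hypergradient is accurate; the non-convex lower-level case, which breaks this guarantee, is exactly what IAPTT-GM is designed to handle.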


Speaker bio: Jin Zhang is an assistant professor at Southern University of Science and Technology. He received his bachelor's and master's degrees from Dalian University of Technology and his Ph.D. from the University of Victoria, Canada, in 2014. He worked in the Department of Mathematics at Hong Kong Baptist University from 2015 to 2018 and joined Southern University of Science and Technology in early 2019. His research focuses on optimization theory, in particular the theory and algorithms of bilevel programming (mathematical programs with equilibrium constraints) and their applications in machine learning and theoretical economics. He has led several national-level grant projects, and his representative papers have appeared in top venues in operations research and machine learning, including Mathematical Programming, SIAM Journal on Optimization, SIAM Journal on Numerical Analysis, Journal of Machine Learning Research, and the International Conference on Machine Learning. Dr. Zhang received the 7th Youth Science and Technology Award of the Operations Research Society of China in 2020 and was selected for Shenzhen's Outstanding Young Talent Program for scientific and technological innovation in 2021.