Title: Unifying Non-Convex Low-Rank Matrix Recovery Algorithms by Riemannian Gradient Descent
Time: Monday, August 15, 2022, 10:00-11:00 a.m.
Venue: Tencent Meeting ID: 535-234-643, Passcode: 0422
Abstract: The problem of low-rank matrix recovery from linear samples arises in numerous practical applications in machine learning, imaging, signal processing, computer vision, etc. Non-convex algorithms are usually very efficient and effective for low-rank matrix recovery, often with theoretical guarantees despite the presence of possible local minima. In this talk, non-convex low-rank matrix recovery algorithms are unified under the framework of Riemannian gradient descent. We show that many popular non-convex low-rank matrix recovery algorithms are special cases of Riemannian gradient descent with different Riemannian metrics and retraction operators. Moreover, we identify the best choice of metrics and construct the most efficient non-convex algorithms for low-rank matrix recovery by considering properties of the sampling operators for different tasks such as matrix completion and phase retrieval.
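For readers less familiar with the framework, a schematic form of the Riemannian gradient descent iteration is sketched below; the notation is chosen here for illustration and is not taken from the talk:

\[
  X_{k+1} \;=\; \mathcal{R}_{X_k}\!\bigl(-\alpha_k \,\operatorname{grad} f(X_k)\bigr),
  \qquad
  f(X) \;=\; \tfrac{1}{2}\,\bigl\|\mathcal{A}(X) - y\bigr\|_2^2,
\]

where \(\mathcal{A}\) denotes the linear sampling operator (e.g., entrywise sampling for matrix completion, rank-one measurements for phase retrieval), \(y\) the observed measurements, \(\operatorname{grad} f\) the Riemannian gradient of \(f\) with respect to the chosen metric on the manifold of fixed-rank matrices, \(\mathcal{R}_{X_k}\) a retraction back onto that manifold, and \(\alpha_k\) a step size. Different choices of the metric and the retraction yield different non-convex recovery algorithms within this single template.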
Speaker Bio: Jian-Feng Cai (蔡剑锋) is a Professor in the Department of Mathematics at the Hong Kong University of Science and Technology. He received his B.S. from Fudan University in 2000 and his Ph.D. from The Chinese University of Hong Kong in 2007. He previously held positions at the National University of Singapore, the University of California, Los Angeles, and the University of Iowa, before joining the Department of Mathematics at HKUST in 2015. His research interests are the design and analysis of algorithms in data science and imaging. He was named a Highly Cited Researcher worldwide in 2017 and 2018. He has published over 100 papers in top mathematics journals such as J. Amer. Math. Soc. and in leading journals and conferences including Appl. Comput. Harmon. Anal., SIAM J. Optim., SIAM J. Imaging Sci., IEEE Trans. Image Process., IEEE Trans. Signal Process., CVPR, and ICCV.
Host: Meng Huang (黄猛)
(Reposted from the website of the School of Mathematical Sciences)