Academic Seminar of the Institute of Interdisciplinary Mathematical Sciences (Prof. 应益明, SUNY Albany)
Source: System Administrator    Published: 2023-12-06
Title: Statistical Learning Theory for Contrastive Representation Learning
Speaker: Prof. 应益明, SUNY Albany
Time: Friday, December 8, 2023, 10:30-11:30
Venue: Online via Tencent Meeting (Meeting ID: 210-356-418)
Abstract: In this talk, I will review the background of Contrastive Representation Learning (CRL) and discuss recent progress in establishing its learning-theory foundations. CRL has demonstrated impressive empirical performance as a self-supervised learning model, surpassing even supervised learning models in domains such as computer vision and natural language processing. The talk addresses two crucial theoretical questions: (1) How does the generalization behavior of downstream tasks benefit from the representation function built by CRL? (2) In particular, how does the number of negative (dissimilar) examples impact its learning performance? Our analysis reveals that generalization bounds for contrastive learning do not depend on the number of negative examples, up to logarithmic terms. The analysis uses structural results on empirical covering numbers and Rademacher complexities to exploit the Lipschitz continuity of loss functions.
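For context, a common way this literature formalizes CRL (a reference formulation only; the exact setting of the talk may differ) samples an anchor x, a similar example x^+, and k negative examples x_1^-, ..., x_k^-, and trains a representation f so that the similar pair scores above the dissimilar ones under a Lipschitz loss ℓ:

\[
L(f) \;=\; \mathbb{E}\,\ell\!\Big(\big\{\, f(x)^{\top}\big(f(x^{+})-f(x_i^{-})\big) \,\big\}_{i=1}^{k}\Big),
\qquad \text{e.g. } \ell(v)=\log\!\Big(1+\sum_{i=1}^{k}\exp(-v_i)\Big).
\]

In this notation, the result highlighted in the abstract says that, by exploiting the Lipschitz continuity of ℓ through empirical covering numbers and Rademacher complexities, the gap between the empirical and population risks L(f) can be bounded independently of k up to logarithmic factors, so using more negative examples does not inflate the generalization bound.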
Speaker Bio:
Invited by: 向道红