Generalization Analysis for Contrastive Representation Learning
Title: Generalization Analysis for Contrastive Representation Learning
Speaker: Prof. Yiming Ying (University at Albany, State University of New York)
Time: Friday, June 30, 2023, 9:00 AM
Venue: Tencent Meeting (online)
Meeting ID: 925-524-124
Meeting Password: 310058
Abstract: In this talk, I will present recent progress in establishing the learning-theoretic foundations of Contrastive Representation Learning (CRL). As a self-supervised learning model, CRL has demonstrated impressive empirical performance, surpassing even supervised learning models in domains such as computer vision and natural language processing. The talk addresses two central theoretical questions: 1) How does the generalization behavior of downstream tasks benefit from the representation function learned by CRL? 2) In particular, how does the number of negative examples affect its learning performance? Our analysis shows that generalization bounds for contrastive learning do not depend on the number of negative examples, up to logarithmic terms. The analysis relies on structural results for empirical covering numbers and Rademacher complexities that exploit the Lipschitz continuity of the loss functions.
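For context, the following is a minimal sketch of a contrastive objective commonly studied in this line of work; it is stated here as an illustrative assumption, and the exact objective analyzed in the talk may differ. Given an anchor x, a similar example x^+, and k negative examples x_1^-, ..., x_k^-, a representation f is trained so that the positive pair scores above each negative pair:

\[
\mathcal{L}(f) \;=\; \mathbb{E}\!\left[\ell\!\left(\Big\{ f(x)^\top \big( f(x^+) - f(x_i^-) \big) \Big\}_{i=1}^{k}\right)\right],
\qquad \text{e.g. } \ \ell(v) = \log\!\Big(1 + \sum_{i=1}^{k} e^{-v_i}\Big).
\]

Because the loss aggregates over all k negatives, a naive uniform-convergence argument yields bounds that grow with k; the result described above shows that, for Lipschitz losses such as the logistic loss, this dependence can be reduced to logarithmic factors.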
Bio: Yiming Ying is a Professor at SUNY Albany and the founding director of ML@UAlbany, a machine learning lab at UAlbany. His research focuses on statistical learning theory, trustworthy machine learning, and optimization. Dr. Ying is a recipient of the University at Albany's Presidential Award for Excellence in Research and Creative Activities (2022) and the SUNY Chancellor's Award for Excellence in Scholarship and Creative Activities (2023). He serves as an associate editor for several journals, including Transactions on Machine Learning Research, Neurocomputing, and Mathematical Foundations of Computing, and as an area chair for major machine learning conferences such as NeurIPS, ICML, and AISTATS.