Data Science | SPLBoost: An Improved Robust Boosting Algorithm Based on Self-paced Learning
Time: December 21, 16:00-17:00
Speaker: Dr. Yao Wang (Xi'an Jiaotong University)
It is known that Boosting can be interpreted as an optimization technique that minimizes an underlying loss function. Specifically, the loss minimized by traditional AdaBoost is the exponential loss, which has been shown to be very sensitive to random noise and outliers. Several Boosting algorithms, e.g., LogitBoost and SavageBoost, have therefore been proposed to improve the robustness of AdaBoost by replacing the exponential loss with specially designed robust loss functions. In this work, we present a new way to robustify AdaBoost, namely by incorporating the robust learning idea of Self-paced Learning (SPL) into the Boosting framework. Specifically, we design a new robust Boosting algorithm based on the SPL regime, called SPLBoost, which can be easily implemented by slightly modifying off-the-shelf Boosting packages. Extensive experiments and a theoretical characterization are also carried out to illustrate the merits of the proposed SPLBoost.
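To make the idea concrete, the following is a minimal sketch of how an SPL-style hard sample selector might be grafted onto a standard AdaBoost round. The names (splboost_sketch, spl_threshold, n_rounds) and the use of scikit-learn decision stumps are illustrative assumptions rather than the authors' SPLBoost implementation; the sketch only shows the core mechanism of zeroing the boosting weights of high-loss (possibly noisy) samples before fitting each weak learner.

    # Illustrative sketch, not the authors' code: AdaBoost-style rounds with
    # an SPL-like hard selector that drops high-loss samples each round.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def splboost_sketch(X, y, n_rounds=50, spl_threshold=2.0):
        # y is assumed to take values in {-1, +1}
        n = len(y)
        F = np.zeros(n)                  # current ensemble scores
        learners, alphas = [], []
        for _ in range(n_rounds):
            losses = np.exp(-y * F)                      # per-sample exponential loss
            v = (losses < spl_threshold).astype(float)   # SPL hard selector: exclude hard samples
            w = losses * v                               # boosting weights, zeroed for excluded samples
            if w.sum() == 0:
                break
            w = w / w.sum()
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)        # usual AdaBoost learner weight
            F += alpha * pred
            learners.append(stump)
            alphas.append(alpha)
        return learners, alphas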
Yao Wang received his Ph.D. in Applied Mathematics from Xi'an Jiaotong University and completed postdoctoral research in Management Science and Engineering at Xi'an Jiaotong University. He is currently an associate professor and doctoral supervisor at the Center for Intelligent Decision-making and Machine Learning, School of Management, Xi'an Jiaotong University. His research focuses on applications of machine learning methods to image and video data analysis, knowledge graphs, precision medicine, and recommender systems. He has published more than 40 papers in leading international journals and top conferences, including NSR, IEEE TIP, IEEE TNNLS, IEEE TGRS, IEEE TSP, IEEE TCYB, ICML, ICCV, and CVPR, and his research won the First Prize of the 2018 Shaanxi Science and Technology Award. He is currently the principal investigator of a subproject of a National Key R&D Program of China project and of a General Program project of the National Natural Science Foundation of China.
Contact: Zhengchu Guo (guozhengchu@zju.edu.cn)