Talk Title: The Beauty of Optimization in Machine Learning - the Evolving Needs of Optimization Techniques
Speaker: Dr. Quanming Yao
Dr. Quanming Yao is currently a leading researcher at 4Paradigm, where he manages the company's research group. He obtained his Ph.D. degree from the Department of Computer Science and Engineering at the Hong Kong University of Science and Technology (HKUST) in 2018 and received his bachelor's degree from Huazhong University of Science and Technology (HUST) in 2013. He is a recipient of the Qiming Star award (HUST, 2012), the Tse Cheuk Ng Tai Research Excellence Prize (CSE, HKUST, 2014-2015), a Google Fellowship (machine learning, 2016), and the Ph.D. Research Excellence Award (School of Engineering, HKUST, 2018-2019). He has published 23 papers in top-tier journals and conferences, including ICML, NeurIPS, JMLR, TPAMI, KDD, ICDE, CVPR, IJCAI, and AAAI. He was an outstanding reviewer for Neurocomputing in 2017; has served on the program committees of many prestigious conferences, including ICML, NeurIPS, CVPR, AAAI, and IJCAI; and was a committee member of the AutoML competitions at NeurIPS 2018, IJCNN 2019, and IJCAI 2019.
Optimization techniques have acted as the primary fuel that enables machine learning to draw benefits from larger data and more complex models. In this talk, I will present how we push the frontier of optimization for machine learning and how my research interests have evolved from sparse & low-rank learning, to deep learning, and now to automated machine learning (AutoML). Specifically, I will use three representative works, i.e., (1) nonconvex-to-convex transformation (N2C) for nonconvex sparse & low-rank learning, (2) Co-teaching for robust deep learning from noisy labels, and (3) searching for scoring functions for automated knowledge graph embedding (AutoKGE), to show how the fundamental, underlying optimization techniques have evolved from theoretical to practical, from simple to complex, and finally from white-box to black-box.