
Wang, Liwei

Professor

Research Interests: Machine learning, pattern recognition

Office Phone: 86-10-6275 6657

Email: wanglw@cis.pku.edu.cn

Liwei Wang is a professor in the Department of Computer Science and Technology, School of EECS, Peking University. He received the Ph.D. degree from the School of Mathematical Sciences, Peking University, in 2005, and the B.S. and M.S. degrees from the Department of Electronic Engineering, Tsinghua University, in 1999 and 2002, respectively. His research interests include machine learning and pattern recognition.

In the past five years, Dr. Wang has published 27 papers in top-tier machine learning journals and conferences such as the Journal of Machine Learning Research, NIPS, and COLT, as well as in leading pattern recognition and artificial intelligence venues such as IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Transactions on Image Processing, IJCAI, and AAAI. In 2010, IEEE Intelligent Systems named Dr. Wang one of "AI's 10 to Watch"; he was the first Asian researcher to receive this honor. He received the NSFC Excellent Young Researcher grant in 2012 and was also supported by the Ministry of Education's Program for New Century Excellent Talents in University. His 2008 COLT paper on the margin theory of AdaBoost was the first COLT paper by authors from mainland China since the conference was founded in 1988. His research achievements are summarized as follows:

1) Margin theory of Boosting: Boosting, represented by AdaBoost, is one of the most important achievements in machine learning. However, AdaBoost's resistance to overfitting appears to contradict Occam's razor. This phenomenon, known as the mystery of AdaBoost, was regarded by Prof. Leo Breiman, a member of the U.S. National Academy of Sciences, as one of the most important open problems in machine learning. To resolve this mystery, Dr. Wang proposed the Equilibrium margin (Emargin) theory, which provides an explanation that helped settle the long-standing academic debate over the margin theory of Boosting.
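The quantity at the center of this debate is the voting margin: for an example (x, y), the margin is y times the weighted vote of the weak learners, normalized by the total weight. A minimal sketch of AdaBoost over decision stumps, written only to show how these margins are computed (the data, stump pool, and round count are illustrative choices, not from the original work):

```python
import numpy as np

def adaboost_margins(X, y, stumps, n_rounds=50):
    """Run a tiny AdaBoost loop over a pool of weak learners and
    return the normalized voting margins y * f(x) / sum(alpha_t)."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # example weights, start uniform
    alphas, chosen = [], []
    for _ in range(n_rounds):
        # pick the stump with the smallest weighted error
        errs = [np.sum(w[h(X) != y]) for h in stumps]
        k = int(np.argmin(errs))
        err = max(errs[k], 1e-12)    # guard against log(0)
        if err >= 0.5:
            break                    # no weak learner better than chance
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stumps[k](X)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified examples
        w /= w.sum()
        alphas.append(alpha)
        chosen.append(stumps[k])
    # voting margin of each example: y * sum(alpha_t h_t(x)) / sum(alpha_t)
    f = sum(a * h(X) for a, h in zip(alphas, chosen))
    return y * f / sum(alphas)

# toy 1-D data with threshold stumps as weak learners
X = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([-1, -1, -1, 1, 1, 1])
stumps = [lambda X, t=t: np.where(X > t, 1, -1)
          for t in np.linspace(-1.5, 1.5, 7)]
margins = adaboost_margins(X, y, stumps)
print(margins)  # all positive once the training error reaches zero
```

Margin theory argues that AdaBoost keeps pushing these margins up even after the training error hits zero, which is why extra rounds need not cause overfitting.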

2) New paradigms of machine learning: Traditional supervised learning faces great challenges in the age of big data. Active learning, which aims to reduce the manual annotation required for learning, has become one of the new paradigms of machine learning. Dr. Wang proved that, under certain smoothness conditions, the label complexity of active learning is exponentially smaller than that of passive learning, a result at the forefront of current theoretical research. Privacy protection is a prerequisite for learning from sensitive data. Building on the concept of differential privacy, he proposed novel mechanisms for both answering smooth queries and releasing synthetic databases, which outperform existing methods.
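The flavor of such an exponential gap can be seen in the textbook case of threshold classifiers on [0, 1] (this classic illustration is not Dr. Wang's specific result, which concerns more general smoothness conditions): passive learning needs on the order of 1/eps labels to locate a threshold to accuracy eps, while an active learner that chooses its own query points finds it with only log(1/eps) labels by binary search:

```python
def active_learn_threshold(oracle, eps=1e-3):
    """Binary-search a threshold classifier on [0, 1].

    oracle(x) returns the label (+1 if x >= true threshold, else -1).
    Returns an estimate within eps using O(log(1/eps)) label queries,
    versus the O(1/eps) labels passive random sampling would need.
    """
    lo, hi = 0.0, 1.0
    queries = 0
    while hi - lo > eps:
        mid = (lo + hi) / 2
        queries += 1
        if oracle(mid) == 1:
            hi = mid          # threshold is at or below mid
        else:
            lo = mid          # threshold is above mid
    return (lo + hi) / 2, queries

true_t = 0.3137               # unknown to the learner
est, queries = active_learn_threshold(lambda x: 1 if x >= true_t else -1)
print(est, queries)           # ~10 queries for eps = 1e-3
```

Each query halves the interval containing the threshold, so the label count grows only logarithmically in 1/eps; this is the "exponential order" separation between active and passive label complexity.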

3) Pattern recognition methods: Traditional criteria for measuring the distance between images are sensitive to image noise. To improve robustness, Dr. Wang proposed the Image Euclidean Distance (IMED) criterion and further embedded IMED into learning algorithms, achieving a quadratic speed-up of these algorithms. The related papers have been cited more than 200 times, including by several distinguished academicians. He also established a theory of machine learning based on the dissimilarity of objects, one of the two earliest theories in this field.
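The idea behind IMED is that pixel differences should be weighted by how spatially close the pixels are, so a small shift of image content costs little. A minimal sketch, assuming a Gaussian spatial weighting with a bandwidth parameter sigma (the image sizes and sigma value here are illustrative):

```python
import numpy as np

def imed(img1, img2, sigma=1.0):
    """Image Euclidean Distance between two equal-size grayscale images.

    Unlike the plain Euclidean distance, IMED couples pixel differences
    through the spatial closeness of the pixels:
        d^2 = (x - y)^T G (x - y),
        G_ij = exp(-|P_i - P_j|^2 / (2 sigma^2)) / (2 pi sigma^2),
    where P_i is the 2-D coordinate of pixel i.
    """
    h, w = img1.shape
    coords = np.array([(r, c) for r in range(h) for c in range(w)], dtype=float)
    # pairwise squared distances between pixel locations
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    G = np.exp(-d2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    diff = (img1 - img2).ravel().astype(float)
    return float(diff @ G @ diff)

# a one-pixel shift yields a smaller IMED than an unrelated difference
a = np.zeros((5, 5)); a[2, 2] = 1.0
b = np.zeros((5, 5)); b[2, 3] = 1.0   # same dot, shifted one pixel
c = np.zeros((5, 5)); c[0, 0] = 1.0   # dot in a far corner
print(imed(a, b) < imed(a, c))  # True: IMED treats the shift as a small change
```

Because G is a fixed positive-definite matrix, the distance can equivalently be computed by linearly transforming the images once and then using the ordinary Euclidean distance, which is what makes embedding IMED into existing learning algorithms cheap.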