Personnel Profile

WANG, Benyou


POSITION/TITLE

Research Scientist

Assistant Professor

RESEARCH FIELD

Natural Language Processing, Information Retrieval, Applied Machine Learning, Quantum Machine Learning

EMAIL

wangbenyou@cuhk.edu.cn

PERSONAL WEBSITE

https://wabyking.github.io/old.html

EDUCATION BACKGROUND 

Ph.D. Information Engineering (Information Science and Technology), University of Padua, 2022
M.Eng. Pattern Recognition and Intelligent Systems, Tianjin University, 2017
B.Eng. Software Engineering, Hubei University of Automotive Technology, 2014

Major Achievements/Honors

NLPCC 2022 Best Paper

Best Explainable NLP Paper in NAACL 2019

Best Paper Award Honorable Mention in SIGIR 2017

Huawei Spark Award

BIOGRAPHY

Benyou Wang received his Ph.D. degree from the University of Padua, Italy, funded by a Marie Curie Fellowship, and obtained his master's degree from Tianjin University. During his research career, he has visited the University of Copenhagen (Denmark), the University of Montreal (Canada), the University of Amsterdam (the Netherlands), Huawei Noah's Ark Lab (Shenzhen, China), the Institute of Theoretical Physics (Chinese Academy of Sciences, Beijing, China), and the Institute of Linguistics (Chinese Academy of Social Sciences, Beijing, China). He is committed to building explainable, robust, and efficient natural language processing approaches grounded in both technical rationality and linguistic motivation. He and his collaborators have won the Best Paper Award Honorable Mention at SIGIR 2017 (a top conference in information retrieval) and the Best Explainable NLP Paper award at NAACL 2019 (a top conference in natural language processing). He has published many papers in top international conferences such as NeurIPS, ICLR, SIGIR, WWW, and NAACL, and in top international journals such as ACM Transactions on Information Systems (TOIS), IEEE Transactions on Cybernetics, and Theoretical Computer Science (TCS). According to Google Scholar, he had accumulated nearly 1,000 citations with an h-index of 16 by the time he received his Ph.D. degree.

ACADEMIC PUBLICATIONS

[1] Jianquan Li, Xiangbo Wu, Xiaokang Liu, Prayag Tiwari, Qianqian Xie, Benyou Wang. Can Language Models Make Fun? A Case Study in Chinese Comical Crosstalk. ACL 2023. (CCF A)

[2] Yajiao Liu, Xin Jiang, Yichun Yin, Yasheng Wang, Fei Mi, Qun Liu, Xiang Wan, Benyou Wang. One Cannot Stand for Everyone! Leveraging Multiple User Simulators to Train Task-oriented Dialogue Systems. ACL 2023. (CCF A)

[3] Benyou Wang, Qianqian Xie, Jiahuan Pei, Zhihong Chen, Prayag Tiwari, Zhao Li, Jie Fu. Pre-trained Language Models in Biomedical Domain: A Systematic Survey. ACM Computing Surveys. (SCI Q1 top)

[4] Benyou Wang, Yuxin Ren, Lifeng Shang, Xin Jiang, Qun Liu. Exploring extreme parameter compression for pre-trained language models. ICLR 2022

[5] Peng Zhang, Wenjie Hui, Benyou Wang, Donghao Zhao, Dawei Song, Christina Lioma, Jakob Grue Simonsen. Complex-valued Neural Network-based Quantum Language Models. ACM Transactions on Information Systems, 2022. (CCF A)

[6] Benyou Wang, Emanuele Di Buccio, Massimo Melucci. Word2Fun: Modelling Words as Functions for Diachronic Word Representation. NeurIPS 2021 (CCF A)

[7] Benyou Wang, Lifeng Shang, Christina Lioma, Xin Jiang, Hao Yang, Qun Liu, Jakob Grue Simonsen. On Position Embeddings in BERT. ICLR 2021.

[8] Benyou Wang, Donghao Zhao, Christina Lioma, Qiuchi Li, Peng Zhang, Jakob Grue Simonsen. Encoding Word Order in Complex Embeddings. ICLR 2020.

[9] Benyou Wang, Qiuchi Li, Massimo Melucci, Dawei Song. Semantic Hilbert Space for Text Representation Learning. The Web Conference 2019 (WWW). (CCF A)

[10] Qiuchi Li, Benyou Wang, Massimo Melucci. CNM: An Interpretable Complex-valued Network for Matching. NAACL 2019. Best Explainable NLP Paper. (CCF B)