I am a postdoc at the Language Technologies Institute (LTI) of Carnegie Mellon University, working with Profs. Graham Neubig, Teruko Mitamura, and Jaime Carbonell. I am a member of NeuLab.

I obtained my Ph.D. from the School of Computer Science at Fudan University (Nov. 2019), advised by Prof. Xipeng Qiu and Prof. Xuanjing Huang.

Specifically, I am interested in the following research directions in natural language processing:

  • Interpretable Analysis for NLP-oriented Neural Networks
  • Learning transferable, disentangled, interpretable representations (e.g., multi-task, transfer, and meta-learning)
  • Text Summarization
  • Sequence Labeling (e.g., NER, CWS)

Gratefully, during my Ph.D. I received multiple fellowship awards (Baidu Scholarship, Tencent AI Lab Fellowship, Microsoft Research Asia Fellowship, IBM Ph.D. Fellowship), which are among the highest fellowship honors for Chinese Ph.D. students in computer science. I am the only student to have won all of these awards simultaneously. I'm on Twitter.

Jan. 2020

[Research][NEW] Prof. Graham Neubig and I have released the NN4NLP-concept project, a typology of important concepts you should know to implement state-of-the-art NLP models with neural networks. The project was originally created for the CMU NLP class (CS 11-747), but its scope goes beyond that. If you are a new NLPer wondering how to start your NLP journey, I highly recommend checking it out. (Here is a Chinese blog post about it.)

Jan. 2020

[Research][NEW] I set up a "NER-Paperlist" project. If you are interested in the NER task, I highly recommend checking it out.

Nov. 2017

[Collaboration] I began a one-year stay as a visiting scholar at the Montreal Institute for Learning Algorithms (MILA) in Canada.

Mar. 20, 2018

[Award] I received the Tencent AI Lab PhD Fellowship (40,000 USD), awarded to five students worldwide. I am lucky to be the only award winner from a Chinese university, and to share the honor with these awesome people: Daniel Fried (UCB), Yuke Zhu (Stanford), Zhuoran Yang (Princeton), and Noam Brown (CMU).

Oct. 19, 2017

[Award] I received the Microsoft Research Asia Fellowship (10,000 USD), awarded to ten students in the Asia-Pacific region.

Mar. 24, 2017

[Award] I received the IBM PhD Fellowship (6,000 USD), awarded to about 80 students worldwide.

Dec. 22, 2016

[Award] I received the Baidu Scholarship (30,000 USD), awarded to ten Chinese students worldwide. Here is a promotional video about me.

Oct. 8, 2018

[Travel & Talk] I gave a talk with the title "Frontiers in NLP 2018" in MILA.

April 23, 2018

[Travel & Talk] I gave a talk with the title "Multi-task Learning for Natural Lanugage Processing" in MILA.

May 18, 2017

[Travel & Talk] I gave a talk with the title "Neural Representation Learning in NLP". Here is the slide and video.

July 4, 2016

[Travel & Talk] I was invited to give a seminar on Convolutional Neural Networks and Recursive Neural Networks, hosted by the Chinese Information Processing Society of China in Beijing. I am honored to have been rated one of the best-loved speakers by the audience.

Highlights


Paper lists:     NER Paper List   Summarization Paper List   ACL/EMNLP/NAACL 2019 Paper List
Concepts:       NER Concepts   Summarization Concepts   ACL/EMNLP/NAACL 2019 Concepts

PhD Thesis

[NEW] Neural Representation Learning for Natural Language Processing,
Pengfei Liu
PhD thesis, Fudan University, 2019 [ bib ] [ Acknowledgments ] [ Proj ]

Publications [Google Scholar] [By Date] [By Topic]

(*: equal contribution)

NLP Task-oriented Interpretable Analysis for Neural Networks

[NEW] Rethinking Generalization of Neural Models: A Named Entity Recognition Case Study,
Jinlan Fu*, Pengfei Liu*, Qi Zhang, Xuanjing Huang
Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI 2020). [ pdf ] [ Project ]

[NEW] A Closer Look at Data Bias in Neural Extractive Summarization Models ,
Ming Zhong*, Danqing Wang*, Pengfei Liu*, Xipeng Qiu, Xuanjing Huang
Conference on Empirical Methods in Natural Language Processing (EMNLP 2019) Workshop on New Frontiers in Summarization [ pdf]

[NEW] Searching for Effective Neural Extractive Summarization: What Works and What’s Next,
Ming Zhong*, Pengfei Liu*, Danqing Wang, Xipeng Qiu, Xuanjing Huang
The annual meeting of the Association for Computational Linguistics (ACL 2019, Long). [ pdf, Proj. with code ]

Graph Neural Networks and Transformer

[NEW] Contextualized non-local neural networks for sequence learning,
Pengfei Liu, Shuaicheng Chang, Jian Tang, Jackie Chi Kit Cheung
Proceedings of the 33rd AAAI Conference on Artificial Intelligence (AAAI 2019). [ pdf ] [ story ]

[NEW] Learning multi-task communication with message passing,
Pengfei Liu, Jie Fu*, Yue Dong*, Xipeng Qiu, Jackie Chi Kit Cheung
Proceedings of the 33rd AAAI Conference on Artificial Intelligence (AAAI 2019). [ pdf ] [ story ]

[NEW] Star Transformer,
Qipeng Guo, Xipeng Qiu, Pengfei Liu, Yunfan Shao, Xiangyang Xue, Zheng Zhang
The North American Chapter of the Association for Computational Linguistics (NAACL 2019). [ pdf ] [ code ]

[NEW] Multi-Scale Self-Attention for Text Classification,
Qipeng Guo, Xipeng Qiu, Pengfei Liu, Xiangyang Xue, Zheng Zhang
Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI 2020). [ pdf ]

[NEW] Dynamic Interaction Networks for Image-Text Multimodal Learning,
Wenshan Wang*, Pengfei Liu*, Su Yang, Weishan Zhang
Neurocomputing. [ pdf ]

Multi-task / Transfer / Meta Learning

Adversarial Multi-task Learning for Text Classification,
Pengfei Liu, Xipeng Qiu, and Xuanjing Huang
The annual meeting of the Association for Computational Linguistics (ACL 2017, Long). [ pdf, project with dataset and code, Slide ]

Recurrent Neural Network for Text Classification with Multi-Task Learning,
Pengfei Liu, Xipeng Qiu, Xuanjing Huang
Proceedings of International Joint Conference on Artificial Intelligence (IJCAI 2016). [ pdf, demo[Chinese], demo[English], poster, slide, code ]

[NEW] Learning Sparse Sharing Architectures for Multiple Tasks,
Tianxiang Sun, Yunfan Shao, Xiaonan Li, Pengfei Liu, Hang Yan, Xipeng Qiu, Xuanjing Huang
Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI 2020). [ pdf ]

[NEW] Zero-shot Text-to-SQL Learning with Auxiliary Task,
Shuaichen Chang, Pengfei Liu, Yun Tang, Jing Huang, Xiaodong He, Bowen Zhou
Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI 2020). [ pdf ]

[NEW] Meta-Learning Multi-task Communication,
Pengfei Liu, Xuanjing Huang
arXiv Preprint 2018. [ pdf, blog ]

Deep Multi-Task Learning with Shared Memory,
Pengfei Liu, Xipeng Qiu and Xuanjing Huang
Conference on Empirical Methods in Natural Language Processing (EMNLP 2016, Long). [ pdf ]

Meta Multi-Task Learning for Sequence Modeling,
Junkun Chen, Xipeng Qiu, Pengfei Liu, Xuanjing Huang
Proceedings of the 32nd AAAI Conference on Artificial Intelligence (AAAI 2018). [ pdf ]

Semantic Composition

Dynamic Compositional Neural Networks over Tree Structure,
Pengfei Liu, Xipeng Qiu, Xuanjing Huang
Proceedings of International Joint Conference on Artificial Intelligence (IJCAI 2017). [ pdf, project with code ]

Idiom-Aware Compositional Distributed Semantics,
Pengfei Liu, Kaiyu Qian, Xipeng Qiu and Xuanjing Huang
Conference on Empirical Methods in Natural Language Processing (EMNLP 2017, Long). [ pdf ]

Adaptive Semantic Compositionality for Sentence Modelling,
Pengfei Liu, Xipeng Qiu, Xuanjing Huang
Proceedings of International Joint Conference on Artificial Intelligence (IJCAI 2017). [ pdf ]

Modelling Interaction of Sentence Pair with Coupled-LSTMs,
Pengfei Liu, Xipeng Qiu and Xuanjing Huang
Conference on Empirical Methods in Natural Language Processing (EMNLP 2016, Long). [ pdf ]

Deep Fusion LSTMs for Text Semantic Matching,
Pengfei Liu, Xipeng Qiu, Jifan Chen, and Xuanjing Huang
The annual meeting of the Association for Computational Linguistics (ACL 2016, Long). [ pdf, code]

Implicit Discourse Relation Detection via a Deep Architecture with Gated Relevance Network,
Jifan Chen, Qi Zhang, Pengfei Liu, Xipeng Qiu and Xuanjing Huang
The annual meeting of the Association for Computational Linguistics (ACL 2016, Long). [ pdf ]

Discourse Relations Detection via a Mixed Generative-Discriminative Framework,
Jifan Chen, Qi Zhang, Pengfei Liu, Xuanjing Huang
Proceedings of the 30th AAAI Conference on Artificial Intelligence (AAAI 2016). [ pdf ]

Syntax-based Attention Model for Natural Language Inference,
Pengfei Liu, Xipeng Qiu and Xuanjing Huang
arXiv Preprint 2016. [ pdf ]

Multi-Scale

[NEW] Multi-Scale Self-Attention for Text Classification,
Qipeng Guo, Xipeng Qiu, Pengfei Liu, Xiangyang Xue, Zheng Zhang
Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI 2020). [ pdf ]

Multi-Timescale Long Short-Term Memory Neural Network for Modelling Sentences and Documents,
Pengfei Liu, Xipeng Qiu, Xinchi Chen, Shiyu Wu, Xuanjing Huang
Conference on Empirical Methods in Natural Language Processing (EMNLP 2015, Long). [ pdf, slide ]

Learning Context-Sensitive Word Embeddings with Neural Tensor Skip-Gram Model,
Pengfei Liu, Xipeng Qiu, Xuanjing Huang
Proceedings of International Joint Conference on Artificial Intelligence (IJCAI 2015). [ pdf, poster, slide ]

Others

[NEW] DropAttention: A Regularization Method for Fully-Connected Self-Attention Networks,
Lin Zehui, Pengfei Liu, Luyao Huang, Junkun Chen, Xipeng Qiu, Xuanjing Huang
arXiv Preprint 2019 [ pdf]

[NEW] TIGS: An Inference Algorithm for Text Infilling with Gradient Search,
Dayiheng Liu, Jie Fu, Pengfei Liu, Jiancheng Lv
The annual meeting of the Association for Computational Linguistics (ACL 2019, Long). [ pdf ]

Long Short-Term Memory Neural Networks for Chinese Word Segmentation,
Xinchi Chen, Xipeng Qiu, Chenxi Zhu, Pengfei Liu, Xuanjing Huang
Conference on Empirical Methods in Natural Language Processing (EMNLP 2015, Long). [ pdf ]