INVITED TALK I



                                  Xuanjing Huang

                                  Professor, PhD Supervisor

                                  Fudan University



Biography: Xuanjing Huang is a Professor of the School of Computer Science, Fudan University, Shanghai, China. She received her Ph.D. degree in Computer Science from Fudan University in 1998. From 2008 to 2009, she was a visiting scholar at CIIR, UMass Amherst. Her research interests include natural language processing, information retrieval, artificial intelligence, deep learning and data-intensive computing. She has published more than 100 papers in major conferences including ACL, SIGIR, IJCAI, AAAI, NIPS, ICML, CIKM, EMNLP, WSDM and COLING. In the research community, she has served as the PC Co-Chair of CCL 2019, NLPCC 2017, CCL 2016, SMP 2015 and SMP 2014, an organizer of WSDM 2015, competition chair of CIKM 2014, tutorial chair of COLING 2010, and SPC or PC member of past WSDM, SIGIR, WWW, CIKM, ACL, IJCAI, KDD, EMNLP, COLING and many other conferences.

Title: Representation Learning in Natural Language Processing.

Abstract: Recently, deep learning has provided powerful new techniques that have been successfully applied to NLP tasks, ranging from text classification to sequence labeling, and from machine translation to question answering. These neural models not only compete with, and in some cases outperform, traditional statistical approaches, but can also be trained as a single end-to-end model that requires no task-specific feature engineering. In this talk, I will first give a brief overview of the current state of deep learning research in NLP, especially neural representation learning, which converts text spans such as words, phrases, sentences and sentence pairs into real-valued vectors. Next, I will introduce the frontiers of neural representation learning for NLP, ranging from models beyond RNNs, such as graph neural networks, Transformers and pre-trained embeddings, to various learning schemes such as transfer learning, multi-task learning and meta-learning.
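To make the notion of representation learning concrete, here is a minimal sketch of mapping tokens to real-valued vectors with an embedding table and averaging them into a sentence vector. The vocabulary, dimensionality and random initialization below are purely illustrative assumptions, not part of the talk; in practice the embeddings would be learned end to end.

```python
# Minimal sketch: word-level representations as real-valued vectors.
# Vocabulary, dimensions and random initialization are illustrative only;
# real embeddings are learned from data (e.g. end to end with the task).
import numpy as np

rng = np.random.default_rng(0)

vocab = {"natural": 0, "language": 1, "processing": 2, "<unk>": 3}
embed_dim = 8                                           # illustrative size
embeddings = rng.normal(size=(len(vocab), embed_dim))   # would be learned

def sentence_vector(tokens):
    """Look up each token's vector and average them into one sentence vector."""
    ids = [vocab.get(t, vocab["<unk>"]) for t in tokens]
    return embeddings[ids].mean(axis=0)

print(sentence_vector(["natural", "language", "processing"]).shape)  # (8,)
```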

INVITED TALK II


Luo Si

Senior Researcher

Alibaba Inc.



Biography: Dr. Luo Si is a Distinguished Engineer / Vice President of Alibaba Group Inc. He is also the Chief Scientist of Natural Language Processing at Alibaba DAMO Academy. He leads a cross-country team in China, the USA and Singapore focused on developing cutting-edge technologies in natural language processing, machine translation, text mining and information retrieval. This work reaches hundreds of millions of users and generates millions in revenue each day. Luo has published more than 150 journal and conference papers with substantial citations. His research has received many industry awards from Yahoo!, Google and Alibaba, as well as an NSF CAREER award. Prior to joining Alibaba in 2014, he was a tenured professor at Purdue University. He obtained his BS, MS and Ph.D. degrees in computer science from Tsinghua University and Carnegie Mellon University.

Title: Natural Language Processing R&D for E-commerce and Beyond.

Abstract: Natural Language Processing (NLP) and related technologies are critical for the success of Internet businesses such as e-commerce. Alibaba’s NLP R&D aims to support the business demands of Alibaba’s ecosystem, create new opportunities for Alibaba’s partners, and advance the state of the art in NLP technologies. This talk will introduce our efforts to build an NLP technology platform and a machine translation (MT) platform that power Alibaba’s ecosystem. Furthermore, some recent research will be presented on product title compression with user-log information, sentiment classification with questions and answers, machine reading comprehension for real-world customer service, and cascade ranking for large-scale e-commerce search. This R&D work reaches hundreds of millions of users and generates significant business value every day.

Program File

The detailed program file is available here.