Customer-obsessed science
- December 1, 2025 | 8 min read
“Network language models” will coordinate complex interactions among intelligent components, computational infrastructure, access points, data centers, and more.
Featured news
- EMNLP 2021 Workshop on Simple and Efficient Natural Language Processing (SustaiNLP), 2021
Pretrained transformer-based encoders such as BERT have been demonstrated to achieve state-of-the-art performance on numerous NLP tasks. Despite their success, BERT-style encoders are large in size and have high latency during inference (especially on CPU machines), which makes them unappealing for many online applications. Recently introduced compression and distillation methods have provided effective ways …
- EMNLP 2021
Even though large pre-trained multilingual models (e.g., mBERT, XLM-R) have led to significant performance gains on a wide range of cross-lingual NLP tasks, success on many downstream tasks still relies on the availability of sufficient annotated data. Traditional fine-tuning of pre-trained models using only a few target samples can cause over-fitting. This can be quite limiting, as most languages in the …
-
2021 International Workshop on Spoken Dialog System Technology2021User ratings play a significant role in spoken dialogue systems. Typically, such ratings tend to be averaged across all users and then utilized as feedback to improve the system or personalize its behavior. While this method can be useful to understand broad, general issues with the system and its behavior, it does not take into account differences between users that affect their ratings. In this work,
- EMNLP 2021
For voice assistants like Alexa, Google Assistant, and Siri, correctly interpreting users’ intentions is of utmost importance. However, users sometimes experience friction with these assistants, caused by errors from different system components or user errors such as slips of the tongue. Users tend to rephrase their query until they get a satisfactory response. Rephrase detection is used to identify the …
- EMNLP 2021
Paraphrase generation is a longstanding NLP task that has diverse applications for downstream NLP tasks. However, the effectiveness of existing efforts predominantly relies on large amounts of golden labeled data. Though unsupervised endeavors have been proposed to address this issue, they may fail to generate meaningful paraphrases due to the lack of supervision signals. In this work, we go beyond the …
Collaborations
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.