-
AAAI 2022 Workshop on Combining Learning and Reasoning: Programming Languages, Formalisms, and Representations (CLeaR), 2022: Language-enabled AI systems can answer complex, multi-hop questions with high accuracy, but supporting answers with evidence is a more challenging task that is important for transparency and trustworthiness to users. Prior work in this area typically trades off efficiency against accuracy; state-of-the-art deep neural network systems are too cumbersome to be useful in large-scale applications …
-
AAAI 2022: Question answering over semi-structured tables has attracted significant attention in the NLP community. However, most existing work focuses on questions that can be answered with a short-form answer, i.e., the answer is often a table cell or an aggregation of multiple cells. This can mismatch the intents of users who want to ask more complex questions that require free-form answers, such as explanations …
-
AAAI 2022: Recent years have seen significant advances in multi-turn Spoken Language Understanding (SLU), where dialogue contexts are used to guide intent classification and slot filling. However, how to selectively incorporate dialogue contexts, such as previous utterances and dialogue acts, into multi-turn SLU remains a substantial challenge. In this work, we propose a novel contextual SLU model for multi-turn …
-
AAAI 2022: Transformers are state of the art in a wide range of NLP tasks and have also been applied to many real-world products. Understanding the reliability and certainty of transformer model predictions is crucial for building trustworthy machine learning applications, e.g., medical diagnosis. Although many recent transformer extensions have been proposed, the study of uncertainty estimation for transformer models …
-
ICON 2021: Text Style Transfer (TST) aims to alter the underlying style of the source text to another specific style while keeping the content the same. Due to the scarcity of high-quality parallel training data, unsupervised learning has become a trending direction for TST tasks. In this paper, we propose a novel VAE-based Text Style Transfer with pivOt Words Enhancement leaRning (VT-STOWER) method, which utilizes Variational …
Related content
-
July 22, 2020: Amazon scientists are seeing increases in accuracy from an approach that uses a new scalable embedding scheme.
-
July 22, 2020: Dialogue simulator and conversations-first modeling architecture give customers the ability to interact with Alexa in a natural and conversational manner.
-
July 8, 2020: New method extends virtual adversarial training to sequence-labeling tasks, which assign different labels to different words of an input sentence.
-
July 7, 2020: Watch the replay of the live interview with Alexa evangelist Jeff Blankenburg.
-
July 6, 2020: After nearly 40 years of research, the ACL 2020 keynote speaker sees big improvements coming in three key areas.
-
July 2, 2020: Amazon researchers coauthor 17 conference papers and participate in seven workshops.