Context-aware self-attentive natural language understanding for task-oriented chatbots
2019
Natural Language Understanding (NLU) is a core component of dialog systems. It typically involves two tasks: intent classification (IC) and slot labeling (SL), which are then followed by a dialogue management (DM) component. Such NLU systems cater to utterances in isolation, thus pushing the problem of context management to DM. However, contextual information is critical to the correct prediction of intents and slots in a conversation. Prior work on contextual NLU has been limited in terms of the types of contextual signals used and the understanding of their impact on the model. In this work, we propose a context-aware self-attentive NLU (CASA-NLU) model that uses multiple signals, such as previous intents, slots, dialog acts and utterances over a variable context window, in addition to the current user utterance. CASA-NLU outperforms a recurrent contextual NLU baseline on two conversational datasets, yielding a gain of up to 7% on the IC task for one of the datasets. Moreover, a non-contextual variant of CASA-NLU achieves state-of-the-art performance for IC tasks on standard public datasets, SNIPS and ATIS.
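To make the idea concrete, here is a minimal sketch of a context-aware joint IC/SL model: embeddings of signals from previous turns (here, prior intents and dialog acts over a fixed window) are concatenated with the current utterance's token embeddings and passed through self-attention before prediction. This is an illustrative assumption, not the authors' CASA-NLU implementation; the class name, dimensions, and signal choices are hypothetical.

```python
# Illustrative sketch only -- not the published CASA-NLU architecture.
# Names, dimensions, and the set of context signals are assumptions.
import torch
import torch.nn as nn

class ContextualNLU(nn.Module):
    """Joint intent classification and slot labeling over a dialog context window."""
    def __init__(self, vocab_size, n_intents, n_slots, n_acts,
                 d_model=128, n_heads=4, ctx_window=3):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        # Embeddings for contextual signals from the previous turns.
        self.intent_emb = nn.Embedding(n_intents, d_model)
        self.act_emb = nn.Embedding(n_acts, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.intent_head = nn.Linear(d_model, n_intents)
        self.slot_head = nn.Linear(d_model, n_slots)
        self.ctx_window = ctx_window

    def forward(self, utt_ids, prev_intents, prev_acts):
        # utt_ids: (B, T) token ids of the current utterance
        # prev_intents, prev_acts: (B, W) ids from the last W turns
        tokens = self.tok_emb(utt_ids)                            # (B, T, d)
        context = torch.cat([self.intent_emb(prev_intents),
                             self.act_emb(prev_acts)], dim=1)     # (B, 2W, d)
        # Self-attention over the concatenated context + utterance sequence.
        seq = torch.cat([context, tokens], dim=1)
        attended, _ = self.attn(seq, seq, seq)
        ctx_len = context.size(1)
        # Intent from the pooled utterance positions; slot tags per token.
        intent_logits = self.intent_head(attended[:, ctx_len:].mean(dim=1))
        slot_logits = self.slot_head(attended[:, ctx_len:])
        return intent_logits, slot_logits
```

Dropping the context embeddings (feeding only `tokens` to the attention layer) gives the non-contextual variant analogous to the one evaluated on SNIPS and ATIS.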
Research areas