Customer-obsessed science
- November 28, 2025 (4 min read): Large language models are increasing the accuracy, reliability, and consistency of the product catalogue at scale.
Featured news
- EMNLP 2022: With ever-growing digital adoption in society and increasing demand for businesses to deliver to customers' doorsteps, the last-mile hop of transportation planning poses unique challenges in emerging geographies with unstructured addresses. One of the crucial inputs to facilitate effective planning is the task of geolocating customer addresses. Existing systems operate by aggregating historical delivery…
- EMNLP 2022: In conversational AI agents, Query Rewriting (QR) plays a crucial role in reducing user frictions and satisfying their daily demands. User frictions are caused by various reasons, such as errors in the conversational AI system, users' accents, or their abridged language. In this work, we present a novel Constrained Generation Framework (CGF) for query rewriting at both global and personalized levels. It is…
- EMNLP 2022: Unexpected responses or repeated clarification questions from conversational agents detract from the users' experience with technology meant to streamline their daily tasks. To reduce these frictions, Query Rewriting (QR) techniques replace transcripts of faulty queries with alternatives that lead to responses that satisfy the users' needs. Despite their successes, existing QR approaches are limited in…
- EMNLP 2022 Workshop on Generation, Evaluation & Metrics (GEM): The main focus of data augmentation research has been on the enhancement of generation models, leaving the examination and improvement of synthetic-data evaluation methods less explored. In our work, we explore a number of sentence similarity measures in the context of data generation filtering, and evaluate their impact on the performance of the targeted Natural Language Understanding problem for the…
- EMNLP 2022 Workshop on Massively Multilingual NLU: Encoding both language-specific and language-agnostic information into a single high-dimensional space is a common practice of pre-trained Multilingual Language Models (pMLM). Such encoding has been shown to perform effectively on natural language tasks requiring semantics of the whole sentence (e.g., translation). However, its effectiveness appears to be limited on tasks requiring partial information…
Collaborations
Whether you're a faculty member or student, there are a number of ways you can engage with Amazon.