Customer-obsessed science
- December 8, 2025 (8 min read): New service lets customers mix their own data with the data used to train Amazon Nova at each major stage of model development, enabling deep domain understanding while preventing "catastrophic forgetting".
- December 5, 2025 (6 min read)
- November 20, 2025 (4 min read)
Featured publications
- arXiv, 2021: As cloud computing resources become more widely adopted, the infrastructures in which they are used naturally grow in scale and overall complexity, becoming harder to manage. Infrastructure-as-Code (IaC) is presented as a solution to this problem, allowing developers to manage and provision these cloud resources programmatically. The infrastructure is then maintained through a code base, allowing… (see the IaC sketch after this list)
- NeurIPS 2021 Workshop on Efficient Natural Language and Speech Processing, 2021: Neural language models (LMs) trained on diverse corpora are known to work well on previously seen entities; however, updating these models with dynamically changing entities such as place names, song titles, and shopping items requires retraining from scratch and collecting full sentences containing these entities. We aim to address this issue by introducing entity-aware language models (EALM), where we…
- IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021: In many real-world settings, machine learning models need to identify user inputs that are out-of-domain (OOD) so as to avoid performing wrong actions. This work focuses on a challenging case of OOD detection, where no labels for in-domain data are accessible (e.g., no intent labels for the intent classification task). To this end, we first evaluate different language-model-based approaches that predict… (see the LM-scoring sketch after this list)
- CVPR 2021: We consider the task of 3D pose estimation and tracking of multiple people seen in an arbitrary number of camera feeds. We propose TesseTrack, a novel top-down approach that simultaneously reasons about multiple individuals’ 3D body joint reconstructions and associations in space and time in a single end-to-end learnable framework. At the core of our approach is a novel spatio-temporal formulation that… (see the triangulation sketch after this list)
- ICNLP 2021: Fine-tuning self-supervised pre-trained language models such as BERT has significantly improved state-of-the-art performance on natural language processing tasks. Similar fine-tuning setups can also be used in commercial large-scale Spoken Language Understanding (SLU) systems to perform intent classification and slot tagging on user queries. Fine-tuning such powerful models for use in commercial systems requires… (see the fine-tuning sketch after this list)
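To make the IaC idea in the arXiv entry concrete, here is a minimal sketch using the AWS CDK v2 Python bindings; the stack and bucket names are illustrative, and the paper itself may use different tooling.

```python
# Minimal Infrastructure-as-Code sketch using the AWS CDK v2 Python
# bindings (pip install aws-cdk-lib constructs). Names are illustrative.
from aws_cdk import App, Stack, aws_s3 as s3
from constructs import Construct

class StorageStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # The resource is declared in ordinary code; deploying the stack
        # synthesizes a CloudFormation template and provisions the bucket.
        s3.Bucket(self, "ExampleBucket", versioned=True)

app = App()
StorageStack(app, "ExampleStack")
app.synth()  # writes the generated template to cdk.out/
```

Because the infrastructure lives in version control, changes to it can be reviewed, diffed, and rolled back like any other code.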
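The IEEE/ACM entry evaluates language-model-based OOD detectors; as one common signal from that family (not necessarily the paper's exact method), the sketch below scores an utterance by its mean token negative log-likelihood under a pretrained LM. The threshold is a hypothetical value that would be tuned on held-out in-domain text.

```python
# One common language-model-based OOD signal: mean token negative
# log-likelihood under a pretrained LM (higher = less in-domain-like).
# Requires `pip install torch transformers`; the threshold is hypothetical.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def mean_token_nll(text: str) -> float:
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # Passing labels=ids makes the model return the mean
        # cross-entropy of each token given its left context.
        loss = model(ids, labels=ids).loss
    return loss.item()

OOD_THRESHOLD = 6.0  # hypothetical; tuned on held-out in-domain text
print(mean_token_nll("play my running playlist") > OOD_THRESHOLD)
```

In practice the scoring LM is often first adapted to the in-domain corpus so that in-domain utterances score distinctly lower than OOD ones.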
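TesseTrack itself is an end-to-end learned framework; as classical background for multi-camera 3D pose work, the sketch below shows direct-linear-transform (DLT) triangulation of a single joint from calibrated views. This is standard geometry, not the paper's spatio-temporal formulation.

```python
# Classical DLT triangulation of a single body joint from N calibrated
# views; background for multi-camera 3D pose pipelines in general.
import numpy as np

def triangulate_joint(proj_mats, points_2d):
    """proj_mats: list of 3x4 camera projection matrices.
    points_2d: matching list of (x, y) detections of the same joint."""
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: x * (P[2] @ X) = P[0] @ X, etc.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]               # null-space solution
    return X[:3] / X[3]      # homogeneous -> Euclidean coordinates
```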
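For the ICNLP entry, here is a minimal fine-tuning sketch for intent classification with Hugging Face transformers; the model name, label set, and training examples are placeholders, not the paper's commercial SLU setup.

```python
# Minimal sketch of fine-tuning a BERT encoder for intent classification
# with Hugging Face transformers; labels and data are placeholders.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=3)  # e.g. play_music / get_weather / other

batch = tokenizer(["play some jazz", "will it rain today"],
                  return_tensors="pt", padding=True)
labels = torch.tensor([0, 1])  # hypothetical intent labels

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss = model(**batch, labels=labels).loss  # cross-entropy over intents
loss.backward()
optimizer.step()
```

Slot tagging follows the same pattern with a token-classification head in place of the sequence-classification head.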
Collaborations
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.