Customer-obsessed science
- November 20, 2025 (4 min read): A new evaluation pipeline called FiSCo uncovers hidden biases and offers an assessment framework that evolves alongside language models.
Featured news
- ICASSP 2023: End-to-end speech recognition models are improved by incorporating external text sources, typically by fusion with an external language model. Such language models have to be retrained whenever the corpus of interest changes. Furthermore, since they store the entire corpus in their parameters, rare words can be challenging to recall. In this work, we propose augmenting a transducer-based ASR model with …
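The teaser stops mid-sentence, but the baseline it contrasts with, fusion with an external language model, is commonly shallow fusion: the ASR score and the LM score are interpolated in log space when ranking hypotheses. A minimal sketch, with the weight and all names purely illustrative (not from the paper):

```python
def shallow_fusion(asr_logprob, lm_logprob, lam=0.3):
    """Standard shallow fusion: add the external LM's log-probability,
    scaled by lam, to the ASR model's log-probability."""
    return asr_logprob + lam * lm_logprob

def rescore(hypotheses, lam=0.3):
    """Pick the hypothesis with the best fused score.
    Each entry is (text, asr_logprob, lm_logprob)."""
    return max(hypotheses, key=lambda h: shallow_fusion(h[1], h[2], lam))[0]

# Toy beam: two acoustically similar hypotheses; the LM strongly
# prefers the first, so fusion flips the ranking.
beams = [
    ("turn on the lights", -1.2, -0.8),
    ("turn on the fights", -1.1, -4.0),
]
```

Retraining pain arises because the LM term here must be re-estimated whenever the target corpus changes, which is exactly the cost the paper sets out to avoid.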
- ICASSP 2023: The deployment of Federated Learning (FL) systems poses various challenges, such as data heterogeneity and communication efficiency. We focus on a practical FL setup that has recently drawn attention, in which the data distribution on each device is not static but dynamically evolves over time. This setup, referred to as Continual Federated Learning (CFL), suffers from catastrophic forgetting, i.e., the undesired …
- ICASSP 2023: This paper describes Distill-Quantize-Tune (DQT), a pipeline for creating viable small-footprint multilingual models that can perform NLU directly on extremely resource-constrained edge devices. We distill semantic knowledge from a large transformer-based teacher, trained on a huge amount of public and private data, into our Bi-LSTM-based edge candidate (student) model, and further compress …
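The DQT recipe itself is not spelled out in this teaser, but the distillation step it names is typically a temperature-scaled soft-target loss. A minimal sketch, assuming a standard Hinton-style formulation; function names are illustrative, not the paper's:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened distribution and the
    student's, scaled by T^2 so gradients stay comparable across temperatures."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    ce = -sum(p * math.log(q) for p, q in zip(p_teacher, p_student))
    return ce * temperature ** 2
```

A higher temperature exposes more of the teacher's "dark knowledge" (the relative probabilities of wrong classes), which is what lets a small Bi-LSTM student absorb behavior from a much larger transformer teacher.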
- ICASSP 2023: Human Activity Recognition (HAR) is widely applied on wearable devices in our daily lives. However, acquiring high-quality wearable-sensor data sets with ground truth is challenging, due to the high cost of collecting data and the need for domain experts. To achieve generalization from limited data, we study augmentation-based Self-Supervised Learning (SSL) for data from wearable devices. However, …
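The teaser does not list which augmentations are studied, but for wearable-sensor SSL two standard choices are per-sample jitter and per-window scaling, used to produce two "views" of the same window for a contrastive objective. A small sketch under that assumption, with illustrative names:

```python
import random

def jitter(signal, sigma=0.05, rng=None):
    """Add Gaussian noise to each sample: a common sensor-data augmentation."""
    rng = rng or random.Random(0)
    return [x + rng.gauss(0.0, sigma) for x in signal]

def scale(signal, sigma=0.1, rng=None):
    """Multiply the whole window by a single random factor,
    mimicking variation in sensor gain or placement."""
    rng = rng or random.Random(0)
    factor = rng.gauss(1.0, sigma)
    return [x * factor for x in signal]

def two_views(signal, rng=None):
    """Two randomly augmented views of one accelerometer window, the usual
    input to SimCLR-style contrastive SSL losses."""
    rng = rng or random.Random(0)
    return jitter(signal, rng=rng), scale(signal, rng=rng)
```

The SSL model is then trained to map both views close together in embedding space, so no activity labels are needed at this stage.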
- ICASSP 2023: In this work, we present Slimmable Neural Networks applied to the problem of small-footprint keyword spotting. We show that slimmable neural networks allow us to create super-nets from Convolutional Neural Networks and Transformers, from which sub-networks of different sizes can be extracted. We demonstrate the usefulness of these models on in-house voice-assistant data and Google Speech Commands, and focus …
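The core trick in slimmable networks is that one trained super-net serves every width: a sub-network is just a prefix slice of each layer's units. A toy illustration in pure Python; the function name and the floor-at-one-unit choice are mine, not the paper's:

```python
def slim_layer(weight, width_mult):
    """Keep the first width_mult fraction of a super-net layer's output
    units; slimmable training makes every such prefix a working network."""
    n_out = max(1, int(len(weight) * width_mult))
    return weight[:n_out]

# Toy fully connected layer from the super-net: 4 output units, 3 inputs.
full = [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
half = slim_layer(full, 0.5)  # a 2-output sub-network, no retraining
```

At deployment time the device picks the width multiplier matching its compute budget, which is what makes the approach attractive for small-footprint keyword spotting.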
Collaborations
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.