Customer-obsessed science
Research areas
- December 5, 2025 · 6 min read: A multiagent architecture separates data perception, tool knowledge, execution history, and code generation, enabling ML automation that works with messy, real-world inputs.
Featured news
- KDD 2021 Workshop on Fragile Earth (2021): Since 2015, Amazon has reduced the weight of its outbound packaging by 36%, eliminating over 1,000,000 tons of packaging material worldwide, or the equivalent of over 2 billion shipping boxes, thereby reducing carbon footprint throughout its fulfillment supply chain. In this position paper, we share insights on using deep learning to identify the optimal packaging type best suited to ship each item in a…
- Interspeech 2021 (2021): Wav2vec-C introduces a novel representation learning technique combining elements from wav2vec 2.0 and VQ-VAE. Our model learns to reproduce quantized representations from partially masked speech encoding using a contrastive loss in a way similar to wav2vec 2.0. However, the quantization process is regularized by an additional consistency network that learns to reconstruct the input features to the wav2vec…
- ICML 2020 Workshop on AutoML (2021): While training highly overparameterized neural networks is common practice in deep learning, research into post-hoc weight pruning suggests that more than 90% of parameters can be removed without loss in predictive performance. To save resources, zero-shot and one-shot pruning attempt to find such a sparse representation at initialization or at an early stage of training. Though efficient, there is no justification…
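The one-shot magnitude pruning described in the abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's method; the function name and the 90% sparsity level are assumptions chosen to match the "more than 90% of parameters" claim:

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """One-shot pruning: zero out the smallest-magnitude fraction of weights."""
    # Threshold at the requested quantile of absolute weight values.
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) > threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(100, 100))        # stand-in for a trained weight matrix
pruned, mask = magnitude_prune(w, sparsity=0.9)
print(mask.mean())                     # fraction of weights kept, ~0.1
```

In practice the surviving mask is applied after (post-hoc), early in, or even before training, which is exactly the design axis the workshop paper examines.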
- ICML 2021 Time Series Workshop, ICLR 2022 (2021): Realistic synthetic time series data of sufficient length enables practical applications in time series modeling tasks, such as forecasting, but remains a challenge. In this paper we present PSA-GAN, a generative adversarial network (GAN) that generates long time series samples of high quality using progressive growing of GANs and self-attention. We show that PSA-GAN can be used to reduce the error in two…
-
Entropy Journal2021Variational inference is a powerful framework, used to approximate intractable posteriors through variational distributions. The de facto standard is to rely on Gaussian variational families, which come with numerous advantages: they are easy to sample from, simple to parametrize, and many expectations are known in closed-form or readily computed by quadrature. In this paper, we view the Gaussian variational
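As a rough illustration of fitting a Gaussian variational family, here is a minimal reparameterization-gradient sketch that fits q(z) = N(mu, sigma²) to a standard-normal target. All names, the learning rate, and the step count are assumptions for the demo, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_target(z):
    # d/dz of the (unnormalized) log density of a standard normal target.
    return -z

# Variational parameters of q(z) = N(mu, sigma^2), with sigma = exp(log_sigma).
mu, log_sigma = 1.5, 0.5
lr = 0.05
for _ in range(2000):
    eps = rng.normal(size=64)
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps                            # reparameterization trick
    dlogp = grad_log_target(z)
    # ELBO gradients: chain rule through z, plus entropy term (d/dlog_sigma = 1).
    grad_mu = dlogp.mean()
    grad_log_sigma = (dlogp * eps * sigma).mean() + 1.0
    mu += lr * grad_mu
    log_sigma += lr * grad_log_sigma

print(mu, np.exp(log_sigma))  # should approach mu ~ 0, sigma ~ 1
```

Being easy to sample from and to reparametrize is precisely the Gaussian-family advantage the abstract highlights: the gradient of the ELBO can be estimated from a handful of samples per step.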
Collaborations
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.