Customer-obsessed science
- December 8, 2025: New service lets customers mix their own data with the data used to train Amazon Nova at each major stage of model development, enabling deep domain understanding while preventing "catastrophic forgetting".
Featured news
- Physical Review Letters, 2021: We study the performance of classical and quantum machine learning (ML) models in predicting outcomes of physical experiments. The experiments depend on an input parameter x and involve execution of a (possibly unknown) quantum process E. Our figure of merit is the number of runs of E required to achieve a desired prediction performance. We consider classical ML models that perform a measurement and record …
- ICML 2021 Workshop on Uncertainty and Robustness in Deep Learning; NeurIPS 2021: Automatically detecting anomalies in event data can provide substantial value in domains such as healthcare, DevOps, and information security. In this paper, we frame the problem of detecting anomalous continuous-time event sequences as out-of-distribution (OoD) detection for temporal point processes (TPPs). We show how this problem can be approached using tools from the goodness-of-fit (GoF) testing literature …
- Physical Review A, 2021: The promise of quantum computing with imperfect qubits relies on the ability of a quantum computing system to scale cheaply through error correction and fault tolerance. While fault tolerance requires relatively mild assumptions about the nature of qubit errors, the overhead associated with coherent and non-Markovian errors can be orders of magnitude larger than the overhead associated with purely stochastic …
- NeurIPS 2021: For supervised learning with tabular data, decision tree ensembles produced via boosting techniques generally dominate real-world applications involving i.i.d. training/test sets. However, for graph data, where the i.i.d. assumption is violated due to structured relations between samples, it remains unclear how to best incorporate this structure within existing boosting pipelines. To this end, we propose a generalized …
- NeurIPS 2021 Workshop on Efficient Natural Language and Speech Processing, 2021: Training neural machine translation (NMT) models in federated learning (FL) settings can be inefficient both computationally and in terms of communication, due to the large size of translation engines as well as the multiple rounds of updates required to train clients and a central server. In this paper, we explore how to efficiently build NMT models in an FL setup by proposing a novel solution. In order to …
Collaborations
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.