Customer-obsessed science


August 11, 2025: Trained on millions of hours of data from Amazon fulfillment centers and sortation centers, Amazon’s new DeepFleet models predict future traffic patterns for fleets of mobile robots.
Featured news
- AAAI 2025 Workshop on AI for Time Series Analysis: Time series forecasting has long been a focus of research across diverse fields, including economics, energy, healthcare, and traffic management. Recent works have introduced innovative architectures for time series models, such as the Time-Series Mixer (TSMixer), which leverages multilayer perceptrons (MLPs) to enhance prediction accuracy by effectively capturing both spatial and temporal dependencies.
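The mixing idea behind TSMixer can be sketched in a few lines: alternately apply an MLP across time steps and an MLP across feature channels, each with a residual connection. This is a minimal NumPy sketch under assumed shapes and random weights; the real architecture adds normalization, dropout, and trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, w2):
    # Two-layer MLP with ReLU, applied along the last axis.
    return np.maximum(x @ w1, 0.0) @ w2

def tsmixer_block(x, w_time, w_feat):
    """One TSMixer-style block: mix along time, then along features.

    x: (seq_len, n_features) window of a multivariate series.
    """
    # Time mixing: transpose so the MLP acts across time steps.
    x = x + mlp(x.T, *w_time).T
    # Feature mixing: the MLP acts across channels at each time step.
    x = x + mlp(x, *w_feat)
    return x

seq_len, n_feat, hidden = 16, 4, 32  # hypothetical sizes for illustration
w_time = (rng.normal(0, 0.1, (seq_len, hidden)), rng.normal(0, 0.1, (hidden, seq_len)))
w_feat = (rng.normal(0, 0.1, (n_feat, hidden)), rng.normal(0, 0.1, (hidden, n_feat)))

x = rng.normal(size=(seq_len, n_feat))
y = tsmixer_block(x, w_time, w_feat)
print(y.shape)  # (16, 4): output keeps the input window shape
```

Stacking several such blocks and ending with a linear projection over the time axis yields the forecaster; the residual connections let each block refine rather than replace the signal.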
- IEEE Symposium on Security and Privacy 2025: We propose plausible post-quantum (PQ) oblivious pseudorandom functions (OPRFs) based on the power-residue PRF (Damgård, CRYPTO ’88), a generalization of the Legendre PRF. For security parameter λ, we consider the PRF Gold_k(x) that maps an integer x modulo a public prime p = 2^λ·g + 1 to the element (k + x)^g mod p, where g is public and log g ≈ 2λ. At the core of our constructions are efficient novel methods
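The PRF described above is simple to state concretely. The sketch below uses toy parameters chosen only for illustration (λ = 4, g = 7, so p = 2^λ·g + 1 = 113 is prime); the paper's parameters are vastly larger (log g ≈ 2λ for a cryptographic λ), and nothing here is a secure implementation.

```python
# Toy parameters, NOT cryptographic: λ = 4, g = 7, p = 2**4 * 7 + 1 = 113 (prime).
LAMBDA = 4
G = 7
P = (1 << LAMBDA) * G + 1  # 113

def power_residue_prf(k: int, x: int) -> int:
    """Map x (mod p) to (k + x)^g mod p, keyed by k."""
    return pow((k + x) % P, G, P)

key = 42  # hypothetical secret key
outputs = [power_residue_prf(key, x) for x in range(5)]
# For nonzero k + x, outputs lie in the subgroup of g-th powers,
# which here has order (p - 1) / g = 2**LAMBDA = 16.
print(outputs)
```

Because the image of x ↦ x^g in the multiplicative group mod p has only (p−1)/g elements, the function compresses its input into a small subgroup, which is what makes the power-residue construction a generalization of the ±1-valued Legendre PRF.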
- CVPR 2025 Workshop on Efficient Large Vision Models: Diffusion models enable high-quality virtual try-on (VTO) with their established image synthesis abilities. Despite the extensive end-to-end training of large pre-trained models involved in current VTO methods, real-world applications often prioritize limited training and inference/serving/deployment budgets for VTO. To address this obstacle, we apply Doob’s h-transform efficient fine-tuning (DEFT) for
- EMNLP 2024 Workshop on Customizable NLP; Transactions on Machine Learning Research (2025): Precise estimation of downstream performance in large language models (LLMs) prior to training is essential for guiding their development process. Scaling laws analysis utilizes the statistics of a series of significantly smaller sampling language models (LMs) to predict the performance of the target LLM. For downstream performance prediction, the critical challenge lies in the emergent abilities in LLMs
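The basic scaling-laws workflow the abstract refers to can be sketched on synthetic data: fit a power law L(N) = a·N^(−b) to losses measured on a series of small models, then extrapolate to the target size. All numbers below are hypothetical; as the abstract notes, emergent abilities can make such smooth extrapolation unreliable for downstream tasks.

```python
import numpy as np

# Hypothetical losses from a series of small sampling LMs, generated
# from an assumed pure power law L(N) = a * N**(-b) for illustration.
n_params = np.array([1e6, 3e6, 1e7, 3e7, 1e8])  # parameter counts
a_true, b_true = 50.0, 0.2
losses = a_true * n_params ** (-b_true)

# Fit in log space: log L = log a - b * log N is linear in log N.
slope, intercept = np.polyfit(np.log(n_params), np.log(losses), 1)
a_fit, b_fit = np.exp(intercept), -slope

# Extrapolate the fitted law to a much larger target model.
target = 1e11
predicted = a_fit * target ** (-b_fit)
print(b_fit, predicted)
```

On noiseless synthetic data the fit recovers the generating exponent exactly; with real measurements one would add noise handling, an irreducible-loss offset term, and uncertainty estimates before trusting the extrapolation.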
- Amazon Technical Reports (2025): We present Amazon Nova Premier, our most capable multimodal foundation model and teacher for model distillation. Nova Premier processes text, images, and videos with a one-million-token context window, enabling analysis of large codebases, long documents, and long videos in a single prompt. It also enables customers to use Amazon Bedrock to create customized variants of Amazon Nova Pro, Nova Lite, and Nova
Academia
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.