Customer-obsessed science
- May 10, 2024: Using large language models to discern commonsense relationships can improve performance on downstream tasks by as much as 60%.
- April 30, 2024: Using causal random forests and Bayesian structural time series to extrapolate from sparse data ensures that customers get the most useful information as soon as possible.
- April 16, 2024: First model to work across a wide range of products uses a second U-Net encoder to capture fine-grained product details.
- March 18, 2024: Tokenizing time series data and treating it like a language enables a model whose zero-shot performance matches or exceeds that of purpose-built models. Update: Amazon scientists have now released the training code for Chronos, which is available on GitHub.
- 2024: Referring Expression Generation (REG) is the task of generating a description that unambiguously identifies a given target in the scene. Different from Image Captioning (IC), REG requires learning fine-grained characteristics of not only the scene objects but also their surrounding context. Referring expressions are usually not singular; an object can often be uniquely referenced in numerous ways, for instance
- SIGIR 2024, KDD 2023 Workshop on Knowledge Augmented Methods for NLP, 2024: How can we enhance the node features acquired from Pretrained Models (PMs) to better suit downstream graph learning tasks? Graph Neural Networks (GNNs) have become the state-of-the-art approach for many high-impact, real-world graph applications. For feature-rich graphs, a prevalent practice involves directly utilizing a PM to generate features. Nevertheless, this practice is suboptimal as the node features
- 2024: Existing text-to-image generative models reflect or even amplify societal biases ingrained in their training data. This is especially concerning for human image generation, where models are biased against certain demographic groups. Existing attempts to rectify this issue are hindered by the inherent limitations of the pre-trained models and fail to substantially improve demographic diversity. In this work
News and features
- April 26, 2024: Awardees, who represent 51 universities in 15 countries, have access to Amazon public datasets, along with AWS AI/ML services and tools.
- April 09, 2024: How the team behind Echo Frames delivered longer battery life and improved sound quality inside the slim form factor of a pair of eyeglasses.
- March 21, 2024: The principal economist and his team address unique challenges using techniques at the intersection of microeconomics, statistics, and machine learning.