Customer-obsessed science
Research areas
- September 26, 2025 · 9 min read — To transform scientific domains, foundation models will require physical-constraint satisfaction, uncertainty quantification, and specialized forecasting techniques that overcome data scarcity while maintaining scientific rigor. (An illustrative sketch of the constraint-satisfaction idea follows this list.)
- September 2, 2025 · 3 min read
- August 21, 2025 · 7 min read
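The first item above mentions physical-constraint satisfaction for scientific foundation models. As a rough, generic illustration of one common way to encourage this (not necessarily the approach discussed in the article), a forecasting loss can be augmented with a soft penalty on the violation of a known conservation law. The model-free setup and the `conservation_residual` constraint below are hypothetical placeholders.

```python
import numpy as np

def conservation_residual(forecast):
    # Hypothetical physical constraint: the total "mass" summed over spatial
    # cells should stay constant across forecast steps.
    totals = forecast.sum(axis=-1)      # total at each forecast step
    return totals - totals[..., :1]     # deviation from the initial total

def constrained_loss(forecast, target, lam=0.1):
    # Standard data-fit term (mean squared error) ...
    mse = np.mean((forecast - target) ** 2)
    # ... plus a soft penalty that discourages physically inconsistent forecasts.
    physics = np.mean(conservation_residual(forecast) ** 2)
    return mse + lam * physics

# Toy usage: a 3-step forecast over 4 spatial cells.
rng = np.random.default_rng(0)
forecast = rng.random((3, 4))
target = rng.random((3, 4))
print(constrained_loss(forecast, target))
```

The weight `lam` trades off data fit against constraint satisfaction; hard-constraint architectures are an alternative when violations cannot be tolerated at all.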
Featured news
- NeurIPS 2023 — The main challenge of offline reinforcement learning, where data is limited, arises from a sequence of counterfactual reasoning dilemmas within the realm of potential actions: What if we were to choose a different course of action? These circumstances frequently give rise to extrapolation errors, which tend to accumulate exponentially with the problem horizon. Hence, it becomes crucial to acknowledge that … (A toy illustration of compounding extrapolation error follows this list.)
- NeurIPS 2023 Workshop on I Can’t Believe It’s Not Better (ICBINB): Failure Modes in the Age of Foundation Models — Numerous Natural Language Processing (NLP) tasks require precisely labeled data to ensure effective model training and achieve optimal performance. However, data annotation is marked by substantial costs and time requirements, especially when requiring specialized domain expertise or annotating a large number of samples. In this study, we investigate the feasibility of employing large language models (LLMs) … (A minimal annotation sketch follows this list.)
- NeurIPS 2023 Workshop on Federated Learning in the Age of Foundation Models — Edge device participation in federated learning (FL) has been typically studied under the lens of device-server communication (e.g., device dropout) and assumes an undying desire from edge devices to participate in FL. As a result, current FL frameworks are flawed when implemented in real-world settings, with many encountering the free-rider problem. In a step to push FL towards realistic settings, we … (A minimal free-rider sketch follows this list.)
- NeurIPS 2023 Workshop on Distribution Shifts (DistShifts) — Recent work using pretrained transformers has shown impressive performance when fine-tuned with data from the downstream problem of interest. However, these models struggle to retain that performance when the data characteristics change. In this paper, we focus on continual learning, where a pre-trained transformer is updated to perform well on new data while retaining its performance on data it was previously … (A minimal replay sketch follows this list.)
- WACV 2024 — Developing a client-side segmentation algorithm for online sports streaming holds significant importance. For instance, in order to assess video quality from an end-user perspective, such as artifact detection, it is important to first segment the content within the streaming playback. The challenge lies in localizing the content due to the intricate scene changes between content and non-content …
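The offline-RL abstract above refers to extrapolation errors that compound with the horizon. The toy below (unrelated to the paper's actual method) shows one well-known mechanism: when a Bellman backup takes a max over noisily extrapolated out-of-distribution action values, it picks up a positive bias at every step, and that bias accumulates as backups are chained along the horizon.

```python
import numpy as np

# Toy chain MDP: all rewards are 0, so the true value of every action is 0.
# The offline dataset covers only one action per state, so the other actions'
# values must be extrapolated; we model extrapolation as zero-mean noise.
rng = np.random.default_rng(0)
horizon, n_actions, noise, gamma = 50, 10, 0.1, 0.99

bias = 0.0  # overestimation bias carried backward through the backups
for step in range(horizon):
    # Estimated next-state action values: true value (0) plus the bias already
    # accumulated, plus fresh extrapolation noise for each candidate action.
    q_next = bias + rng.normal(0.0, noise, size=n_actions)
    # A max-backup propagates the largest (most optimistic) error.
    bias = gamma * q_next.max()
    if step % 10 == 0:
        print(f"backup {step:2d}: overestimation bias = {bias:.3f}")
```

The printed bias keeps growing with the number of backups even though every individual extrapolation error is zero-mean, which is the core difficulty the abstract alludes to.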
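The ICBINB abstract above studies LLMs as data annotators. A minimal sketch of that workflow, under the assumption of a hypothetical `query_llm` client (replace it with a real model call), labels unlabeled text and spot-checks agreement against a small hand-labeled audit set before the labels are trusted for training.

```python
from collections import Counter

def query_llm(prompt: str) -> str:
    # Hypothetical stand-in for a hosted-LLM call; it returns a fixed answer
    # here only so the script runs end to end.
    return "positive"

def annotate(texts, labels=("positive", "negative")):
    annotations = []
    for text in texts:
        prompt = (f"Classify the sentiment of the following review as "
                  f"{' or '.join(labels)}.\nReview: {text}\nLabel:")
        answer = query_llm(prompt).strip().lower()
        annotations.append(answer if answer in labels else None)
    return annotations

# Spot-check LLM labels against a small gold set to estimate annotation quality.
audit_texts = ["Great battery life", "Arrived broken"]
audit_gold = ["positive", "negative"]
pred = annotate(audit_texts)
agreement = sum(p == g for p, g in zip(pred, audit_gold)) / len(audit_gold)
print(Counter(pred), f"agreement with gold: {agreement:.0%}")
```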
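The federated-learning abstract above raises the free-rider problem. A minimal NumPy FedAvg sketch (a generic setup, not the paper's framework) shows why: a client that skips local training and simply echoes the global model back is averaged in exactly like honest clients, so plain aggregation cannot tell it apart.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(global_weights, client_data, lr=0.1):
    # Honest client: one gradient step of least-squares regression on its data.
    X, y = client_data
    grad = X.T @ (X @ global_weights - y) / len(y)
    return global_weights - lr * grad

def free_rider(global_weights, _client_data):
    # Free-rider: contributes no computation or data, just echoes the model.
    return global_weights.copy()

# Synthetic clients whose data share one true linear model.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(32, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=32)
    clients.append((X, y))

global_w = np.zeros(2)
for rnd in range(50):
    updates = [local_update(global_w, c) for c in clients[:-1]]
    updates.append(free_rider(global_w, clients[-1]))   # last client free-rides
    global_w = np.mean(updates, axis=0)                 # plain FedAvg averages it in
print("learned weights:", global_w.round(2), "target:", true_w)
```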
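The DistShifts abstract above concerns updating a pretrained transformer on new data without forgetting old data. One generic mitigation (not necessarily the paper's method) is experience replay: mix a small buffer of earlier-task examples into every new-task batch. The sketch below builds such batches; the strings stand in for tokenized examples.

```python
import random

def replay_batches(new_data, replay_buffer, batch_size=8, replay_fraction=0.25):
    """Yield batches that mix new-task samples with a few examples replayed
    from earlier tasks, a common way to limit catastrophic forgetting."""
    n_replay = int(batch_size * replay_fraction)
    random.shuffle(new_data)
    for i in range(0, len(new_data), batch_size - n_replay):
        batch = new_data[i:i + batch_size - n_replay]
        if replay_buffer:
            batch = batch + random.sample(replay_buffer,
                                          min(n_replay, len(replay_buffer)))
        yield batch

# Toy usage with placeholder examples.
old_task = [f"old-{i}" for i in range(100)]
new_task = [f"new-{i}" for i in range(40)]
buffer = random.sample(old_task, 16)        # small memory of the old task
for batch in replay_batches(new_task, buffer):
    pass  # fine-tune the pretrained model on `batch` here
print("last batch:", batch)
```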
Collaborations
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.
View all