Customer-obsessed science


Research areas
-
July 22, 2025: Generating diverse synthetic prior distributions leads to a tabular foundation model that outperforms task-specific baselines.
Featured news
-
2025: Text-to-SQL simplifies database interactions by enabling non-experts to convert their natural language (NL) questions to Structured Query Language (SQL) queries. With advancements in Large Language Models (LLMs), in-context learning (ICL) has emerged as a popular choice for building Text-to-SQL systems (see the prompt sketch after this list). Real-world, industry-scale databases often comprise thousands of tables and hundreds of columns, and …
-
COLING 2025 Workshop on Evaluation of Multi-Modal Generation, 2025: Multimodal generative AI usually involves generating image or text responses given inputs in another modality. The evaluation of image-text relevancy is essential for measuring response quality or ranking candidate responses. In particular, binary relevancy evaluation, i.e., “Relevant” vs. “Not Relevant”, is a fundamental problem. However, this is a challenging task considering that texts have diverse formats …
-
2025: Visual artifacts are often introduced into streamed video content due to prevailing conditions during content production and delivery. Since these can degrade the quality of the user’s experience, it is important to automatically and accurately detect them in order to enable effective quality measurement and enhancement. Existing detection methods often focus on a single type of artifact and/or determine …
-
2025: Accurately classifying the relevance of Query-Product pairs is critical in online retail stores such as Amazon, as displaying irrelevant products can harm user experience and reduce engagement. Large Language Models (LLMs) excel at this task due to their broad knowledge and strong reasoning abilities. However, their high computational demands constrain their practical deployment in real-world applications …
-
QIP 2025, 2025: A central challenge in quantum simulation is to prepare low-energy states of strongly interacting many-body systems. In this work, we study the problem of preparing a quantum state that optimizes a random all-to-all, sparse or dense, spin or fermionic k-local Hamiltonian. We prove that a simplified quantum Gibbs sampling algorithm achieves an Ω(1/k)-fraction approximation of the optimum, giving an exponential …
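
The Text-to-SQL item above mentions in-context learning (ICL) as a common way to build such systems with LLMs. The sketch below illustrates, under stated assumptions, how a few-shot ICL prompt for Text-to-SQL might be assembled; the schema, the demonstration pair, and the build_prompt helper are hypothetical and not the system described in the paper.

# Illustrative sketch: assembling a few-shot, in-context-learning (ICL) prompt
# for Text-to-SQL. The schema, demonstration, and helper names below are
# assumptions for illustration only.

FEW_SHOT_EXAMPLES = [
    {
        "schema": "CREATE TABLE orders (order_id INT, customer_id INT, total DECIMAL);",
        "question": "How many orders are there?",
        "sql": "SELECT COUNT(*) FROM orders;",
    },
]

def build_prompt(schema: str, question: str) -> str:
    # Each demonstration pairs a schema and a natural-language question with
    # its SQL answer; the target schema and question come last, left for the
    # LLM to complete.
    parts = []
    for ex in FEW_SHOT_EXAMPLES:
        parts.append(
            f"Schema:\n{ex['schema']}\nQuestion: {ex['question']}\nSQL: {ex['sql']}\n"
        )
    parts.append(f"Schema:\n{schema}\nQuestion: {question}\nSQL:")
    return "\n".join(parts)

if __name__ == "__main__":
    target_schema = "CREATE TABLE customers (customer_id INT, name TEXT, city TEXT);"
    print(build_prompt(target_schema, "List the names of customers located in Seattle."))

The prompt text produced here would be sent to an LLM, whose completion after "SQL:" is taken as the candidate query; production systems additionally prune the schema, since industry-scale databases with thousands of tables cannot fit in a single prompt.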
Academia
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.