Customer-obsessed science
Research areas
- January 13, 2026 · 7 min read
  Leveraging existing environment simulators and reward functions based on verifiable ground truth boosts task success rate, even with small models and small training datasets.
Featured news
- 2025
  Video summarization aims to generate a condensed textual version of an original video. Summaries may consist of either plain text or a shortlist of salient events, possibly including temporal or spatial references. Video Large Language Models (VLLMs) exhibit impressive zero-shot capabilities in video analysis. However, their performance varies significantly according to the LLM prompt, the characteristics…
- ESREL SRA-E 2025
  The rapid rise of generative AI (GenAI) has sparked the sustainability community to explore its potential applications, such as climate impact modeling and renewable energy optimization. However, deploying these GenAI-powered solutions in enterprise environments raises risk concerns. In particular, chatbots and similar GenAI applications face risks of misinformation and disinformation stemming from knowledge…
- KDD 2025 Workshop on AI for Supply Chain
  Effective attribution of causes to outcomes is crucial for optimizing complex supply chain operations. Traditional methods, often relying on waterfall logic or correlational analysis, frequently fall short in identifying the true drivers of performance issues. This paper proposes a comprehensive framework leveraging data-driven causal discovery to construct and validate Structural Causal Models (SCMs).
- The Web Conf 2025 Workshop on Resource-Efficient Learning for the Web
  Web search engines process billions of queries daily, making the balance between computational efficiency and ranking quality crucial. While neural ranking models have shown impressive performance, their computational costs, particularly in feature extraction, pose significant challenges for large-scale deployment. This paper investigates how different configurations of feature selection and document filtering…
- NeuS 2025
  The “state” of State Space Models (SSMs) represents their memory, which fades exponentially over an unbounded span. By contrast, Attention-based models have “eidetic” (i.e., verbatim, or photographic) memory over a finite span (context size). Hybrid architectures combine State Space layers with Attention, but still cannot recall the distant past and can access only the most recent tokens eidetically. Unlike…
Collaborations
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.
View all