Amazon today announced that a team from the University of Michigan has won the Alexa Prize SimBot Challenge. The SimBot Challenge's goal is to advance the development of next-generation virtual assistants that help humans complete real-world tasks by continuously learning.
Teams competing in the interactive university challenge developed virtual robots, set within an engaging puzzle game, that customers could invoke with a prompt.
The University of Michigan’s SEAGULL team of nine students, advised by Professor Joyce Chai, earned $500,000 for its first-place performance. The team’s work, along with that of the other participants, is now captured in a series of research papers.
SEAGULL was recognized for its SimBot’s excellence in providing an engaging experience, responding to user requests with relevant and appropriate answers, and effectively performing the requested tasks to complete the missions. Judges found that SEAGULL’s SimBot completed tasks with relative ease, understood complex commands, and provided excellent guidance and suggestions.
“Winning the SimBot Challenge is a testament to our team's unwavering dedication and perseverance,” said SEAGULL team leader Yichi Zhang, a PhD student in computer science and engineering. “Each team member contributed their expertise to develop different components of the system, ensuring our bot is truly functional. Seeing all these components seamlessly come together and perform well is incredibly rewarding.”
Amazon provided SimBot Challenge participants with training data, software tools, machine learning models, and Alexa Arena, a Unity-based 3-D embodied-AI simulator. The teams used these resources to build, launch, and experiment with new AI ideas online, improving their research throughout the competition.
In their paper, the team noted they set out to create an “interactive embodied agent … which can complete complex tasks in the Arena simulation environment through dialog with users.” To achieve this, the team relied on “a modular system that combines neural and symbolic components”: a “natural language understanding module [that] employs a hierarchical pipeline to convert user utterances into logical symbolic representations of their intentions and semantics” and “a neural vision module [that] detects object classes, states, and spatial relations.” The team also “developed tools and pipelines to augment our vision and language data, continually enhancing our system’s robustness and performance.”
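The paper describes the architecture only at that level of detail. As a rough illustration of how such a modular neuro-symbolic loop can be wired together, here is a minimal Python sketch: a hierarchical language-understanding step produces a symbolic intent, a vision step produces object detections, and a symbolic executor grounds one against the other. All class and function names here (HierarchicalNLU, VisionModule, SymbolicExecutor, agent_step) are hypothetical placeholders, not code from SEAGULL’s system.

```python
# Hypothetical sketch of a modular neuro-symbolic agent loop, loosely
# following the kind of architecture described above: NLU -> symbolic intent,
# vision -> object detections, symbolic executor -> action or dialog fallback.
from dataclasses import dataclass, field


@dataclass
class SymbolicIntent:
    """Logical form produced by a hierarchical NLU pipeline."""
    action: str                                      # e.g. "pick_up", "goto"
    target: str                                      # object class the user named
    constraints: dict = field(default_factory=dict)  # e.g. {"color": "red"}


@dataclass
class Detection:
    """One object hypothesis from a neural vision module."""
    object_class: str
    state: str       # e.g. "open", "closed", "on", "off"
    relations: dict  # e.g. {"on_top_of": "counter"}


class HierarchicalNLU:
    """Stage 1 classifies the high-level intent; stage 2 fills in arguments."""

    def parse(self, utterance: str) -> SymbolicIntent:
        # A real system would use trained parsers here; this keyword rule
        # exists only to show the interface.
        if "pick up" in utterance:
            return SymbolicIntent(action="pick_up", target=utterance.split()[-1])
        return SymbolicIntent(action="unknown", target="")


class VisionModule:
    """Neural detector for object classes, states, and spatial relations."""

    def detect(self, frame) -> list[Detection]:
        # Placeholder: a real module would run an object detector on `frame`.
        return [Detection("mug", state="empty", relations={"on_top_of": "table"})]


class SymbolicExecutor:
    """Grounds the symbolic intent against detections and issues an action."""

    def execute(self, intent: SymbolicIntent, detections: list[Detection]) -> str:
        matches = [d for d in detections if d.object_class == intent.target]
        if not matches:
            # Dialog fallback: ask for clarification instead of acting blindly.
            return f"I don't see a {intent.target}. Could you point me to it?"
        return f"Executing {intent.action} on {intent.target}."


def agent_step(utterance: str, frame) -> str:
    """One perceive-understand-act cycle of the hypothetical agent."""
    intent = HierarchicalNLU().parse(utterance)
    detections = VisionModule().detect(frame)
    return SymbolicExecutor().execute(intent, detections)


print(agent_step("please pick up the mug", frame=None))
```

The appeal of this kind of split is that the neural pieces (parsing, detection) can be retrained or data-augmented independently, while the symbolic executor keeps task logic and dialog fallbacks inspectable.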
“Alexa Prize teams, like SEAGULL, are helping to solve long-lasting challenges in robotics, human-AI interaction, and conversational embodied AI,” said Reza Ghanadan, a senior principal scientist in Alexa AI and head of Alexa Prize. “One significance of this research is that it may potentially lead to the development of new mechanisms to create more robust AI models that are inherently grounded in the real world, operate reliably in the environment, and can collaborate safely with humans to complete complex tasks.”
“This challenge taught us that generalization is really the key to the next generation of embodied AI,” said SEAGULL team co-lead Jianing (Jed) Yang, who is also a PhD student in computer science and engineering at Michigan. “One can add a lot of heuristics very quickly to achieve a perfect score on an existing task, but being able to generalize to unseen tasks and environments is the truly difficult part.”
Five university teams were selected to participate in the final live-interactions phase of the Alexa Prize SimBot Challenge, which took place this past spring. Teams from the University of California (UC) Santa Barbara and UC Santa Cruz were awarded $100,000 for second place and $50,000 for third place, respectively.
“To develop the next generation of embodied robot assistants, it is crucial to prioritize user centricity and proactivity in human-robot interactions,” said Jiachen Li, a first-year PhD student and team leader of UC Santa Barbara’s GauchoAI. “We learned that this means our robots should go beyond simply following human instructions and also possess the ability to anticipate user intent during these interactions.”
The GauchoAI team was advised by Xifeng Yan, Narayanamurti Professor of Computer Science at UC Santa Barbara. Xin (Eric) Wang, assistant professor of computer science and engineering, advised UC Santa Cruz’s SlugJARVIS team.
“The skills and technologies developed for the SimBot Challenge have real-world applications,” said Jing Gu, a first-year PhD student at UC Santa Cruz and leader of SlugJARVIS. “Our team's success could lead to opportunities in the fields of home automation, robotics, and AI. Participating in the challenge and reaching the finals is a valuable learning experience.”