NASA's Orion spacecraft shown splashing down in the Pacific Ocean, west of Baja California, at 9:40 a.m. PST Sunday, Dec. 11.
Credit: NASA

The story behind how Amazon integrated Alexa into NASA’s Orion spacecraft

From physical constraints to acoustic challenges, learn how Amazon collaborated with NASA and Lockheed Martin to get Alexa to work in space.

In September 2018, Amazon’s principal solutions architect Philippe Lantin received a call from his manager.

“He said that there was something unique on the horizon, and that their team was being roped into a once-in-a-lifetime opportunity,” says Lantin.

This was no understatement: on the horizon was an opportunity for Amazon to collaborate with Lockheed Martin Space and integrate Alexa into NASA’s Orion spacecraft. Orion is the first human-rated spacecraft to travel to the moon in 50 years.

“NASA is trying to engage the public more as we enter this new era of space travel, where we are setting the stage for extra-planetary exploration,” says Lantin. “Given that over 100 million Alexa-enabled devices have already been sold, having Alexa answer questions like 'Alexa, how far to the moon?' and 'Alexa, how fast is Orion going?' is a great way to get people around the world involved in NASA’s missions.”

Setting up an Echo device on Earth is simple: all you need is a Wi-Fi connection and the Alexa app. However, things are far more complicated in space.

“We had several constraints we had to contend with,” says Lantin.
The Alexa team had to operate within a key physical constraint: the shape of the device. The contours of a smart speaker greatly influence its acoustics. To give just one example, the round shape of the Echo Dot offers a full cavity behind the woofer for a better bass response.


However, when it came to NASA’s Orion spacecraft, Alexa’s acoustic engineers had to work with what was provided by Lockheed Martin and NASA.

“We were somewhat limited by the form factor, which was a small briefcase-like enclosure that was 1.5 feet by one foot and about five inches in depth,” says Lantin.

There were other physical constraints. Equipment developed for the mission had to be resilient to extreme shocks and vibrations, be at least minimally resistant to radiation emissions in space, and utilize highly specific and custom-built components such as power and data cables.

Limited Internet connectivity

The team also had to deal with issues related to the lack of Internet connectivity. Typically, Echo devices use on-device keyword spotting to detect when a customer says the wake word; the audio leading up to that moment is held in an on-device buffer that exists only in temporary memory. After the wake word is detected, the device streams the audio to the cloud for speech recognition and natural language processing.
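
To make the handoff concrete, here is a minimal Python sketch (not Alexa's actual implementation) of how a rolling buffer and a wake-word gate can work together: audio stays in temporary memory on the device, and nothing is streamed until the detector fires. The energy-threshold detector is a stand-in for the real keyword spotter.

```python
# Minimal sketch of on-device wake-word gating with a rolling audio buffer.
# The detector below is a placeholder (energy threshold), not Alexa's keyword spotter.
from collections import deque

SAMPLE_RATE = 16_000          # samples per second
BUFFER_SECONDS = 2            # how much recent audio the device keeps in memory

ring_buffer = deque(maxlen=SAMPLE_RATE * BUFFER_SECONDS)

def wake_word_detected(recent_audio):
    """Placeholder detector: real devices typically run a small neural keyword spotter."""
    if not recent_audio:
        return False
    energy = sum(s * s for s in recent_audio) / len(recent_audio)
    return energy > 0.1       # hypothetical threshold

def on_audio_frame(frame, stream_to_cloud):
    """Called for every incoming audio frame from the microphone."""
    ring_buffer.extend(frame)                 # keep only the most recent audio
    if wake_word_detected(list(ring_buffer)):
        # Only now does audio leave the device for cloud ASR/NLU.
        stream_to_cloud(list(ring_buffer))
        ring_buffer.clear()
```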


“However, for the Orion mission, our ability to communicate with the Alexa cloud was severely constrained,” says Lantin. “NASA’s spacecraft uses the Deep Space Network to communicate with Earth. The bandwidth available to us on the downlink connection is slightly better than dial-up modem speeds, with latencies of up to five seconds. To further complicate matters, NASA prioritizes navigation and telemetry traffic as the primary payload; traffic for Alexa was consigned to the secondary payload.”
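
To get a feel for what those numbers imply, here is a rough back-of-the-envelope estimate of the round-trip time for a short voice request over such a link. The link rate and codec bitrate below are illustrative assumptions, not actual Deep Space Network figures; only the five-second latency comes from the quote above.

```python
# Back-of-the-envelope estimate of round-trip time for a short voice request
# over a constrained downlink. Numbers are illustrative assumptions.
link_rate_bps = 56_000          # "slightly better than dial-up" ~ 56 kbps (assumed)
one_way_latency_s = 5.0         # worst-case latency quoted in the article
clip_seconds = 3.0              # a short spoken command (assumed)
codec_bitrate_bps = 16_000      # a low-bitrate compressed-speech codec (assumed)

payload_bits = clip_seconds * codec_bitrate_bps
transmit_s = payload_bits / link_rate_bps
round_trip_s = 2 * one_way_latency_s + transmit_s

print(f"~{transmit_s:.1f} s to transmit, ~{round_trip_s:.1f} s round trip")
```

Even under these generous assumptions, a single request-response cycle takes on the order of ten seconds, which is why a cloud-only design was not practical.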

The team also wanted to demonstrate a fully autonomous experience, one that could be used in future missions where Earth connectivity is no longer a practical option for real-time communications. They used Alexa Local Voice Control to get around the limited Internet connectivity. Alexa Local Voice Control allows select devices to process voice commands locally, rather than sending information to the cloud.
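
The idea behind local processing can be sketched as a local-first intent router: a small set of utterances is answered entirely on the device, and the cloud is used only when a link is available. This is a simplified illustration, not Alexa Local Voice Control's actual API; the intent table and handler functions are hypothetical, and the example questions are the ones quoted earlier in the article.

```python
# Sketch of a local-first routing idea: handle a known set of intents on the
# device and fall back to the cloud only when connectivity allows.
# Intent names, handlers, and responses below are hypothetical.

def read_local_telemetry(field):
    """Hypothetical hook into onboard data available without a downlink."""
    return f"<{field} from onboard telemetry>"

LOCAL_INTENTS = {
    "how far to the moon": lambda: "About 239,000 miles on average.",
    "how fast is orion going": lambda: read_local_telemetry("velocity"),
}

def send_to_cloud(text):
    """Placeholder for the usual cloud ASR/NLU round trip."""
    return f"<cloud response to '{text}'>"

def handle_utterance(text, cloud_available):
    handler = LOCAL_INTENTS.get(text.lower().strip("?"))
    if handler is not None:
        return handler()                      # answered entirely on device
    if cloud_available:
        return send_to_cloud(text)            # normal cloud path when a link exists
    return "Sorry, I can't reach the network right now."

print(handle_utterance("How far to the moon?", cloud_available=False))
```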

Lantin says that while the team was motivated by demonstrating technology leadership and scientific innovation in a very challenging environment, the real motivator was making a difference in the lives of millions of customers at home on Earth.

“At Amazon, we take pride in delivering customer-focused science,” says Lantin. “That was a huge motivator for us at every step along the way. Consider the innovations we drove to Alexa Local Voice Control. These improvements will allow people on earth to do so much more with Alexa in situations where they have limited or no Internet connectivity. Think about when you are in a car and passing through a tunnel, or driving to a remote camping site. You can do things like tune the radio, turn on the AC and continue to use voice commands, even if you have a feeble signal or no cellular connection.”

Lantin says that the acoustic innovations enabled for Orion will also translate directly into improved listening experiences for people interacting with the mission on Earth.


“We are planning to have celebrities, politicians, STEM students and a variety of other personalities interacting with Alexa,” says Lantin. “And so, we also spent a good deal of time thinking about what people might want to ask Alexa about during the mission.”

The nuances of acoustics aboard Orion

Scott Isabelle is a solutions architect at Amazon. Prior to Amazon, Isabelle was a distinguished member of the technical staff at Motorola, where among other projects, he developed systems for enhancing voice quality in mobile devices, methods for generating adaptive ringtones, and a two-microphone system for noise suppression.

“One of the most important things for a voice AI is being in an environment where it is able to pick up your voice,” says Isabelle.


However, this is easier said than done on Orion, where the conical shape of the space capsule and its metallic surfaces result in increased reverberation.

“The voice can keep bouncing around, losing very little energy. This wouldn’t happen in a typical room, where soft materials like curtains and sofa cushions can absorb some of the sound. In the capsule, the reverberations off the metal surfaces can distort the frequencies that are critical to automatic speech recognition. This can make it really difficult for Alexa to pick up wake word invocations.”
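
One common way to reason about this effect is to treat the room (or capsule) as a filter and convolve clean speech with an impulse response whose decay time reflects how reverberant the space is. The sketch below uses a synthetic, exponentially decaying impulse response and made-up decay times; it illustrates the principle, not the actual acoustics of Orion's cabin.

```python
# Toy simulation of reverberation: convolve clean audio with a synthetic,
# exponentially decaying impulse response. A hard-walled metal cabin keeps
# reflections alive longer (a longer decay), which smears the wake word.
import numpy as np

def synthetic_rir(rt60_s, sample_rate=16_000):
    """Random impulse response whose amplitude decays 60 dB over rt60_s seconds."""
    n = int(rt60_s * sample_rate)
    t = np.arange(n) / sample_rate
    decay = 10 ** (-3.0 * t / rt60_s)         # -60 dB in amplitude at t = rt60_s
    return np.random.randn(n) * decay

def reverberate(clean, rt60_s, sample_rate=16_000):
    rir = synthetic_rir(rt60_s, sample_rate)
    wet = np.convolve(clean, rir)[: len(clean)]
    return wet / (np.max(np.abs(wet)) + 1e-12)

# Assumed decay times: a soft-furnished room might sit near 0.3 s, while a
# bare metal enclosure could be substantially longer.
speech = np.random.randn(16_000)              # stand-in for one second of speech
living_room = reverberate(speech, rt60_s=0.3)
metal_cabin = reverberate(speech, rt60_s=1.0)
```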

Alexa also has to contend with increased noise levels aboard Orion.


The ideal signal-to-noise ratio (SNR) for systems involving intelligent voice assistants is in the range of 20 to 30 decibels (dB). To place this in context, an SNR of 35 dB is what you would find in a face-to-face conversation between two people standing one meter apart in a typical room (higher SNRs are better). However, the SNR onboard the Orion capsule can be much lower than 20 dB, posing an acoustic challenge.
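
For reference, SNR in decibels is ten times the base-10 logarithm of the ratio of signal power to noise power. The short sketch below computes it for two synthetic noise levels chosen to land near the figures quoted above; the signals are placeholders, not recordings from the capsule.

```python
# SNR in dB: 10 * log10(signal power / noise power). Signals are placeholders.
import numpy as np

def snr_db(signal, noise):
    signal_power = np.mean(np.asarray(signal, dtype=float) ** 2)
    noise_power = np.mean(np.asarray(noise, dtype=float) ** 2)
    return 10.0 * np.log10(signal_power / noise_power)

rng = np.random.default_rng(0)
speech = rng.normal(0.0, 1.0, 16_000)              # unit-power stand-in for speech
quiet_room_noise = rng.normal(0.0, 0.02, 16_000)   # yields roughly 34 dB SNR
capsule_noise = rng.normal(0.0, 0.2, 16_000)       # roughly 14 dB, below the 20 dB target

print(f"quiet room: {snr_db(speech, quiet_room_noise):.1f} dB")
print(f"capsule:    {snr_db(speech, capsule_noise):.1f} dB")
```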

To enhance the comfort of astronauts during crewed missions, NASA would ordinarily install acoustic blankets to damp the reverberation in the hard-walled cabin and absorb some of the noise created by engines and pumps.

“However, because this is an uncrewed mission we have to work within an environment with more reverberation and noise than we would like,” says Isabelle.


There’s another challenge that results from the lack of humans on board. For Orion, commands to Alexa have to be sent from ground control. The low-bandwidth connections used for the transmission make it challenging to carry voices across the wide range of frequencies essential for differentiating between sounds.

During a typical phone call, our voice is transmitted in the narrow band, which ranges from 300 Hz to 3,000 Hz. For Alexa to make out individual words in the noisier environment of the space capsule, the voice would have to be transmitted at frequencies up to 8,000 Hz.
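
The difference between the two bands can be illustrated with simple filters: a telephone-style band-pass from roughly 300 Hz to 3,000 Hz versus a low-pass that keeps content up to just under 8,000 Hz. This assumes NumPy and SciPy are available and uses noise as a stand-in for speech; it is only meant to show the bandwidth gap, not the mission's actual audio chain.

```python
# Narrowband (300-3,000 Hz) versus wideband (up to ~8,000 Hz) filtering,
# using Butterworth filters from SciPy. The input here is just noise.
import numpy as np
from scipy.signal import butter, sosfiltfilt

SAMPLE_RATE = 16_000    # wideband speech is conventionally sampled at 16 kHz

def narrowband(audio):
    """Telephone-style band: roughly 300 Hz to 3,000 Hz."""
    sos = butter(4, [300, 3000], btype="bandpass", fs=SAMPLE_RATE, output="sos")
    return sosfiltfilt(sos, audio)

def wideband(audio):
    """Keep content up to just under the 8 kHz Nyquist limit."""
    sos = butter(4, 7000, btype="lowpass", fs=SAMPLE_RATE, output="sos")
    return sosfiltfilt(sos, audio)

audio = np.random.randn(SAMPLE_RATE)          # one second of stand-in audio
nb, wb = narrowband(audio), wideband(audio)
```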

“Voice commands from mission control are transmitted to Alexa via a speaker,” says Isabelle. “Flight-qualified speakers are typically designed for narrow-band communications. And so for this mission we had to find a speaker that could handle the wider band and still operate in the flight environment.”


The team relied on what Isabelle calls “brute force” to overcome these acoustic challenges.


“We designed the speaker playback system to play at extremely loud volumes, which allowed us to increase the SNR to where we wanted it to be.”

The team also took advantage of the physical form factor of Alexa on board to overcome the challenges presented by the noisy environment. The speakers, the light ring and the microphones in the briefcase-like enclosure for Alexa are close to each other, which allows acoustic engineers to overcome some of the obstacles presented by the background noise and reverberation.

Finally, the team deployed two microphones in combination with an array processing algorithm, which combined the signals from the two microphones in a way that helps Alexa make sense of the commands being issued from mission control. Because the speakers and microphones are in fixed positions relative to each other, unlike in a room where people can be almost anywhere, the algorithms could be more easily designed to distinguish speech from the surrounding noise.
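
A delay-and-sum beamformer is one simple example of the kind of two-microphone array processing described here; the article does not specify which algorithm the team used. Because the loudspeaker sits at a fixed, known distance from each microphone, the alignment delay can be computed once rather than estimated on the fly. The geometry and signals below are made up for illustration.

```python
# Minimal delay-and-sum sketch for combining two microphone signals when the
# source position (the onboard loudspeaker) is fixed and known.
import numpy as np

SAMPLE_RATE = 16_000
SPEED_OF_SOUND = 343.0        # meters per second, at room temperature

def delay_and_sum(mic1, mic2, extra_path_m):
    """Align mic2 to mic1 given the extra distance sound travels to reach mic2."""
    delay_samples = int(round(extra_path_m / SPEED_OF_SOUND * SAMPLE_RATE))
    aligned = np.zeros_like(mic2)
    if delay_samples > 0:
        aligned[:-delay_samples] = mic2[delay_samples:]   # advance mic2 by the known delay
    else:
        aligned[:] = mic2
    # Speech from the known speaker direction adds coherently, while diffuse
    # noise and reverberation are only partly correlated and average down.
    return 0.5 * (mic1 + aligned)

# Hypothetical setup: the second microphone is 5 cm farther from the loudspeaker.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
source = np.sin(2 * np.pi * 440 * t)                      # stand-in for the command audio
lag = int(round(0.05 / SPEED_OF_SOUND * SAMPLE_RATE))     # ~2 samples at 16 kHz
mic1 = source + 0.3 * np.random.randn(SAMPLE_RATE)
mic2 = np.roll(source, lag) + 0.3 * np.random.randn(SAMPLE_RATE)
enhanced = delay_and_sum(mic1, mic2, extra_path_m=0.05)
```

With the delay fixed in advance, the combiner needs no direction-of-arrival estimation, which is part of why a known, rigid layout makes the problem easier than a living room full of moving talkers.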


While the Orion mission will not have any crew members on board, it will lay the groundwork for Alexa to be integrated into future crewed missions to the moon, Mars, and beyond. Having Alexa on board those missions would allow crew members to be more efficient in day-to-day tasks and to benefit from comforts such as playing relaxing music and keeping in touch with family and friends back home.

Future crewed missions would present their own unique set of challenges, with Alexa having to respond to commands from astronauts who might (literally) be free-floating at multiple points within the capsule. Isabelle and Lantin are already looking forward to overcoming the challenges posed by crewed missions.

“For someone who grew up watching Star Trek, working on this project has been a dream come true,” says Lantin. “It’s great to be able to build the future. But it’s just as exciting to be able to draw on all of this great work, and be able to enjoy all these new Alexa capabilities during my next vacation, and my day-to-day life right here at home.”

Editor's note

This is a reprint of an article that initially ran on the Alexa Skills Kit Blog. To learn more about the technical innovations that helped get Alexa into space and some inspiring facts about the Artemis I mission, visit the Skills Kit blog.
