re:MARS revisited: Optimizing AI/ML workloads for sustainability
Session focused on tips and tools that can help customers reduce the carbon footprint of artificial intelligence and machine learning workloads.
In June 2022, Amazon re:MARS, the company’s in-person event that explores advancements and practical applications within machine learning, automation, robotics, and space (MARS), took place in Las Vegas. The event brought together thought leaders and technical experts building the future of artificial intelligence and machine learning and included keynote talks, innovation spotlights, and a series of breakout session talks.
Now, in our re:MARS revisited series, Amazon Science is taking a look back at some of the keynotes and breakout session talks from the conference. We've asked presenters three questions about their talks, and we provide the full video of their presentations.
On June 27, Amogh Gaikwad, solutions developer with Amazon Web Services (AWS), presented the talk "Optimizing AI/ML workloads for sustainability." His session focused on best practices for efficiently retraining multiple machine learning models using minimal computational resources and computationally efficient built-in algorithms.
What was the central theme of your presentation?
Building and training high-accuracy machine learning models can be an energy-intensive process, demanding large computational resources. This session explores guidance from the sustainability pillar of the AWS Well-Architected Framework to reduce the carbon footprint of AI/ML workloads.
This guidance covers best practices for efficiently retraining multiple models with minimal computational resources and for using computationally efficient built-in algorithms. Additionally, customers can learn about the AWS tools available for monitoring models during training and deployment.
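The talk itself details the AWS-specific recommendations; as one generic illustration of trimming retraining compute (not an AWS API, and not drawn from the session), early stopping halts a training loop once the model stops improving, so later epochs do not burn energy for no accuracy gain. The function and loss values below are hypothetical:

```python
# Illustrative sketch (not an AWS API): early stopping cuts wasted
# training epochs, one common way to spend less compute when a model
# must be retrained repeatedly.

def train_with_early_stopping(val_losses, patience=3):
    """Stop once validation loss has not improved for `patience`
    consecutive epochs; return (epochs actually run, best loss)."""
    best_loss = float("inf")
    epochs_without_improvement = 0
    epochs_run = 0
    for loss in val_losses:
        epochs_run += 1
        if loss < best_loss:
            best_loss = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # further epochs would consume compute for no gain
    return epochs_run, best_loss

# Simulated validation losses: improvement stalls after epoch 5,
# so the loop ends after 8 of the 10 scheduled epochs.
losses = [0.9, 0.7, 0.55, 0.5, 0.48, 0.49, 0.50, 0.49, 0.51, 0.52]
epochs, best = train_with_early_stopping(losses, patience=3)
```

The same idea underlies the stopping criteria built into many training frameworks; the saved epochs translate directly into less energy consumed per retraining run.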
In what applications do you expect this work to have the biggest impact?
This guidance will have the biggest impact on machine learning applications that require large, energy-intensive computational resources. It also applies to applications where the focus is on reducing carbon emissions and designing machine learning workloads with sustainability in mind.
What are the key points you hope audiences take away from your talk?
- How to design ML workloads using the Well-Architected machine learning lifecycle and sustainability best practices
- How to optimize resources for developing, training, and tuning ML models
- How to reduce the environmental impact of machine learning workloads in production
- Knowledge of AWS tools for monitoring machine learning workloads