
How AWS sees the AI landscape for sustainability evolving

Listen: How AWS sees the AI landscape for sustainability evolving

In this episode of the ESG Insider podcast, we explore the role artificial intelligence can play in advancing sustainability outcomes — and how the energy demands from generative AI programs could change over time. 

We talk with Hussein Shel, Chief Technologist and Head of Upstream Digital Transformation, Energy and Utility at Amazon Web Services (AWS), a cloud-computing and technology services company and a subsidiary of Amazon. 

AI has been a major focus at sustainability events throughout 2024 and will be a topic at the UN’s COP29 climate change conference in Baku, Azerbaijan, which begins Nov. 11.  

In the interview, Hussein explains how AWS is leveraging AI, machine learning and more efficient computing hardware to address sustainability challenges, particularly in optimizing energy usage and integrating renewables onto the grid.  

"Most of these models are getting more and more optimized,” Hussein says. “They're becoming more and more intelligent ... reducing potentially the consumption of energy needed to retrain." 

This interview took place on the sidelines of The Nest Climate Campus, where ESG Insider was an official podcast during Climate Week NYC. 

Listen to our interview with the head of the Electric Power Research Institute on how AI is driving up electricity demand here.

This piece was published by S&P Global Sustainable1, a part of S&P Global.    

Copyright ©2024 by S&P Global 

DISCLAIMER 

By accessing this Podcast, I acknowledge that S&P GLOBAL makes no warranty, guarantee, or representation as to the accuracy or sufficiency of the information featured in this Podcast. The information, opinions, and recommendations presented in this Podcast are for general information only and any reliance on the information provided in this Podcast is done at your own risk. This Podcast should not be considered professional advice. Unless specifically stated otherwise, S&P GLOBAL does not endorse, approve, recommend, or certify any information, product, process, service, or organization presented or mentioned in this Podcast, and information from this Podcast should not be referenced in any way to imply such approval or endorsement. The third party materials or content of any third party site referenced in this Podcast do not necessarily reflect the opinions, standards or policies of S&P GLOBAL. S&P GLOBAL assumes no responsibility or liability for the accuracy or completeness of the content contained in third party materials or on third party sites referenced in this Podcast or the compliance with applicable laws of such materials and/or links referenced herein. Moreover, S&P GLOBAL makes no warranty that this Podcast, or the server that makes it available, is free of viruses, worms, or other elements or codes that manifest contaminating or destructive properties. 

S&P GLOBAL EXPRESSLY DISCLAIMS ANY AND ALL LIABILITY OR RESPONSIBILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF ANY INDIVIDUAL'S USE OF, REFERENCE TO, RELIANCE ON, OR INABILITY TO USE, THIS PODCAST OR THE INFORMATION PRESENTED IN THIS PODCAST.


Transcript provided by Kensho.

Lindsey Hall: Hi. I'm Lindsey Hall, Head of Thought Leadership at S&P Global Sustainable1.

Esther Whieldon: And I'm Esther Whieldon, a Senior Writer on the Sustainable1 Thought Leadership team.

Lindsey Hall: Welcome to ESG Insider, an S&P Global podcast, where Esther and I take you inside the environmental, social and governance issues that are shaping the rapidly evolving sustainability landscape.

Esther Whieldon: In just a few days, officials from countries and nonprofits around the world will convene for the UN's big climate change conference, COP29, in Baku, Azerbaijan. This gathering of the Conference of the Parties runs November 11 through 22 and includes a thematic day focused on science, technology and innovation.

That day will explore how to leverage digital tools for climate action while also mitigating their environmental impact. Artificial intelligence is one technology that has been a major focus at sustainability events throughout this year.

Lindsey Hall: Earlier this year on the podcast, we heard from the president of the Electric Power Research Institute about how generative AI is driving a rapid increase in electricity demand that could result in higher emissions. That's because in addition to increasing renewable generation, utilities are also turning to fossil fuel generation to keep up with this demand. Much of this new demand is coming from data centers. AI continued to generate buzz at Climate Week NYC as many firms touted the technology as an enabler of sustainable outcomes.

Esther Whieldon: In today's episode, we'll explore the climate-related opportunities and challenges that AI presents with Hussein Shel, chief technologist and head of upstream digital transformation, energy and utility at Amazon Web Services or AWS.

AWS is a cloud computing and technology services company and is a subsidiary of Amazon. We'll also hear how AWS is developing specialized hardware that allows for more efficient computing. I sat down with Hussein during Climate Week NYC on the sidelines of the Nest Climate Campus where ESG Insider was an official podcast. Hussein starts off by describing the different machine learning and AI technologies.

Hussein Shel: I'll start with just the differentiation between traditional AI and machine learning, which has been going on for decades, and generative AI, which is where most of the hype you hear today comes from, with some of the famous applications like ChatGPT, et cetera.

Machine learning and AI, we've been using them for decades. Every day, you use AI without even realizing it, whether you're shopping on Amazon.com and you get a recommendation, or when you're using your cell phone and any assistant that's on it. All of these machine learning models and applied AI have been used in public for many years.

Generative AI is a different discipline within artificial intelligence and machine learning that focuses on generating new content based on very large language models. That content could be text, audio, video or pictures, created simply by transforming text into the media that you want.

What we're doing is we're doing a lot of work with customers. Most of it is experimental today because people are trying to understand what is possible with this technology. Some of it is actually combining your traditional machine learning models and AI along with the new applications of generative AI.

What I mean by that is, for example, some applications we're doing around leveraging synthetic data, i.e., fake data, to augment real-life sensor data, or any kind of data, in order to improve on existing predictive models using machine learning. So that's a combination of both generative AI and the applied machine learning that we've been using for decades.
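As an editorial illustration of the synthetic-data augmentation idea Hussein describes here, the sketch below pads a small set of hypothetical sensor readings with synthetic samples before training a standard predictive model. The sensor values are invented and a simple Gaussian perturbation stands in for a real generative model; this is a sketch of the general technique, not AWS code.

```python
# Editorial sketch: augmenting scarce real sensor data with synthetic samples
# before training a predictive model. All values are hypothetical, and a
# Gaussian perturbation stands in for a trained generative model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Pretend we only have 200 real readings of (pressure, temperature) -> output.
real_X = rng.normal(loc=[50.0, 300.0], scale=[5.0, 20.0], size=(200, 2))
real_y = 0.8 * real_X[:, 0] + 0.1 * real_X[:, 1] + rng.normal(0.0, 1.0, 200)

# "Generate" synthetic readings by perturbing real samples within plausible
# bounds -- a stand-in for sampling from a generative model fit to the data.
idx = rng.integers(0, len(real_X), size=800)
synth_X = real_X[idx] + rng.normal(0.0, [1.0, 4.0], size=(800, 2))
synth_y = real_y[idx] + rng.normal(0.0, 0.5, size=800)

# Train the traditional predictive model on the augmented data set.
X = np.vstack([real_X, synth_X])
y = np.concatenate([real_y, synth_y])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.2f}")
```

In practice the synthetic samples would come from a model trained to match the real data's distribution, but the training step on the combined data set would look much the same.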

What we're doing in AWS, specifically around the energy demand: our mission is to provide the cleanest data centers for cloud services, both for our customers and for our own operations. What we're doing in that space is focusing heavily on investments in improving our infrastructure and in improving how efficiently we use energy in our data centers.

All of that is powered by machine learning and AI models, by the way, as is the broader Amazon: in our fulfillment centers, all of the robots that move packages around are basically run by machine learning models. We are also doing some work around water stewardship, where we want to be a water-positive company by the end of 2030, and we're about 41% of the way there.

We have been the largest corporate purchaser of renewable energy for the last four consecutive years. We have just met our goal of 100% renewable energy matching for all of our data centers, which was supposed to happen by 2030; we got there back in '23, so we're seven years ahead. All of that translates into sustainable operations for us and for our customers.

Esther Whieldon: Hussein outlined some of the ways AWS is innovating to make machine learning and AI models run more efficiently.

Hussein Shel: I'll talk a little bit about our infrastructure investment when it comes to how do you run and train these models that are used by our customers and partners today. It's fundamentally important to understand that a significant amount of compute is required to process and train these large models. That's basically where most of the energy demand is in a typical data center.

What we are doing when it comes to that is building our own custom silicon. In '22 we announced our Trainium, which is a chipset tailored and optimized for running and training machine learning and AI models with 30% better energy efficiency at a much lower cost.

Our Trainium 2 aims for 4x the performance, meaning far less compute time is needed, with a 2x improvement in energy efficiency. Those are fundamental improvements, and they mean that when our customers use these applications, the carbon footprint is not as significant as it would be on alternative hardware.

Now on the consumer side, or on the customer side, think of a scenario where we're working with a customer who is optimizing a particular asset. That asset could be a renewable asset such as solar, or any kind of hardware producing some sort of energy.

All of these assets have sensor data coming in, measuring pressure and temperature and all kinds of readings. A lot of times, there's not enough information generated by these assets, or not enough clean data coming out of them.

Some of it gets corrupted, some of it gets lost. Generative AI capabilities allow you to take some of this data that is based on real-life parameters and augment it with synthetic, fake data that you can generate from these models, as close to reality as you can get.

Now you have enough data to train your traditional machine learning models to be able to predict better or create more optimizations, et cetera. Some of the applications we're also looking into are around optimizing power flow for energy consumption when it comes to grid optimization.

Esther Whieldon: Like on transmission lines?

Hussein Shel: Exactly. Looking at data, understanding supply and demand, and letting a machine-based model provide insights that are very hard for a human to reach on their own. The same goes for integrating renewable assets into the grid. A lot of it requires simulation, and that simulation requires lots of compute and data that could potentially be missing. You can combine both traditional and new technologies to bring that together and come up with insights.
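For readers who want a concrete sense of what "optimizing power flow" can mean in its simplest form, here is an editorial sketch of a toy economic dispatch: choosing how much each generator should produce to meet forecast demand at lowest cost, with renewable output capped by its forecast. The generator names, costs and capacities are hypothetical, and real grid optimization adds network constraints, losses and uncertainty that this leaves out.

```python
# Editorial sketch: a toy economic dispatch for one hour, choosing generator
# output to meet forecast demand at minimum cost. Hypothetical numbers only.
from scipy.optimize import linprog

# Marginal cost ($/MWh) and available capacity (MW) for three generators:
# wind (limited by its forecast), a gas turbine, and a coal plant.
costs = [0.0, 45.0, 60.0]
capacity = [120.0, 200.0, 300.0]
demand = 350.0  # forecast load for the hour (MW)

# Minimize total cost subject to: sum(output) == demand, 0 <= output <= capacity.
result = linprog(
    c=costs,
    A_eq=[[1.0, 1.0, 1.0]],
    b_eq=[demand],
    bounds=list(zip([0.0] * 3, capacity)),
)
for name, mw in zip(["wind", "gas", "coal"], result.x):
    print(f"{name}: {mw:.0f} MW")
```

Running it allocates the full wind forecast first because it is the cheapest source, then fills the remaining demand from the thermal units.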

Esther Whieldon: So this challenge of integrating renewables into the grid, AI can basically help solve that problem?

Hussein Shel: Absolutely. And I don't want to limit it to generative AI in particular; I think we're going to need all kinds of machine learning and AI capabilities in order to help solve that problem. We already do some of that. We're working with customers on the utility side to look at 24/7 matching, meaning how they can provide the reporting and the means for their customers to know that all of their power came from renewable sources.

That uses a lot of machine learning, and it allows these companies to provide their customers with reports that say: here's the lineage of the data that came in and the power that you've consumed, and here are the models predicting what you're going to consume in the next day or week or month, whatever the case.
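As a small editorial illustration of the 24/7 matching idea, the sketch below compares hourly matching, where each hour's load can only be covered by carbon-free supply in that same hour, with conventional annual volumetric matching. The load and supply series are randomly generated placeholders, not real utility data.

```python
# Editorial sketch: hourly (24/7) carbon-free energy matching vs. annual matching.
# For each hour, count only the portion of load actually covered by carbon-free
# supply in that same hour. Hypothetical data throughout.
import numpy as np

rng = np.random.default_rng(7)
hours = 24 * 365
load_mwh = rng.uniform(80, 120, hours)                    # hypothetical hourly load
cfe_mwh = np.clip(rng.normal(100, 40, hours), 0, None)    # hypothetical clean supply

matched = np.minimum(load_mwh, cfe_mwh)                   # surplus in one hour cannot
hourly_score = matched.sum() / load_mwh.sum()             # offset a deficit in another

annual_score = min(cfe_mwh.sum() / load_mwh.sum(), 1.0)   # conventional annual matching
print(f"24/7 hourly matching: {hourly_score:.0%}")
print(f"Annual volumetric matching: {annual_score:.0%}")
```

With the same totals, the hourly score is typically lower than the annual one, which is why 24/7 matching is generally considered the stricter reporting standard.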

Esther Whieldon: One of the things I've heard around AI, and I think you touched on this a little here, is the concern that carbon footprints will really go up because of how much energy is being used, and because of how much natural gas generation may have to be built versus renewables or other sources. It sounds like AI helps with that to some extent. Talk to me about how it's going to play out going forward.

Hussein Shel: Yes, so we'll talk about what's going on today and where I believe things are going in the future, given the accelerated improvement in algorithms and in the way these models are being built, designed and run on infrastructure that requires energy. Today, as I mentioned, all of our data centers are running on renewable energy sources, and those come from different sources.

We are also investing in our silicon so that we can reduce the footprint by improving performance and reducing costs for our customers. Typically, those models are not running those large cycles of compute continuously; you simply run them when you need to train a very large model, so you run it for a week or two weeks.

What's going on, and what's going to happen in the near future, is that most of these models are getting more and more optimized, meaning that as you throw more compute and more data at them, they're actually optimizing how much they should run and how little they should run based on the new data that you provided. They're becoming more and more intelligent about how they run, therefore, reducing potentially the consumption of energy needed to retrain.

Most of these large language models today, at least the latest versions, are trained on internet-scale data. There's not much more data you can go after. If you think about it, we're going to get to a point where a lot of that data is going to be very specific to the domain that these models are going to be trained on, meaning you're going to have more specialization, more fine-tuning.

There's even now the concept of small language models that focus on very specific domains. The compute requirements, in my opinion, are decreasing as the optimization of these models increases in the field, and that, of course, lowers the energy demand and the carbon footprint required, even as we improve on the hardware itself.

Esther Whieldon: Is there anything we didn't get to that you wanted to make sure I mention?

Hussein Shel: No, I think this is great. I'm really excited about what we're going to do with our customers and where they're going to take us. Like I said, there's a lot of experimentation. There's a lot of learning. I encourage everyone to learn and understand what this technology can do.

Because I believe that in the next five to 10 years, based at least on the orders-of-magnitude improvement in these algorithms that we've seen in the last three years, going from a toddler level of thinking and reasoning to almost a PhD level, passing the LSAT and very difficult tests, those technologies are going to become part of our day-to-day lives, and I can't wait to see that happening.

Esther Whieldon: Yes. It's like this frontier that we know is happening, but we're not quite sure where it's going to go, right?

Hussein Shel: Right. I look forward to seeing that, especially in the sustainability angle and in climate.

Esther Whieldon: Maybe one other thing. Are there any ways we haven't talked about where you can see this technology helping with sustainability issues, such as helping companies handle information and reporting, or any other areas?

Hussein Shel: Yes. No, absolutely. Just like with all the other technologies that exist today, whether it's high-performance computing or simulation, I think with AI you're going to be able to throw more data at the problem, because these models are built on very large, complex data sets in a way we've never done before. Think about all of the research that has been done, all of the information that it's simply impossible for our climate scientists to sift through on their own.

Imagine these technologies becoming your research buddy in the palm of your hand, able to reason and think about the different data that's involved when it comes to climate change and sustainability. It's going to, again, help in simulation, in bringing together different data sets, structured and unstructured, and in creating networks of connections and maps that we've never been able to build with human intelligence alone.

Esther Whieldon: Then on the converse side of this, are there any safeguards or things that we need to be super mindful of as we go into this new frontier?

Hussein Shel: Absolutely. We need to think about responsible AI and about providing access so that marginalized communities and individuals are able to use this technology the same way as everyone else. At AWS, we take a very personal approach to responsible AI.

We work with many entities, whether it's the White House, the Responsible AI Institute or many of our partners, so that we can help shape whatever compliance or regulatory conditions are needed, and we can continue to do that. And through our tools themselves, you're able to get transparency into these models.

They're not black boxes. You can figure out if a model is introducing biases; you can identify those with our tools. You can look at how to improve the implementation of these models and the chain of thought that they go through. That's very important, so that it's very transparent and available for our customers to see.

Esther Whieldon: Yes, you can gut check it a little bit to make sure it's not going in a direction you don't want it to go.

Hussein Shel: I think, personally, you're always going to have that human intervention in the loop. It's not going to be an autonomous thing going on its own. There are checks and balances that can be put within the technology to go do that. I look forward to seeing more and more of that.

Esther Whieldon: Great. Well, thank you so much for taking the time to talk with me.

Hussein Shel: Thank you so much. I appreciate it.

Esther Whieldon: Today, we heard how AWS is developing specialized hardware that allows for more efficient computing. Hussein also mentioned that machine learning and AI models will become smarter and more optimized over time, which could reduce energy demands. Lindsey, I found this interesting because while AI is driving up demand, it won't necessarily do so indefinitely.

Lindsey Hall: AI and its impact on sustainability strategies is a topic we'll continue to track on this podcast. Please stay tuned as next week, we'll learn about key outcomes from the UN's big Biodiversity Conference, COP 16, which wrapped up in Cali, Colombia at the beginning of November.

Thanks so much for listening to this episode of ESG Insider. If you like what you heard today, please subscribe, share and leave us a review wherever you get your podcasts.

Esther Whieldon: And a special thanks to our agency partner, The 199. See you next time.
