Claude, Replika, Copilot, or, most notably, ChatGPT: these names are hardly unfamiliar. In fact, ChatGPT reached 100 million users within two months of its launch, in early 2023, a record-breaking pace of user growth at the time, and it generated around 1.2 billion dollars in revenue during its first year. As of 2024, ChatGPT has 250 million weekly active users. ChatGPT is just one of many artificial intelligence chatbots: a deep learning model trained through machine learning to respond to inputs given by the user. People use ChatGPT to write emails, provide customer service, or explain complex ideas, and they are drawn to these programs for their effectiveness and efficiency, often saving minutes or even hours on a task. It is evident that ChatGPT, among other AI applications, is becoming more and more integrated into society.
Despite how time-efficient these AI models seem, most people do not consider the resources required to operate them. Machine learning is a highly energy-intensive process, since it is extremely difficult for a machine to imitate human intelligence and behavior. Machine learning models train on human-produced pictures, text, and numbers to learn prediction and pattern recognition. "A lot of energy is consumed when you're iterating over the same dataset multiple times," says Frank Schirrmeister, a vice president at Arteris, a system software and technology firm. Each time an AI model is trained, it consumes an enormous amount of energy. For example, training GPT-3 was estimated to consume around 1,300 megawatt-hours (MWh) of electricity, roughly what 130 US homes consume in a year. That is also about 1.625 million hours of Netflix streaming (at roughly 0.0008 MWh per hour of streaming), the equivalent of 185.5 years of continuous viewing. On top of that, training these chatbots is not a one-time event; it is an ongoing process, with models retrained over and over as developers refine them toward "perfection."
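To make the arithmetic behind these comparisons explicit, here is a minimal back-of-envelope sketch in Python. The 10 MWh-per-year household figure is an assumption inferred from the essay's own "130 homes" comparison; the other numbers are the estimates cited above.

# Back-of-envelope check of the energy comparisons above.
GPT3_TRAINING_MWH = 1_300        # estimated energy to train GPT-3
HOME_ANNUAL_MWH = 10             # assumed annual electricity use of one US home
NETFLIX_MWH_PER_HOUR = 0.0008    # estimated energy per hour of streaming

homes = GPT3_TRAINING_MWH / HOME_ANNUAL_MWH
netflix_hours = GPT3_TRAINING_MWH / NETFLIX_MWH_PER_HOUR
netflix_years = netflix_hours / (24 * 365)

print(f"{homes:.0f} US homes powered for a year")          # 130
print(f"{netflix_hours:,.0f} hours of Netflix streaming")  # 1,625,000
print(f"{netflix_years:.1f} years of nonstop viewing")     # 185.5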
After training, these companies release their models to be used by people worldwide. This stage is also energy-intensive because it requires computing resources, usually housed in large data centers. Although inference (the process by which the AI uses what it has learned to produce a response) consumes far less energy than training, the sheer volume of users, such as ChatGPT's 250 million weekly active users, drives AI's high energy demand. The number of large data centers is growing significantly alongside the adoption of AI. These facilities run tens of thousands of servers at a time, which require extensive cooling mechanisms; this adds further energy consumption and excessive water use to maintain those environments. Hyperscale data centers use as much power annually as 400,000 electric cars, contributing to a large carbon footprint and to water scarcity.
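The scale effect is easy to illustrate. The sketch below multiplies a hypothetical per-query inference cost by the user base cited above; the per-query energy and queries-per-user figures are illustrative assumptions, not measured values.

# Illustrative only: small per-query costs add up at scale.
WEEKLY_USERS = 250_000_000         # weekly active users (cited above)
QUERIES_PER_USER_PER_WEEK = 10     # hypothetical usage rate
WH_PER_QUERY = 0.3                 # hypothetical energy per query, in watt-hours

weekly_mwh = WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK * WH_PER_QUERY / 1_000_000
annual_mwh = weekly_mwh * 52

print(f"{weekly_mwh:,.0f} MWh per week")   # 750
print(f"{annual_mwh:,.0f} MWh per year")   # 39,000, roughly 30x the GPT-3 training estimate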
Despite all the current downsides, large companies are working to mitigate the unnecessary resource consumption of AI and data centers. Microsoft and Google are both striving to implement 100% sustainable energy sources and closed-loop water systems to minimize waste and carbon emissions. Google has taken a further step by, ironically, using AI itself: with its DeepMind model, it applied machine learning to automatically optimize data center cooling, reducing the energy used for cooling by 40%, even as its total computing power grew by 350%. Machine learning is being used in several other ways to march toward a sustainable future, so the question is: is it worth the risk?
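Taken together, those two figures imply a dramatic drop in energy per unit of computing. The sketch below composes them at face value, reading "grew by 350%" as 4.5 times the original compute; since the two percentages come from different measurements, treat the result as illustrative only.

# Illustrative composition of the two headline figures above.
energy_factor = 1 - 0.40     # cooling energy falls to 60% of baseline
compute_factor = 1 + 3.50    # computing power rises to 4.5x baseline

energy_per_unit_compute = energy_factor / compute_factor  # ~0.13
print(f"Energy per unit of compute: {energy_per_unit_compute:.2f}x "
      f"(~{(1 - energy_per_unit_compute) * 100:.0f}% lower)")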
Works Cited
Curry, David. "ChatGPT Revenue and Usage Statistics (2024)." Business of Apps, 13 Nov. 2024, www.businessofapps.com/data/chatgpt-statistics/.
"Chat GPT: What Is It?" University of Central Arkansas: UCA, 2023, uca.edu/cetal/chat-gpt/.
Brown, Sara. “Machine Learning, Explained.” MIT Sloan, 21 Apr. 2021, mitsloan.mit.edu/ideas-made-to-matter/machine-learning-explained.
Bailey, Brian. “AI Power Consumption Exploding.” Semiconductor Engineering, 15 Aug. 2022, semiengineering.com/ai-power-consumption-exploding/.
Vincent, James. "How Much Electricity Does AI Consume?" The Verge, 16 Feb. 2024, www.theverge.com/24066646/ai-electricity-energy-watts-generative-consumption.
Spencer, Thomas. "What the Data Centre and AI Boom Could Mean for the Energy Sector – Analysis." IEA, 18 Oct. 2024, www.iea.org/commentaries/what-the-data-centre-and-ai-boom-could-mean-for-the-energy-sector.
Evans, Richard, and Jim Gao. “DeepMind AI Reduces Google Data Centre Cooling Bill by 40%.” Google DeepMind, 20 July 2016, deepmind.google/discover/blog/deepmind-ai-reduces-google-data-centre-cooling-bill-by-40/.