AI Computing Is on Pace to Consume More Energy than India, Arm Says

by Godfrey Benjamin · 3 min read
Photo: Fortune Brainstorm TECH / Flickr

Arm sees AI as a key driver of its growth. The company stated in an earlier report that its AI proposition differs from other notable players.

Amid growing interest in Artificial Intelligence (AI), Rene Haas, the CEO of Arm Holdings Plc (NASDAQ: ARM), has raised concerns regarding energy consumption. The CEO projected that if the current trend continues, data centers worldwide will consume more electricity than India, the world’s most populous country, by 2030.

AI’s Energy Consumption Threat

Haas stated in an interview that AI has yet to reach its full potential, as it is still developing. Nonetheless, he believes that finding ways to reduce energy consumption is critical to ensuring AI can reach that potential while limiting its environmental footprint.

Computational power refers to a computer system’s ability to process and manipulate data. This capability is defined by more than raw speed alone: it encompasses processor speed, memory capacity, data throughput, and energy efficiency.
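To make that multidimensional idea concrete, here is a minimal sketch in Python, with entirely hypothetical figures rather than real product specifications, of how two systems might be compared on throughput per watt rather than clock speed alone.

```python
# Minimal sketch with hypothetical numbers: "computational power" is more
# than clock speed, so compare systems on throughput per watt as well.
from dataclasses import dataclass

@dataclass
class System:
    name: str
    throughput_tflops: float   # sustained floating-point throughput
    memory_gb: int             # memory capacity
    power_watts: float         # average power draw under load

    @property
    def efficiency(self) -> float:
        """Energy efficiency expressed as TFLOPS delivered per watt consumed."""
        return self.throughput_tflops / self.power_watts

# Hypothetical example systems, not figures for any real chip.
server_cpu = System("generic server CPU", throughput_tflops=2.0, memory_gb=512, power_watts=350)
efficiency_chip = System("efficiency-focused chip", throughput_tflops=1.6, memory_gb=256, power_watts=150)

for s in (server_cpu, efficiency_chip):
    print(f"{s.name}: {s.efficiency:.4f} TFLOPS/W")
```

On these made-up numbers, the slower chip still comes out ahead once power draw is factored in, which is the kind of trade-off the paragraph above describes.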

AI, in particular, has emerged as a key driving force behind the surge in computational power requirements. The complexity and size of AI models have been increasing, necessitating parallel advancements in computational power to enable these models to learn from vast datasets, make predictions, and perform tasks with increasing accuracy and efficiency.

According to Haas, the next crucial stage of AI development involves more extensive training, which requires software to process massive volumes of data. He emphasized that as AI systems become more advanced and require more training to enhance their capabilities, they will increasingly strain the existing energy capacity of data centers and other infrastructure.

Haas’ concerns about AI’s energy usage align with a growing awareness of technology’s environmental impact, particularly in data centers. AI’s rising energy demands may worsen existing challenges around energy use and sustainability.

However, he is also interested in the industry’s migration to Arm chip designs, which are becoming quite popular in data centers. The company’s technology, already widely used in smartphones, was designed to consume energy more efficiently than standard server processors.

Arm’s Interests in AI

Arm, which went public last year, sees AI as a key driver of its growth. The company stated in an earlier report that its AI proposition differs from other notable players like Nvidia Corp (NASDAQ: NVDA). While both businesses operate in overlapping fields, Arm declared at the time that its primary focus is on Central Processing Units (CPUs).

Arm is well-known for designing the blueprints, or “architectures,” of various semiconductors. These architectures cover the entire design, including the components and the instruction set, that other firms use to create chips.

Moreover, Microsoft Corp (NASDAQ: MSFT) and Alphabet Inc (NASDAQ: GOOG) employ Arm’s technology to build in-house processors for their server farms. As part of that transition, they are reducing their reliance on off-the-shelf components manufactured by Intel Corp (NASDAQ: INTC) and Advanced Micro Devices Inc (NASDAQ: AMD).

According to Haas, adopting more custom-built chips can help organizations address these challenges and save energy. Notably, he suggested that such an approach might cut data center power consumption by more than 15%.
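As a purely illustrative calculation, the facility figures below are hypothetical and only the 15% figure comes from the article, a saving of that size adds up quickly at data center scale.

```python
# Purely illustrative arithmetic: apply the article's ">15%" savings figure
# to a hypothetical data center's annual electricity use.
annual_consumption_gwh = 200.0   # hypothetical facility: 200 GWh per year
savings_fraction = 0.15          # lower bound of the savings cited by Haas

saved_gwh = annual_consumption_gwh * savings_fraction
remaining_gwh = annual_consumption_gwh - saved_gwh

print(f"Saved per year:   {saved_gwh:.1f} GWh")
print(f"New annual usage: {remaining_gwh:.1f} GWh")
```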
