Immediately Actionable News For Global Markets

New AI Technology Uses Whopping Amounts Of Power

Google announced this week it is well behind on a pledge to eliminate net carbon emissions by 2030.

Inputs that matter: Google said the jump in its emissions highlighted "the challenge of reducing emissions" while it invests in the build-out of large language models and their associated applications and infrastructure.

  • Google admitted that "the future environmental impact of AI" was "complex and difficult to predict."

  • Chief sustainability officer Kate Brandt said the company remained committed to the 2030 target but stressed the "extremely ambitious" nature of the goal.

  • "We still expect our emissions to continue to rise before dropping towards our goal," said Brandt.

  • CNBC reported that US electricity demand is forecast to grow by 20% by 2030, with AI data centers alone expected to add about 323 terawatt-hours to that demand (a rough sense of scale is sketched just after this list).
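
For a rough sense of scale, the minimal Python sketch below compares those figures. The ~4,000 TWh baseline for current annual US electricity consumption is our own approximation, not a number from the CNBC report; only the 20% growth and 323 TWh figures come from the item above.

    # Back-of-the-envelope context for the CNBC figures above.
    # Assumption (not from the article): current annual US electricity
    # consumption is roughly 4,000 TWh.
    US_CONSUMPTION_TWH = 4_000          # assumed current annual US demand
    GROWTH_BY_2030 = 0.20               # 20% forecast growth (from the item above)
    AI_DATA_CENTER_ADDITION_TWH = 323   # added AI data center demand (from the item above)

    total_growth_twh = US_CONSUMPTION_TWH * GROWTH_BY_2030
    ai_share_of_growth = AI_DATA_CENTER_ADDITION_TWH / total_growth_twh
    ai_share_of_current = AI_DATA_CENTER_ADDITION_TWH / US_CONSUMPTION_TWH

    print(f"Forecast growth by 2030: ~{total_growth_twh:.0f} TWh")
    print(f"AI data centers as a share of that growth: ~{ai_share_of_growth:.0%}")
    print(f"AI data centers as a share of today's demand: ~{ai_share_of_current:.0%}")

Under that assumed baseline, AI data centers would account for roughly 40% of the forecast growth and about 8% of today's demand.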

The opportunity: As tech giants including Google, Amazon, and Microsoft have outlined plans to invest tens of billions of dollars in AI, climate experts have raised concerns about the environmental impacts of power-intensive tools and systems.

  • Meanwhile, energy generation and transmission constraints are challenging companies seeking to develop new technology. 

  • Analysts at Bernstein said in June that AI would "double the rate of US electricity demand growth, and total consumption could outstrip current supply in the next two years."

  • These large-scale data centers run on GPUs that generate enormous amounts of heat. 

  • The water used to cool those chips is freshwater, drawn from the same reserves used for drinking water.

Zoom in: Google's data center electricity consumption increased 17% in 2023 and amounted to approximately 7-10% of global data center electricity consumption, the company estimated.

  • Google said that its data centers also consumed 17% more water in 2023 than in the previous year.

  • Microsoft reported in May that its total carbon emissions have increased by nearly 30% since 2020, primarily due to the construction of data centers.

  • ChatGPT alone is estimated to use half a million kilowatt-hours of electricity per day (put into household terms in the sketch just after this list).

  • Amazon has been "the world's largest corporate purchaser of renewable energy for four straight years."
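
To put that ChatGPT figure in household terms, here is a minimal sketch; the 29 kWh-per-day household average is our own approximation, not a number from the article.

    # Rough household comparison for the ChatGPT figure above.
    # Assumption (not from the article): an average US household uses
    # about 29 kWh of electricity per day (roughly 10,500 kWh per year).
    CHATGPT_KWH_PER_DAY = 500_000   # reported daily electricity use
    HOUSEHOLD_KWH_PER_DAY = 29      # assumed average US household usage

    equivalent_households = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_DAY
    print(f"Equivalent to ~{equivalent_households:,.0f} average US households per day")

Under that assumption, ChatGPT's daily draw works out to roughly 17,000 average US households.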

Between the lines: While renewables will likely play an essential role in meeting AI energy demands, analysts say immediate implementation is challenging.

  • The AI push comes amid mounting climate strain: more brownouts in Texas, California wildfires, stronger Gulf hurricanes, and 126-degree heat in Delhi this spring.

  • Data centers, which house computer systems and associated components, were already proliferating with the growth of the Internet and cloud computing.

  • As the value of cryptocurrency has multiplied, so has the number of data centers, such as those mining Bitcoin in cheap-energy havens like Plattsburgh, New York.

Follow the money: AI's voracious electricity consumption is driving an expansion of fossil fuel use, including delaying the retirement of some coal-fired plants.

  • However, AI is not just a problem but also a potential solution.

  • It is already being harnessed to make the power grid smarter and to speed up the development of new nuclear technologies.

Get your official Banananomics swag

The Banananomics official swag store is open, a place to buy Banananomics merchandise.

As always, we appreciate your support. International shipping is available.

Thank you for reading,

Todd Moses (CEO)