AI Law - International Review of Artificial Intelligence Law
G. Giappichelli Editore

24/06/2024 - Environmental Concerns of Generative AI's Energy and Water Usage (Italy)

Topic: Notizie/News - Environmental Law

As reported by QuiFinanza, generative AI models such as ChatGPT consume significant amounts of energy and water, raising environmental sustainability concerns. Research from the University of California, Riverside and the University of Texas at Arlington indicates that training GPT-3, the model behind ChatGPT, consumed about 700,000 liters of clean freshwater in U.S. data centers. Running these models also requires large amounts of energy, with the training of models like GPT-3 reportedly emitting 305% more CO2 than a flight from San Francisco to New York.

Water usage is classified into three scopes (see the sketch after this list):

  • Scope-1: Direct water use in cooling systems.
  • Scope-2: Offsite water use for electricity generation.
  • Scope-3: Water used in the production of AI hardware.
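To make the scope classification concrete, the following is a minimal sketch of how such water accounting could be approximated in code. All coefficients (on-site water usage effectiveness, off-site electricity water intensity, embodied manufacturing water) and the example energy figure are hypothetical placeholders, not values from the cited research.

```python
# Hypothetical sketch of scope-based water accounting for an AI training run.
# All coefficients are illustrative placeholders, not measured values.

def water_footprint_liters(
    energy_kwh: float,              # electricity consumed by the training run (assumed)
    wue_onsite: float = 1.8,        # Scope-1: liters evaporated per kWh for cooling (assumed)
    ewif_offsite: float = 3.1,      # Scope-2: liters used per kWh of grid electricity (assumed)
    hardware_water_l: float = 0.0,  # Scope-3: manufacturing water attributed to the run (assumed)
) -> dict:
    """Return a per-scope breakdown of water use for one workload."""
    scope1 = energy_kwh * wue_onsite    # direct cooling water at the data center
    scope2 = energy_kwh * ewif_offsite  # water consumed offsite to generate the electricity
    scope3 = hardware_water_l           # share of hardware-production water
    return {"scope1_l": scope1, "scope2_l": scope2, "scope3_l": scope3,
            "total_l": scope1 + scope2 + scope3}


if __name__ == "__main__":
    # Example: a made-up 100,000 kWh training job.
    breakdown = water_footprint_liters(energy_kwh=100_000)
    for scope, liters in breakdown.items():
        print(f"{scope}: {liters:,.0f}")
```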

Electricity consumption is also high: data centers account for roughly 1-1.5% of global electricity use. The environmental impact includes substantial CO2 emissions; training GPT-3, for instance, emits more CO2 than an average American produces in a year.
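As a rough illustration of how such emissions figures are typically derived, operational CO2 can be estimated as IT energy use multiplied by a data-center overhead factor (PUE) and the carbon intensity of the local grid. The numbers below are assumptions chosen for illustration, not data from the article.

```python
# Back-of-the-envelope estimate of operational CO2 for a training run.
# Energy, PUE, and grid carbon intensity are assumed example values.

def operational_co2_tonnes(it_energy_kwh: float,
                           pue: float = 1.2,                # data-center overhead factor (assumed)
                           grid_kgco2_per_kwh: float = 0.4  # grid carbon intensity (assumed)
                           ) -> float:
    """CO2e in tonnes = IT energy * PUE * grid intensity, converted from kg to tonnes."""
    total_kwh = it_energy_kwh * pue
    return total_kwh * grid_kgco2_per_kwh / 1000.0


if __name__ == "__main__":
    # Hypothetical 1,000,000 kWh training run.
    print(f"{operational_co2_tonnes(1_000_000):.0f} tCO2e")
```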

Big tech companies are beginning to address these issues. Microsoft plans significant investments in renewable energy by 2030, and Google aims to run entirely on carbon-free energy by the same year. Amazon, already a major purchaser of renewable energy, still faces challenges in reducing its overall impact.

Strategies for reducing AI's environmental footprint include model compression, selective data training, environmental impact estimation, and the use of renewable energy sources. Life cycle analysis (LCA) methodology helps assess the full environmental impact of an AI system from design to deployment.
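As one concrete example of the mitigation techniques listed above, model compression can be as simple as post-training quantization. The sketch below applies PyTorch's dynamic quantization to a toy network to show the idea; the model is a made-up placeholder, and real savings depend on the actual architecture.

```python
# Minimal model-compression sketch: post-training dynamic quantization in PyTorch.
# The toy model is a placeholder; real savings depend on the actual architecture.
import io

import torch
import torch.nn as nn


def state_dict_size_bytes(model: nn.Module) -> int:
    """Serialize the model's weights in memory and measure the byte count."""
    buffer = io.BytesIO()
    torch.save(model.state_dict(), buffer)
    return buffer.getbuffer().nbytes


if __name__ == "__main__":
    # A made-up fully connected model standing in for a much larger network.
    model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10))

    # Convert Linear layers to dynamic quantization (weights stored as int8).
    quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

    print(f"float32 weights: {state_dict_size_bytes(model):,} bytes")
    print(f"int8 weights:    {state_dict_size_bytes(quantized):,} bytes")
```

Smaller weights mean less memory traffic and lower energy per inference, which is the link between compression and the footprint concerns discussed above.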