
As enterprise AI adoption soars, so do the energy and resource demands of AI systems. Every impressive demo of a large language model hides a hefty carbon footprint. Training GPT-3 (175 billion parameters) consumed an estimated 1,287 MWh of electricity and emitted 502 metric tons of CO2, roughly the annual emissions of 110 gasoline-powered cars. And it's not just electricity: the data centers that run AI workloads also consume vast amounts of water for cooling.