
Every time you ask ChatGPT a question, binge-watch Netflix or store files in Google Drive, somewhere a data centre lights up. In today’s world, where data is the new oil, data centres have become the unseen processors of our digital lives. Data centres are the home of AI: the nurseries where machines are trained, nurtured and shaped, almost like children. Just as a child evolves through care and guidance, AI evolves through data, algorithms and constant human input. But behind this glowing digital cradle lies a quiet crisis, one powered by relentless energy use, environmental costs and ethical grey zones.

It was a seemingly ordinary afternoon in the Iberian Peninsula on 28 April 2025 when a widespread power outage swept across the region, leaving millions in the dark, businesses shuttered, travellers stranded and communication lines disrupted. Official incident reports are yet to be completed, but the blackout has set the stage for a larger discussion: as AI and data centres expand rapidly, there is a rising debate over which fuels can sustainably meet this demand. Though the Iberian blackout wasn’t driven by AI, it exposed just how fragile our power systems have become under growing pressure. And AI is only adding to that load.
The Need for Power
The AI-dedicated data centre is an emerging class of infrastructure: still few in number, but customised for the particular properties of AI workloads. These facilities require a high absolute power supply and far greater power density, along with supplementary hardware. According to Goldman Sachs Research, given the higher processing workloads demanded by AI, power density in data centres is likely to grow from 162 kW to 176 kW per square foot by 2027, excluding overheads such as cooling and other functions related to data centre infrastructure. The same research projects that data centre power demand will reach 84 GW by 2027, with AI growing to 27% of the overall market. The growth of AI is driving the integration of high-performance servers, which in turn has increased power density within data centres. With future electricity demand remaining uncertain, scenario-based planning is essential to satiate this growing appetite for power.
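To make the Goldman Sachs projections concrete, the short sketch below is a back-of-envelope calculation using only the figures quoted above (84 GW of demand, 27% AI share); it is illustrative arithmetic, not part of the original research:

```python
# Back-of-envelope sketch using the Goldman Sachs figures quoted above.
total_demand_gw = 84    # projected data centre power demand by 2027
ai_share = 0.27         # AI's projected share of the overall market

ai_demand_gw = total_demand_gw * ai_share
print(f"Implied AI-driven demand by 2027: {ai_demand_gw:.1f} GW")
```

On these numbers, AI alone would account for roughly 22.7 GW of continuous demand, comparable to the output of more than twenty large power plants.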
Moreover, today’s power grids are significantly more sophisticated than their early counterparts, integrating thousands of micro-grids, storage systems, power plants, renewables and millions of endpoints, since much of contemporary life, data centres included, runs on electricity. This intricate system relies on a fast, reliable and centralised communication network. But data centres don’t just depend on this web of connections; they host and process the real-time data that grid operators rely on to keep electricity supply and demand in balance, second by second. Deloitte highlights in its report that AI-powered data centres must operate 24/7 on high-power GPUs, pushing local grids beyond their limits. AI data centres, then, are not just energy consumers; they are power-hungry engines that exert disproportionate pressure on local electricity grids. This growing dependence risks destabilising local grids, diverting energy away from nearby communities and exposing critical vulnerabilities.

Environmental Concerns
The rapid expansion of AI data centres is sounding environmental alarms, especially around immense water consumption. These centres, often located in drought-prone or water-stressed regions, require vast amounts of water daily to run the evaporative cooling systems that prevent servers from overheating. According to the International Energy Agency (IEA), a single 100-megawatt data centre can consume up to 2 million litres of water per day, an amount equivalent to the daily usage of 6,500 households. Alarmingly, over two-thirds of data centres constructed since 2022 are located in areas already grappling with water scarcity. As AI’s rise continues unabated, freshwater scarcity is emerging as a major concern: the global water footprint of data centres has now crossed 560 billion litres annually, a figure projected to double by 2030. This growing tension between technological advancement and environmental sustainability underscores a harsh reality.
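The IEA figures quoted above can be sanity-checked with simple arithmetic. The sketch below works only with the numbers already cited (2 million litres per day, 6,500 households, a 560-billion-litre annual footprint doubling by 2030) and is purely illustrative:

```python
# Sanity check on the IEA figures quoted above (illustrative only).
litres_per_day = 2_000_000   # one 100 MW data centre's daily water use
households = 6_500           # stated household equivalent

# Implied per-household daily water use, in litres.
implied_household_use = litres_per_day / households

annual_footprint_bl = 560               # global footprint, billion litres/year
projected_2030_bl = annual_footprint_bl * 2  # "projected to double by 2030"

print(f"Implied household use: {implied_household_use:.0f} litres/day")
print(f"Projected 2030 footprint: {projected_2030_bl} billion litres")
```

The implied figure of roughly 308 litres per household per day is in line with typical domestic consumption, which suggests the comparison is internally consistent.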

If AI is the future, then OpenAI’s Stargate is its nerve centre: a mega-mart of computation sprawling across continents. The infrastructure needed to support AI’s next leap is stretching power grids, driving up carbon emissions and demanding resources at an unprecedented scale. And while investments are being made in renewable energy, the race between AI expansion and sustainable infrastructure is far from balanced. These mega-marts are both a technological marvel and an environmental warning. But the question remains: as we build machines that can think, can we also build systems smart enough to power them sustainably?

Conclusion
To sum up, as AI technologies advance, their unseen footprint becomes more visible: not in lines of code, but in megawatts consumed and infrastructure stretched to its limit. With Goldman Sachs estimating a 165% rise in data centre power demand by 2030, and the IEA forecasting that data centre energy consumption will roughly double to 945 TWh, the challenge ahead is immense. If these developments are not met with parallel investment in renewable energy and sustainable design, we risk trading digital innovation for environmental regression. The pressure AI puts on infrastructure is real, but it also presents an opportunity to rethink how we build, power and sustain our digital future.
Written by – Janhavi Dubey
Edited by – Neelambika Kumari Devi
The post The Growing Appetite of Data Centres appeared first on The Economic Transcript.