Every Earth Day, the tech industry tells a story it has learned to tell very well, and it goes something like this: AI will optimise energy grids, machine learning will accelerate the discovery of new battery materials, and data-driven systems will make supply chains leaner, buildings smarter and transport cleaner. The World Economic Forum has estimated that AI tools could generate up to $2 trillion in energy savings by 2030. On paper, tech looks like the solution.
Another narrative exists alongside it: Google, Microsoft and Amazon have published their most ambitious net-zero targets to date while simultaneously signing agreements for new gas-fired power capacity to keep AI data centres running. Google’s greenhouse gas emissions rose 48% between 2019 and 2024, with Microsoft’s growing 30% in the same period. The data centres powering large language models are projected to consume more electricity than some entire countries by 2027. The discrepancy between the pledge and the power bill has never been larger.
This is the tougher way to look at it: the industry that is loudest about saving the planet is also the one building the infrastructure most likely to stress it. The energy demand of AI is structural and compounding – every new model is larger than the last, every new application requires more compute, and the efficiency gains that AI proponents cite as justification have so far been outpaced by the growth in usage.
Economists call this the rebound effect, and climate researchers have documented it in detail.
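The arithmetic behind the rebound effect is simple to sketch: if energy use per query falls each year but the number of queries grows faster, total consumption still rises. The function and growth rates below are purely illustrative assumptions, not figures from this article.

```python
def total_energy(base_energy: float, efficiency_gain: float,
                 usage_growth: float, years: int) -> float:
    """Total energy use after `years`, with per-unit energy falling by
    `efficiency_gain` per year and usage growing by `usage_growth` per year.
    Hypothetical model for illustration only."""
    per_unit = (1 - efficiency_gain) ** years  # energy per query shrinks
    units = (1 + usage_growth) ** years        # number of queries grows
    return base_energy * per_unit * units

# Illustrative numbers: a 15% annual efficiency gain is swamped by
# 35% annual usage growth, so energy use roughly doubles in five years.
print(total_energy(100.0, 0.15, 0.35, 5))  # ~199: net consumption rises
```

Reversing the assumed rates (efficiency improving faster than usage grows) is the only case in which the total falls, which is the crux of the dispute over whether AI's efficiency gains can outrun its own demand.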
The Net-Zero Pledge Problem
Part of what makes this difficult to assess is that the accounting is opaque.
Big Tech companies report Scope 1 and 2 emissions – direct emissions and purchased energy – credibly enough, but Scope 3, the emissions embedded in supply chains, hardware manufacturing and the downstream effects of the products themselves, tells the real story. Very few companies provide Scope 3 numbers that hold up to independent scrutiny, and the ones that do tend to rely on carbon offsets whose integrity has become a matter of serious dispute.
The renewable energy certificates behind most tech companies’ carbon neutral claims are financial instruments matched to renewable generation somewhere on the grid, not electrons flowing directly from a solar farm to a server rack.
The 2030 net-zero targets that several major tech companies have published are, according to independent analysis, verging on fantasy given current AI growth trajectories.
What Would Actually Have To Change
The unsugarcoated answer from climate experts is that voluntary commitments have reached the limit of what they can deliver.
Without mandatory Scope 3 disclosure, binding energy efficiency standards for data centres and hard emissions caps, the pledge cycle will continue regardless of what happens to actual emissions. Tariq Fancy, the former Chief Investment Officer for Sustainable Investing at BlackRock, has argued that ESG frameworks function as a placebo: they absorb public concern and delay the structural regulation that would actually alter corporate behaviour.
There are genuine optimists in this debate: Stanford clean energy researcher Yi Cui points to AI’s potential to accelerate materials discovery and improve battery chemistry. But the question, as Hugging Face AI and Climate Lead Sasha Luccioni has framed it, is whether the models doing that work are being built with energy efficiency as a design constraint or as an afterthought.
We put those questions to some of the industry’s leading voices on climate, sustainability and ESG to find out where they actually stand.
Our Experts:
- Brad Johnson, Director, Industry Executive for Electric Utilities, Bentley Systems
- Jason Beckett, Head of Architecture, Hitachi Vantara
- George Mazzella, Vice President of Marketing, GreenFi
- Dr Andy MacInnes, Chief Development Officer, Paragraf
- Kate Steele, Director, EMEA HPC/AI, Lenovo Infrastructure Solutions Group
Brad Johnson, Director, Industry Executive for Electric Utilities, Bentley Systems
“Earth Day 2026 has a theme that cuts straight to the heart of data centre development: Our Power, Our Planet. The data centre industry now consumes power at a scale that strains grid infrastructure designed for a different era. When a gigawatt facility trips offline due to a software misconfiguration, the effect on the surrounding grid mirrors stretching the world’s longest slinky to its absolute limit and releasing it. The 2025 Iberian outage showed how complex dynamic events cascade through interconnected systems. Multiply that risk across hundreds of gigawatt-scale facilities deploying globally, and grid resilience stops being a utility problem and becomes a developer responsibility.
“Smart developers are already recognising that real engineering value lives beyond the fence line. Siting decisions, grid interconnection, renewable energy access, road networks, water supply and substation constraints determine whether a billion-dollar investment performs or fails. Communities hold equal power in that equation. From Wisconsin to upstate New York, projects are stalling because developers skipped the social licence conversation. Microsoft’s local grid stewardship and Google’s Frankfurt campus delivering free district heating while consuming zero water represent the new baseline for planning approval.
“One hundred and forty years of grid evolution have generated hard lessons about what works and what fails under pressure. Transit-oriented development proved that concentrating facilities around high-connectivity assets reduces cost, carbon and deployment time. Data centre developers need to apply identical logic to energy corridors, treat internal right-of-way infrastructure with municipal-grade rigour, and plan for subsurface complexity before construction begins rather than after the first operational crisis. The engineering intelligence to get this right already exists, and the industry needs to use it.”
Jason Beckett, Head of Architecture, Hitachi Vantara
“The uncomfortable truth is that the technology sector is enabling real progress in areas like energy optimisation, climate modelling and resource efficiency, but it is also driving a surge in energy demand that risks outpacing those gains. AI is a clear example of this. Data centre electricity consumption is already projected to reach nation-scale levels in the next few years, and the infrastructure required to support AI workloads is accelerating that trend.
“In our most recent sustainability report, we saw clear evidence that infrastructure design can materially shift this balance, with organisations achieving double-digit reductions in energy consumption alongside improvements in performance and cost. But those gains are not yet keeping pace with the scale of demand.
“That does not make this greenwashing, but it does expose a gap between ambition and execution. The next phase of AI will not be defined by capability alone, but by how efficiently it can be deployed. That means designing around energy constraints, not just performance, and being far more transparent about the trade-offs involved.”
George Mazzella, Vice President of Marketing, GreenFi
“The commitments are real in the sense that these companies are spending billions on renewable energy procurement. But the net effect is negative because the load growth is more than the clean energy additions: Google’s own reported water consumption rose from 4.3 billion gallons in 2021 to more than 6.1 billion in 2024. A single AI-focused hyperscale data centre can consume as much electricity as 100,000 homes. US electricity demand was flat for two decades and is now growing at nearly 2% annually, driven largely by data centre construction.
“The sustainability reports look good in isolation, but the actual infrastructure footprint is growing far larger than the offsets. The second and third-order effects are not factored in: data centres exacerbate the heat island effect, cause ecosystem stress and biodiversity damage by reducing river flows, encourage further industrial build-out nearby, and generate upstream supply chain impacts through rare earth mining, manufacturing emissions and e-waste that are less visible and usually not monitored. There is also a structural lock-in dynamic: cheap and abundant compute, because of deferred environmental costs, means growth in AI-related fields, which then drives demand for more data centres rather than fewer.
“The communities absorbing the environmental cost of this buildout are not the ones benefiting from it. You do not see new data centre developments in DC or Long Island, because those communities are wealthy, politically organised and aggressive about land use. They go where land is cheap and political resistance is low. The entities driving this infrastructure buildout should pay rates that reflect the full cost they cause to the system, not discounted economic development tariffs that shift the cost to residential customers. Right now it is the opposite. Residential customers pay more per kilowatt-hour than large commercial users, and then pay again through rate increases to fund infrastructure those commercial users required. That is a double extraction, and it needs to change.”
Dr Andy MacInnes, Chief Development Officer, Paragraf
“We are entering a phase where technological progress is no longer constrained by what software can do, but by the physical efficiency of the hardware it runs on. From data centres and communications networks to healthcare technologies and electrified transport, modern infrastructure depends on billions of electronic components operating continuously.
“Healthcare alone is responsible for around 4 to 5% of global carbon emissions, equivalent to the world’s fifth-largest emitter if it were a country, and relies on energy-intensive imaging, diagnostics and continuous monitoring systems. At the same time, the digital infrastructure behind AI and cloud computing is placing growing pressure on energy grids. Data centres in countries such as Ireland account for roughly 20% of national power demand, reflecting how concentrated and fast-growing this energy use has become.
“While we can generate more power and expand grid capacity, we must also address efficiency at the source. Gains at the component level can translate into significant system-wide impact when multiplied across billions of devices. This is where new materials play a critical role. By enabling more sensitive, efficient and lower-power electronic components, advances in materials science are helping to reduce energy consumption while unlocking entirely new technological capabilities. In this sense, materials are no longer just a scientific concern; they are becoming a necessity.”
Kate Steele, Director, EMEA HPC/AI, Lenovo Infrastructure Solutions Group
“Data centres sit at the heart of this conversation, especially as AI adoption continues to accelerate. With 93% of IT leaders in EMEA planning to increase AI investments in the next 12 months, the scale and intensity of AI-driven workloads are set to rise significantly. As enterprises expand these workloads, they are confronting escalating energy demand alongside tightening sustainability mandates.
“The challenge is not simply how to consume less, but how to do more with smarter, cleaner systems. The narrative is shifting from resource intensity to resource intelligence. With the right design choices, data centre providers can accelerate the transition to renewable energy, reduce environmental impact, and even create value for local communities.
“The focus must move beyond minimising harm to actively redesigning infrastructure, embracing renewable energy, enabling circular systems and transforming excess into opportunity. Because the future of digital growth depends not just on innovation, but on how we collectively choose to power it.”