As global leaders gather in New York this week for the 80th session of the United Nations General Assembly (UNGA), one of the pressing topics on the agenda is the ethical and sustainable use of artificial intelligence.
Nearly three years after generative AI made its explosive debut in late 2022, concerns around its massive resource demands continue to dominate discussions across both public and private sectors.
While AI workloads have grown more efficient, experts warn that much work remains. Much of the debate to date has centered on “green” data centers: facilities powered by renewable energy and built around optimized cooling systems and more efficient designs.
However, analysts emphasize that this is only part of the solution. To build truly sustainable AI ecosystems, organizations must take a broader approach that goes beyond infrastructure, incorporating strategies such as data efficiency, software optimization, and maximized use of low-carbon energy.
Modern data centers have indeed taken critical steps forward, with innovations in liquid cooling, power conversion, and rack-level optimization helping to support the soaring compute requirements of AI. But even the most advanced facilities face significant challenges. Studies suggest that AI inferencing alone—the process of running trained models to generate predictions or responses—could consume up to 20% of global energy by 2030. Experts caution that this isn’t just a technology issue, but a looming business risk as well.
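To make the scale of the inference problem concrete, the short calculation below shows how per-query energy multiplies across a large user base. Every figure in it is a placeholder assumption chosen purely for illustration, not a measurement from any model, provider, or the studies referenced above.

```python
# Back-of-the-envelope illustration of why inference energy adds up at scale.
# All figures are placeholder assumptions for illustration only.
ENERGY_PER_QUERY_WH = 0.3        # assumed energy per model response, in watt-hours
QUERIES_PER_DAY = 1_000_000_000  # assumed daily query volume for a large service

daily_energy_mwh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh
annual_energy_gwh = daily_energy_mwh * 365 / 1_000                    # MWh -> GWh

print(f"Daily inference energy:  {daily_energy_mwh:,.0f} MWh")
print(f"Annual inference energy: {annual_energy_gwh:,.0f} GWh")
```

Even with modest per-query figures, the annual total lands in the gigawatt-hour range, which is why inference, not just training, sits at the center of the sustainability debate.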
Industry voices argue that focusing only on infrastructure is insufficient. The sustainability challenge spans the entire AI lifecycle—from data collection and model training to deployment and monitoring.
A parallel has been drawn to personal health: improving your workout routine while keeping an unhealthy diet may bring some results but won’t deliver the full benefit. Similarly, good AI requires good data. A strong data strategy not only produces better model outcomes but also dramatically improves energy efficiency.
Yet, in the rush to deploy AI, many organizations skip this step. The result is models trained on poor-quality, irrelevant, or unstructured data—leading to underperformance and unnecessary resource consumption.
Analysts recommend a four-step framework (Collect, Curate, Clean, and Confirm) to ensure data relevance and quality. Properly curated datasets can reduce redundancy, mitigate bias, improve accuracy, and significantly cut energy costs during both training and inferencing.
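The sketch below shows one way those four steps could be expressed as a simple data pipeline in Python. The file paths, column names, and thresholds are hypothetical placeholders, and the functions are illustrative rather than part of any vendor's tooling.

```python
# A minimal, illustrative sketch of a Collect -> Curate -> Clean -> Confirm
# pipeline using pandas. Column names, file paths, and thresholds are
# hypothetical examples, not part of any specific framework or product.
import pandas as pd


def collect(paths: list[str]) -> pd.DataFrame:
    """Collect: gather raw records from multiple CSV sources into one frame."""
    frames = [pd.read_csv(p) for p in paths]
    return pd.concat(frames, ignore_index=True)


def curate(df: pd.DataFrame, relevant_columns: list[str]) -> pd.DataFrame:
    """Curate: keep only the columns relevant to the modelling task."""
    return df[relevant_columns]


def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Clean: drop exact duplicates and rows with missing values."""
    return df.drop_duplicates().dropna()


def confirm(df: pd.DataFrame, min_rows: int = 1000) -> pd.DataFrame:
    """Confirm: run basic sanity checks before the data is used for training."""
    if len(df) < min_rows:
        raise ValueError(f"Expected at least {min_rows} rows, got {len(df)}")
    if df.isna().any().any():
        raise ValueError("Unexpected missing values after cleaning")
    return df


if __name__ == "__main__":
    # Hypothetical usage: file paths and column names are placeholders.
    raw = collect(["interactions_2024.csv", "interactions_2025.csv"])
    dataset = confirm(clean(curate(raw, ["user_id", "prompt", "label"])))
    print(f"Curated dataset ready: {len(dataset)} rows")
```

Running a pipeline like this before training means the model sees fewer redundant or malformed examples, which is where the accuracy and energy savings described above come from.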
As world leaders debate AI’s future at UNGA, one message is clear: achieving sustainable AI will require systemic changes across infrastructure, data practices, and governance—not just greener servers.