Reen Singh is an engineer and a technologist with a diverse background spanning software, hardware, aerospace, defense, and cybersecurity. As CTO at Uvation, he leverages his extensive experience to lead the company’s technological innovation and development.
The carbon footprint of GPUs is shaped by several factors that go beyond direct electricity consumption. The primary contributors include the power the GPUs themselves draw under load, the cooling infrastructure required to dissipate the heat they generate, and the carbon intensity of the electricity that powers the data centre.
The NVIDIA H100 Tensor Core GPU was designed with features that improve its performance-per-watt and reduce its carbon impact. Its efficiency comes from architectural advances in the Hopper design, including fourth-generation Tensor Cores and FP8 precision via the Transformer Engine, which deliver more useful work per joule consumed.
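A practical way to ground performance-per-watt claims is to measure what a GPU actually draws under your own workload. Below is a minimal sketch using the pynvml bindings to NVIDIA's NVML; it assumes an NVIDIA driver is installed, and the device index and sampling interval are illustrative choices, not figures from the article:

```python
import time
import pynvml

# Poll instantaneous GPU power draw via NVML (requires an NVIDIA driver).
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # device index 0 is an assumption

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older pynvml versions return bytes
    name = name.decode()

limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000.0  # mW -> W
print(f"{name}: power limit {limit_w:.0f} W")

samples = []
for _ in range(10):  # ten one-second samples; adjust to the length of your workload
    samples.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)
    time.sleep(1)

print(f"average draw over {len(samples)} s: {sum(samples) / len(samples):.1f} W")
pynvml.nvmlShutdown()
```

Dividing sustained throughput (for example, samples or tokens processed per second) by the averaged wattage gives a realized performance-per-watt figure for a specific workload.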
Training large-scale AI models, such as language models with hundreds of billions of parameters, requires immense computational power and generates a substantial carbon footprint. Studies have shown that the process can emit hundreds of tons of CO₂, an amount comparable to the annual carbon footprint of hundreds of cars. This high level of emissions is due to the need for thousands of GPUs to run uninterrupted for weeks or even months, with each GPU consuming between 300 and 700 watts under full load.
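To make those orders of magnitude concrete, here is a rough back-of-the-envelope estimator in Python. Every input (GPU count, average draw, run length, PUE, and grid carbon intensity) is an illustrative assumption rather than a figure from any specific training run:

```python
def training_emissions_tco2(num_gpus: int,
                            avg_power_w: float,
                            hours: float,
                            pue: float = 1.3,
                            grid_kgco2_per_kwh: float = 0.4) -> float:
    """Rough training-run emissions in tonnes of CO2.

    energy = GPUs * average draw * time, scaled by PUE for facility overhead
    carbon = energy * grid carbon intensity
    """
    gpu_energy_kwh = num_gpus * avg_power_w * hours / 1000.0
    facility_energy_kwh = gpu_energy_kwh * pue
    return facility_energy_kwh * grid_kgco2_per_kwh / 1000.0


# Example: 2,000 GPUs averaging 500 W each, running continuously for 60 days.
print(f"{training_emissions_tco2(2000, 500, 60 * 24):.0f} tCO2")
```

With these placeholder numbers the estimate lands in the hundreds of tonnes, consistent with the studies cited above; in practice the result swings widely with the grid mix and with how efficiently the facility is cooled.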
There is also a notable distinction between the carbon footprint of academic research and that of enterprise-scale deployments: academic experiments typically run on smaller clusters for limited periods, whereas enterprise systems train and serve models continuously at far greater scale, so their cumulative emissions are much higher.
Data centres and their cooling systems play a critical role in the overall carbon footprint of GPU-based workloads. High-performance GPUs generate significant heat, and cooling systems can account for as much as 40% of a data centre’s total energy consumption. To manage this, operators are adopting advanced techniques like liquid and immersion cooling, which are more efficient than traditional air cooling. Major cloud providers such as AWS, Google, and Microsoft are investing in these advanced cooling solutions and are committed to powering their data centres with renewable energy to reduce their carbon footprint.
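The cooling share feeds directly into a facility's power usage effectiveness (PUE), and through it into the final emissions figure. The sketch below compares the same GPU workload under two cooling regimes; the PUE values are illustrative assumptions chosen to reflect the roughly 40% cooling overhead mentioned above, not measurements from any specific provider:

```python
GPU_ENERGY_KWH = 1_440_000    # IT (GPU) energy for a hypothetical training run
GRID_KGCO2_PER_KWH = 0.4      # assumed grid carbon intensity

# Illustrative PUE values: ~1.6 means cooling and other overhead consume a large
# share of facility energy (in line with the ~40% figure above), while liquid or
# immersion cooling can push PUE much closer to 1.0.
scenarios = {"traditional air cooling": 1.6, "liquid/immersion cooling": 1.1}

for label, pue in scenarios.items():
    total_kwh = GPU_ENERGY_KWH * pue
    tco2 = total_kwh * GRID_KGCO2_PER_KWH / 1000.0
    overhead_pct = (pue - 1.0) / pue * 100.0
    print(f"{label}: PUE {pue:.1f}, "
          f"~{overhead_pct:.0f}% of facility energy is overhead, {tco2:.0f} tCO2")
```

The gap between the two scenarios shows why cooling strategy and renewable sourcing, not just GPU selection, determine the overall footprint of a deployment.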