Chapter 1: Green AI Index

This white paper provides an in-depth framework for evaluating the environmental impacts of artificial intelligence (AI) and data centers using the Green AI Index, a comprehensive assessment tool focused on environmental metrics such as energy consumption, carbon emissions, and water use. It offers a structured method for understanding and mitigating the environmental costs of developing, deploying, and operating AI systems. The framework combines real-time energy monitoring with lifecycle analysis to give a holistic view of the environmental footprint of AI and data centers.

Data Centers

Scope 1: Direct Carbon Emissions

Scope 1 covers the direct emissions that result from activities controlled by the data center itself. These emissions primarily originate from the combustion of fuels in generators and the leakage of refrigerants used in cooling systems. For example, diesel or natural gas generators, often deployed as backup systems in data centers, produce carbon dioxide and other greenhouse gases when operating. Refrigerants, while essential for cooling IT equipment, can leak over time and contribute significantly to greenhouse gas emissions due to their high global warming potential (GWP). Calculating Scope 1 emissions involves measuring the amount of fuel consumed and the quantity and type of refrigerants leaked, then applying the appropriate emission factors and GWP values to determine the overall impact.
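The calculation described above can be sketched as follows. This is a minimal illustration, assuming a diesel combustion factor of roughly 2.68 kg CO2 per litre and the refrigerant R-410A with a 100-year GWP of about 2088; real assessments should use the emission factors and GWP values applicable to their fuels, refrigerants, and reporting standard.

```python
# Scope 1 sketch: fuel combustion plus refrigerant leakage, in tonnes CO2e.
# Both factors below are illustrative assumptions, not authoritative values.

DIESEL_KG_CO2_PER_LITRE = 2.68   # assumed combustion factor for diesel fuel
R410A_GWP = 2088                 # assumed 100-year GWP for refrigerant R-410A

def scope1_emissions_tco2e(diesel_litres: float, refrigerant_leak_kg: float) -> float:
    """Direct emissions from backup generators and refrigerant leakage."""
    fuel_kg = diesel_litres * DIESEL_KG_CO2_PER_LITRE       # kg CO2
    refrigerant_kg = refrigerant_leak_kg * R410A_GWP        # kg CO2e
    return (fuel_kg + refrigerant_kg) / 1000.0              # tonnes CO2e

# e.g. 10,000 litres of backup diesel burned and 5 kg of refrigerant leaked
print(scope1_emissions_tco2e(10_000, 5.0))
```

Note how even a few kilograms of leaked refrigerant can rival thousands of litres of burned diesel, which is why the GWP term matters.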

Scope 2: Indirect Carbon Emissions from Purchased Energy

Scope 2 accounts for the indirect emissions associated with the electricity, heating, or cooling purchased by data centers from external suppliers. These emissions arise from the generation of the energy consumed by data centers, which can vary based on the energy mix (renewable vs. fossil fuel-based) in the region where the data center is located. To calculate Scope 2 emissions, data centers measure their energy consumption and apply the regional or national emission factors for electricity production. One common metric for evaluating a data center’s energy efficiency is Power Usage Effectiveness (PUE), which compares the total energy used by the facility to the energy used by the IT equipment.
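The Scope 2 calculation and the PUE metric mentioned above can be expressed directly. The grid emission factor of 0.4 kg CO2 per kWh below is an illustrative assumption; a location-based assessment would substitute the factor published for the data center's regional grid.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy (>= 1.0)."""
    return total_facility_kwh / it_equipment_kwh

def scope2_emissions_tco2e(total_facility_kwh: float, grid_kg_co2_per_kwh: float) -> float:
    """Location-based Scope 2: purchased electricity times a regional emission factor."""
    return total_facility_kwh * grid_kg_co2_per_kwh / 1000.0

facility_kwh, it_kwh = 1_500_000, 1_000_000
print(pue(facility_kwh, it_kwh))                   # 1.5
print(scope2_emissions_tco2e(facility_kwh, 0.4))   # 600.0 tonnes CO2e
```

A PUE of 1.5 means the facility draws 50% more energy than its IT equipment alone; lowering PUE reduces Scope 2 emissions for the same computing workload.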

Scope 3: Other Indirect Carbon Emissions

Scope 3 involves all other indirect emissions not covered under Scope 1 and Scope 2, encompassing emissions throughout the value chain. This includes emissions from the manufacturing and transportation of IT equipment, employee commuting, waste disposal, and the supply chain. The production of hardware like servers, cooling equipment, and other infrastructure materials generates significant emissions due to the energy-intensive nature of manufacturing and logistics. Lifecycle assessments (LCAs) are often used to calculate these emissions by assessing the cradle-to-grave environmental impact of the hardware used in data centers.

Scope 1: Direct Water Use

Scope 1 direct water use pertains to the water consumed on-site for data center operations, primarily in cooling systems. Data centers use various cooling technologies, including evaporative cooling and adiabatic systems, that consume water directly to maintain the operational temperature of IT infrastructure. Cooling towers, for instance, rely on significant amounts of water, especially in regions with high ambient temperatures. Measuring water use for these purposes allows data centers to assess their direct water consumption footprint and implement measures to reduce water waste, such as using closed-loop cooling systems.
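A first-order estimate of cooling-tower water use can be derived from the heat load, as sketched below. The approach assumes evaporation is roughly the rejected heat divided by water's latent heat of vaporization, adds blowdown based on an assumed cycles-of-concentration value, and neglects drift losses; actual consumption depends on climate, water chemistry, and system design.

```python
LATENT_HEAT_KJ_PER_KG = 2260.0   # approximate latent heat of vaporization of water

def cooling_tower_makeup_litres(heat_rejected_kwh: float,
                                cycles_of_concentration: float = 4.0) -> float:
    """Rough make-up water estimate for an evaporative cooling tower.

    Evaporation ~ heat load / latent heat; blowdown = evaporation / (CoC - 1).
    Drift losses are neglected, and 1 kg of water is taken as 1 litre.
    """
    evaporation_kg = heat_rejected_kwh * 3600.0 / LATENT_HEAT_KJ_PER_KG
    blowdown_kg = evaporation_kg / (cycles_of_concentration - 1.0)
    return evaporation_kg + blowdown_kg

# e.g. rejecting 1,000 kWh of heat consumes on the order of 2,000 litres
print(cooling_tower_makeup_litres(1000))
```

The estimate of roughly 1.5-2 litres per kWh of rejected heat is why closed-loop systems, which avoid evaporative losses, can cut direct water use substantially.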

Scope 2: Indirect Water Use Associated with Electricity

Scope 2 water use refers to the water consumed indirectly through the electricity purchased by the data center. The water footprint of electricity generation varies depending on the type of power source used. Thermal power plants, for example, consume vast amounts of water for cooling purposes, while renewable energy sources like wind or solar have much lower water usage. Data centers connected to grids with a higher percentage of fossil fuel-based energy sources may indirectly contribute to substantial water use. To calculate this, data centers must consider the regional energy mix and estimate the associated water consumption per kilowatt-hour (kWh) of electricity.
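The mix-weighted estimate described above can be sketched as follows. The per-source water intensities are illustrative assumptions only; real values vary widely by plant type, cooling technology, and region, and a rigorous assessment would use published regional water-intensity factors.

```python
# Illustrative water intensities (litres per kWh generated); assumed values,
# not authoritative data -- substitute regional factors in practice.
WATER_L_PER_KWH = {
    "coal": 1.9,
    "natural_gas": 1.0,
    "nuclear": 2.3,
    "wind": 0.0,
    "solar_pv": 0.1,
}

def scope2_water_litres(kwh: float, energy_mix: dict) -> float:
    """Indirect water use: electricity consumed times the mix-weighted water intensity."""
    intensity = sum(share * WATER_L_PER_KWH[src] for src, share in energy_mix.items())
    return kwh * intensity

# e.g. a grid mix of 30% coal, 40% gas, 20% wind, 10% solar PV
mix = {"coal": 0.3, "natural_gas": 0.4, "wind": 0.2, "solar_pv": 0.1}
print(scope2_water_litres(1_000_000, mix))
```

Shifting the same load to a wind- and solar-heavy grid drives the weighted intensity, and thus Scope 2 water use, toward zero.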

Scope 3: Other Indirect Water Uses

Scope 3 water use encompasses the water consumed throughout the supply chain of data center operations, including the production of servers, chips, and other infrastructure. For example, semiconductor manufacturing, which is critical for producing the CPUs and GPUs used in data centers, requires highly purified water during fabrication processes. Moreover, transportation of hardware, construction of data centers, and maintenance activities also contribute to indirect water consumption. Scope 3 water assessments offer a holistic view of the entire lifecycle water footprint of data centers.

AI Models

LCA Overview

The Life Cycle Assessment (LCA) framework provides a comprehensive view of the environmental impacts of AI models across their entire lifecycle. LCA covers every stage, from raw material extraction for hardware manufacturing to the final disposal or recycling of components. It takes into account both carbon emissions and water usage, offering a detailed breakdown of the environmental footprint of AI model training, deployment, and operation. For example, training a large AI model such as GPT-3 requires significant computational resources, translating to both operational and embodied environmental costs. Through LCA, stakeholders can identify the key areas contributing to environmental impact and develop strategies to mitigate them.

Operational Carbon Footprint

The operational carbon footprint of AI models primarily stems from the energy required to train and run them. Training large-scale AI models requires extensive use of GPUs and CPUs, which are energy-intensive. The energy consumed during training can be measured in GPU-hours (GPUh), and the resulting carbon emissions depend on the energy efficiency of the hardware and the electricity mix used by the data center. Beyond training, inference (the process of using a trained model to make predictions) also contributes to the operational carbon footprint. Since AI models are often deployed in cloud environments that serve millions of requests daily, the cumulative energy usage for inference can become significant.
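The relationship described above can be written as a single expression: GPU-hours times per-GPU power, scaled up by PUE for facility overhead, times the grid's carbon intensity. The figures in the example (0.3 kW per GPU, PUE 1.2, 0.4 kg CO2/kWh) are illustrative assumptions, not measurements for any particular model.

```python
def training_emissions_tco2e(gpu_hours: float,
                             gpu_power_kw: float,
                             pue: float,
                             grid_kg_co2_per_kwh: float) -> float:
    """Operational training emissions: GPUh x power x PUE x grid carbon intensity."""
    energy_kwh = gpu_hours * gpu_power_kw * pue   # facility-level energy consumed
    return energy_kwh * grid_kg_co2_per_kwh / 1000.0

# e.g. 100,000 GPU-hours at an assumed 0.3 kW/GPU, PUE 1.2, 0.4 kg CO2/kWh
print(training_emissions_tco2e(100_000, 0.3, 1.2, 0.4))
```

The same formula applies to inference by substituting cumulative serving GPU-hours, which for heavily used deployments can exceed the training total over a model's lifetime.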

Embodied Carbon Footprint

The embodied carbon footprint refers to the emissions generated during the production of the hardware used for training and running AI models. This includes the emissions from extracting raw materials, such as silicon and rare earth elements, as well as the manufacturing and transportation of GPUs, CPUs, and other components. The semiconductor fabrication process is particularly energy-intensive and generates a significant portion of the embodied carbon footprint. Once hardware is no longer in use, its disposal or recycling also contributes to the embodied carbon emissions, making it essential to consider the full lifecycle of the equipment used in AI operations.
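One common way to attribute embodied emissions to a single workload is linear amortization: the hardware's manufacturing footprint is spread over its service lifetime, and a training run is charged for the fraction of that lifetime it occupies. The sketch below assumes a per-GPU embodied figure and a four-year lifetime; both are placeholders for values that would come from a vendor LCA.

```python
def amortized_embodied_tco2e(embodied_kg_per_gpu: float,
                             n_gpus: int,
                             training_hours: float,
                             lifetime_hours: float = 4 * 365 * 24) -> float:
    """Share of hardware manufacturing emissions attributable to one training run,
    amortized linearly over an assumed hardware lifetime (default: 4 years)."""
    total_embodied_kg = embodied_kg_per_gpu * n_gpus
    run_fraction = training_hours / lifetime_hours
    return total_embodied_kg * run_fraction / 1000.0

# e.g. 1,000 GPUs at an assumed 150 kg CO2e embodied each, a 30-day training run
print(amortized_embodied_tco2e(150, 1000, 720))
```

Because the embodied total is fixed at manufacture, longer hardware lifetimes and higher utilization lower the embodied share charged to each individual workload.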

Concept of Water Footprint

The water footprint concept refers to the total volume of freshwater consumed, both directly and indirectly, throughout the lifecycle of a product, process, or system. In the context of AI, this includes water used in the operation of data centers (for cooling) and the embodied water used in the manufacturing of hardware components such as servers and GPUs. Given the significant energy demands of AI models, the associated water use for cooling and electricity generation can be substantial, particularly in water-stressed regions. Understanding the water footprint of AI models is crucial for evaluating their overall environmental sustainability.

Operational Water Footprint

The operational water footprint encompasses the direct water usage required to maintain data centers where AI models are trained and deployed. Cooling systems, such as evaporative cooling towers, rely heavily on water to dissipate the heat generated by the IT infrastructure. Water consumption can vary seasonally, with higher usage during periods of elevated temperatures. Additionally, the electricity consumed by AI models indirectly contributes to water use, as power plants require water for cooling and electricity generation. By optimizing cooling efficiency and transitioning to less water-intensive energy sources, data centers can reduce their operational water footprint.
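The two components described above (on-site cooling water plus water embedded in the electricity supply) can be combined in one estimate. The sketch assumes an on-site water usage effectiveness (WUE) of 1.8 L/kWh, a PUE of 1.2, and an off-site electricity water intensity of 3.1 L/kWh; all three are illustrative assumptions that vary by facility, season, and grid.

```python
def operational_water_litres(it_energy_kwh: float,
                             pue: float = 1.2,
                             wue_onsite_l_per_kwh: float = 1.8,
                             ewif_l_per_kwh: float = 3.1) -> float:
    """On-site cooling water (IT energy x WUE) plus off-site water embedded in
    electricity generation (facility energy x grid water intensity)."""
    onsite = it_energy_kwh * wue_onsite_l_per_kwh
    offsite = it_energy_kwh * pue * ewif_l_per_kwh
    return onsite + offsite

# e.g. 1,000 kWh of IT energy under the assumed defaults
print(operational_water_litres(1000))
```

Under these assumptions the off-site (electricity-related) term dominates, which is why both cooling efficiency and the choice of energy source matter for the water footprint.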

Embodied Water Footprint

The embodied water footprint refers to the water used during the manufacturing and production of the hardware that supports AI models. This includes water used in the mining and processing of raw materials, as well as in the fabrication of semiconductors, GPUs, and other electronic components. For instance, semiconductor manufacturing requires ultra-pure water, making it one of the most water-intensive industrial processes. The embodied water footprint also accounts for the water used in transporting and assembling hardware, providing a comprehensive view of the hidden water costs associated with AI infrastructure.


Chapter 2: Environmental and Energy Regulations for Data Centers and AI Infrastructure