Elements of Sustainable ICT pt. 8 (On AI)

AI is the discipline of developing intelligent agents. It encompasses machine learning, deep learning, and generative AI (the ability to create new content based on what has been learned). Deep learning models come in two types: discriminative models, which can classify cat pictures, and generative models, which can generate cat pictures (and are very compute intensive).

ML and AI models are built in four phases: development (collect data -> clean -> train), training (live data and validation produce trained models), inference (query -> prediction, i.e. an output of some sort), and monitoring.
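The four phases can be sketched in code. This is a deliberately trivial illustration (a mean predictor standing in for a real model); the function names and structure are my own assumptions, not any specific framework's API.

```python
# Sketch of the four phases: development, training, inference, monitoring.
# The "model" is just the mean of the training data.

def develop(raw_data):
    """Development: collect and clean data (here: drop missing values)."""
    return [x for x in raw_data if x is not None]

def train(clean_data):
    """Training: fit a trivial model (the mean of the data)."""
    return {"prediction": sum(clean_data) / len(clean_data)}

def infer(model, query):
    """Inference: answer a query with the trained model's output."""
    return model["prediction"]

def monitor(model, live_data):
    """Monitoring: measure drift between live data and the model."""
    return abs(sum(live_data) / len(live_data) - model["prediction"])

raw = [1.0, None, 2.0, 3.0]
model = train(develop(raw))
print(infer(model, "any query"))   # -> 2.0
print(monitor(model, [2.5, 3.5]))  # -> 1.0
```

In a real pipeline each phase would be far heavier (data pipelines, GPU training, serving infrastructure), which is exactly where the energy costs discussed below come from.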


Kalle Tolonen
March 7, 2025


Measuring Energy

Measuring the consumption of data centers (DCs) and the equipment inside them is done by the DC owners, so there is not much an individual user can do.

Measuring Emissions

Lifecycle emissions are produced in every phase (production -> construction -> maintenance & use -> end of life -> beyond lifecycle), with the maintenance & use phase responsible for the most. Operational carbon emissions are measured via carbon intensity: how many grams of CO2-equivalent are produced per kWh of electricity.
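The carbon-intensity calculation itself is a simple product: energy used times the intensity of the grid that supplied it. A minimal sketch, with placeholder numbers rather than measured values:

```python
# Operational emissions (gCO2e) = energy used (kWh) * grid carbon
# intensity (gCO2e/kWh). Intensity varies hugely by region and hour.

def operational_emissions_g(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """Grams of CO2-equivalent for a given energy use and grid intensity."""
    return energy_kwh * intensity_g_per_kwh

# The same 1000 kWh workload on two different grids:
print(operational_emissions_g(1000, 50))   # low-carbon grid  -> 50000.0 g (50 kg)
print(operational_emissions_g(1000, 500))  # coal-heavy grid  -> 500000.0 g (500 kg)
```

The tenfold difference in the example is why the geographical region of a data center matters as much as the workload itself.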

Reports

Reports of energy use and emissions from AI are provided by cloud vendors (with the obvious reliability concerns of self-reporting). The development phase is computationally intensive but has not been studied extensively. The training phase has been discussed a lot: ML accounted for 10-15% of Google's total energy consumption (2019-2021), and 40% of that was used for training. There are claims that this is a stable proportion of energy use in ML.

We can have an impact by reporting metrics if we are involved in model development, by choosing providers and hardware carefully, and by using the models responsibly.

Key learnings

  1. Energy consumption is trending upwards
  2. Emissions depend on: the ML model (architecture, parameters, hours to train), the machines used (CPUs/GPUs/TPUs, performance per watt), the mechanization (cloud computing vs. on-prem), and the geographical region (carbon intensity of the electricity used)
  3. Image generation uses the most resources, text classification the least (a difference of two orders of magnitude)
  4. 40% of energy is used in training, 60% in inference, and inference requests are growing
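The factors in point 2 combine multiplicatively, which makes back-of-the-envelope estimates easy. A sketch, where every number (GPU count, power draw, PUE, grid intensity) is an illustrative assumption:

```python
# Rough training-run emissions estimate combining the key factors:
# hours to train, accelerator count and power draw, data center
# overhead (PUE), and regional grid carbon intensity.

def training_emissions_kg(hours, accelerators, watts_each, pue, g_co2e_per_kwh):
    """Estimated kgCO2e for a training run."""
    energy_kwh = hours * accelerators * watts_each / 1000 * pue
    return energy_kwh * g_co2e_per_kwh / 1000

# Example: 100 h on 8 GPUs at 300 W, PUE 1.1, 400 gCO2e/kWh grid:
print(round(training_emissions_kg(100, 8, 300, 1.1, 400), 1))  # -> 105.6
```

Because the factors multiply, halving any single one (a more efficient architecture, better perf/watt hardware, or a cleaner grid) halves the total.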
