AI is the discipline of building intelligent agents; it encompasses machine learning, deep learning, and generative AI (the ability to create new content based on what has been learned). Deep learning models come in two types: discriminative models, which classify inputs (e.g. recognising cat pictures), and generative models, which produce new outputs (e.g. generating cat pictures) and are far more compute intensive.
ML and AI models are built in four phases: development (collect data -> clean -> prepare for training), training (live data and validation produce a trained model), inference (query -> prediction, some form of output), and monitoring.
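The four phases can be sketched with a toy model. This is a minimal illustration only, using hypothetical data and a 1-D linear model fitted by gradient descent; real pipelines use frameworks such as PyTorch or scikit-learn.

```python
# Development: collect data, then clean it (drop bad rows)
raw = [(1, 2.1), (2, 3.9), (None, 5.0), (3, 6.2)]
data = [(x, y) for x, y in raw if x is not None]

# Training: fit y ~ w * x by gradient descent on mean squared error
w = 0.0
for _ in range(1000):
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= 0.01 * grad

# Inference: query -> prediction
def predict(x):
    return w * x

print(round(predict(4), 1))  # prediction for a new query

# Monitoring: track prediction error against observed data
errors = [abs(predict(x) - y) for x, y in data]
print(max(errors))
```

In production the monitoring step would run continuously on live traffic, triggering retraining when error drifts upward.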
Measuring the energy consumption of data centres and the equipment inside them is done by the data centre owners, so as a user there is little you can do about it directly.
Lifecycle emissions are produced in each phase (production -> construction -> maintenance & use -> end of life -> beyond lifecycle), with the maintenance & use phase responsible for the most. Operational carbon emissions are measured via carbon intensity: how many grams of CO2-equivalent are produced per kWh of electricity consumed.
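The carbon-intensity definition translates directly into a calculation: operational emissions are energy used times the grid's carbon intensity. A minimal sketch, where the 500 kWh workload and 400 gCO2e/kWh grid are illustrative assumptions, not measured values:

```python
def operational_emissions_g(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """Grams of CO2-equivalent emitted by a workload:
    energy consumed (kWh) x grid carbon intensity (gCO2e/kWh)."""
    return energy_kwh * intensity_g_per_kwh

# Hypothetical example: a 500 kWh training run on a 400 gCO2e/kWh grid
kg_co2e = operational_emissions_g(500, 400) / 1000
print(kg_co2e)  # 200.0 kg CO2e
```

Note that carbon intensity varies by region and by time of day, which is why the choice of provider and region (below) matters.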
Reports of AI energy use and emissions are provided by the cloud vendors themselves (with the obvious reliability concerns of self-reporting). The development phase is computationally intensive but has not been studied extensively. The training phase has been discussed a lot: ML accounted for 10-15% of Google's total energy consumption in 2019-2021, and 40% of that was used for training. There are claims that this is a stable proportion of energy use in ML.
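A quick back-of-the-envelope check on those proportions: if ML is 10-15% of total energy and 40% of that goes to training, training alone is 4-6% of the provider's total. A small sketch of that arithmetic (the fractions are the cited figures, not independent measurements):

```python
def training_share(ml_share: float, training_fraction: float = 0.4) -> float:
    """Training's share of total energy, given ML's share of the total
    and the fraction of ML energy spent on training."""
    return ml_share * training_fraction

low, high = training_share(0.10), training_share(0.15)
print(f"training: {low:.0%} to {high:.0%} of total energy")
```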
We can have an impact by reporting metrics when we are involved in developing models, by choosing providers and hardware carefully, and by using models responsibly.