The ICT sector accounts for 4-10% of global energy consumption, and the share is growing. Software work has three broad phases: development, testing & deployment, and usage. Usage is the phase where most of the emissions occur, since that is where most of the time and energy is spent.
ICT's business logic is bringing new and fancier things to market: SaaS measures time to market, developers think about making their own lives easier, and efficiency and environmental issues are secondary or not considered at all. There is little to no progress in making software environmentally responsible, even though it would curb energy usage. The more operations and data we need, the more hardware we need, which in turn increases demand for networks and premises. We consume more; it's a cascading effect.
Cloud providers have an incentive to use less energy and hardware. This leads to higher utilization, which is good for the environment. But there is little transparency: nobody knows who runs the most efficient systems.
The hardware emissions cycle is manufacturing -> logistics -> usage -> dismantling (dismantling can have a negative effect on emissions, since it can recover resources). Data centers and networks produce most of their emissions in the operating phase (roughly 80/20 operation vs. manufacturing), while end-user devices are about 50/50 (manufacturing vs. usage).
It's difficult to measure the energy usage of software: there is no solid scientific framework for measurement. A good approach is to measure electricity consumption over time. Since there are no public measurements, there is also no benchmark to compare against. If measuring electricity consumption directly is not possible, there is a link between execution time and energy usage, so execution time can serve as a proxy.
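As a rough illustration of the time-as-proxy idea, the sketch below times a function call and multiplies by an assumed average power draw. The function name and the power figure are hypothetical, chosen only for this example; real measurement (e.g. a wattmeter or hardware energy counters) is always preferable.

```python
import time

# Assumed average power draw of the machine under load, in watts.
# This is an illustrative placeholder, not a measured value.
ASSUMED_POWER_WATTS = 35.0

def estimate_energy_joules(func, *args, **kwargs):
    """Estimate the energy cost of a call, using execution time as a proxy.

    Energy (J) = power (W) x time (s). The estimate is only as good
    as the assumed power figure.
    """
    start = time.perf_counter()
    result = func(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed * ASSUMED_POWER_WATTS

# Example: estimate the energy of summing ten million integers.
total, joules = estimate_energy_joules(sum, range(10_000_000))
print(f"result={total}, estimated energy ~{joules:.4f} J")
```

Comparing two implementations of the same task with this kind of harness at least gives a relative ranking, even when the absolute joule figures are uncertain.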