Will we ever be able to accurately assess the carbon footprint of IT? Could a relational footprint methodology be more useful? Paper notes on Pasek et al. (2023).
Training time correlates with energy consumption, but that does not imply a correlation between training time and carbon emissions: emissions also depend on the carbon intensity of the electricity used, which varies by location and time of day.
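A back-of-the-envelope illustration of why the two decouple: emissions = energy × grid carbon intensity, and intensity is the term that training time tells you nothing about. All figures below are assumed for illustration.

```python
# Two identical training runs: same duration, same energy, very
# different emissions. Power draw and intensities are assumed values.
avg_power_kw = 6.5        # assumed average draw of a GPU node, kW
training_hours = 100      # identical training time in both regions

energy_kwh = avg_power_kw * training_hours  # 650 kWh in both cases

# Grid carbon intensity varies widely by region and hour (gCO2e/kWh).
intensity = {"coal_heavy_grid": 820, "hydro_heavy_grid": 30}

for grid, g_per_kwh in intensity.items():
    emissions_kg = energy_kwh * g_per_kwh / 1000
    print(f"{grid}: {energy_kwh:.0f} kWh -> {emissions_kg:.1f} kgCO2e")
```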
Is it better to replace powerful developer laptops with cloud dev environments? What is the carbon cost of my software development – builds, tests, deploys, code hosting, dev environments?
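One way to start answering the second question is a per-pipeline estimator. The sketch below assumes you can attribute an average power draw to each CI step; every number in it is a placeholder, not a measurement.

```python
# Rough energy/carbon estimator for one CI pipeline run.
# All figures are assumed placeholders, not measurements.
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    minutes: float    # wall-clock duration of the step
    avg_watts: float  # assumed average draw of the runner while busy

GRID_INTENSITY_G_PER_KWH = 400  # assumed grid average

pipeline = [
    Step("build", 8, 150),
    Step("tests", 15, 150),
    Step("deploy", 3, 50),
]

total_kwh = sum(s.minutes / 60 * s.avg_watts / 1000 for s in pipeline)
print(f"{total_kwh * 1000:.0f} Wh per run, "
      f"{total_kwh * GRID_INTENSITY_G_PER_KWH:.1f} gCO2e per run")
```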
Why isn’t carbon-aware workload scheduling more common? Scheduling at the data center level is infeasible, so what are the opportunities for developers to implement more granular, application-level scheduling?
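At the application level, the simplest granular mechanism is time-shifting: defer a flexible job until the grid is cleaner. A minimal sketch, assuming access to some grid-intensity signal (the `get_carbon_intensity` stub below stands in for a real provider such as Electricity Maps or WattTime); the threshold and deadline values are illustrative.

```python
import random
import time

def get_carbon_intensity() -> float:
    """Placeholder for a real grid-intensity signal in gCO2e/kWh;
    this stub just simulates one with random values."""
    return random.uniform(100, 500)

def run_when_green(job, threshold=200.0, poll_seconds=900,
                   max_wait_seconds=6 * 3600):
    """Defer a flexible job until intensity drops below `threshold`,
    falling back to running it anyway once the deadline passes."""
    deadline = time.monotonic() + max_wait_seconds
    while time.monotonic() < deadline:
        if get_carbon_intensity() <= threshold:
            return job()
        time.sleep(poll_seconds)
    return job()  # deadline reached: correctness beats carbon here

# Example: defer a nightly batch job until the grid is cleaner.
# run_when_green(lambda: print("running batch job"), threshold=150)
```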
Data centers are not 100% efficient, so they generate waste heat, which contributes to anthropogenic heat flux and can therefore be linked to global warming. But how much? And should we be concerned?
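One crude way to bound the question: essentially all of the electricity a data center consumes is eventually dissipated as heat, so an assumed global consumption figure can be converted into a mean heat flux over the Earth's surface and compared with greenhouse-gas radiative forcing. The consumption figure below is an illustrative assumption.

```python
# Order-of-magnitude check: convert an assumed annual data center
# electricity figure into a mean heat flux over the Earth's surface.
DC_ELECTRICITY_TWH_PER_YEAR = 300  # assumed global figure, illustrative
EARTH_SURFACE_M2 = 5.1e14
SECONDS_PER_YEAR = 3.156e7

watts = DC_ELECTRICITY_TWH_PER_YEAR * 1e12 * 3600 / SECONDS_PER_YEAR
flux_w_per_m2 = watts / EARTH_SURFACE_M2

print(f"{watts / 1e9:.1f} GW continuous, {flux_w_per_m2:.2e} W/m^2")
# Roughly 6.7e-05 W/m^2: several orders of magnitude below the
# ~2-3 W/m^2 of radiative forcing from greenhouse gases, so the
# direct heat term is small next to the emissions from generating
# that electricity in the first place.
```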
More and more applications run in a web browser, and now even the browser itself is beginning to be streamed from the cloud. Is that the best use of our now highly efficient client hardware? How energy efficient is application streaming?
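The question reduces to comparing the client-only energy of running an app locally against the client + server + network energy when it is streamed. A toy model follows; every figure is an assumed placeholder, and the outcome is dominated by the network-intensity term, which is contested in the literature.

```python
# Toy model: one hour of application use, local vs streamed.
# All figures are assumed placeholders, not measurements.

def session_wh(client_w, server_w=0.0, network_wh_per_gb=0.0,
               gb_transferred=0.0, hours=1.0):
    """Energy of one session: client draw + attributed server draw
    + network transfer energy."""
    return (client_w + server_w) * hours + network_wh_per_gb * gb_transferred

local = session_wh(client_w=15)            # laptop running the app itself
streamed = session_wh(client_w=8,          # thin client decoding video
                      server_w=35,         # attributed share of the server
                      network_wh_per_gb=60,  # assumed network intensity
                      gb_transferred=3)    # ~1 hr of streamed UI video

print(f"local: {local:.0f} Wh, streamed: {streamed:.0f} Wh")
```

Under these particular assumptions streaming loses badly, but halving the network-intensity figure or sharing the server across many sessions changes the picture; the model is only useful for seeing which term dominates.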