There are currently only two credible estimates of global data center energy consumption, ranging from 196 TWh to 400 TWh for 2020. Why, then, do we see vastly higher figures reported?
It is currently impossible to properly compare the sustainability of one product against another. Pictures of wind farms look nice, but how do you choose which cloud region to deploy (or move) your resources to if there is no way to compare them?
As more applications are run through a web browser, even the browser itself is beginning to be streamed from the cloud. Is that the best use of our now highly efficient computers? How energy efficient is application streaming?
To avoid accusations of greenwashing, data center operators must consider investing in a portfolio of renewable energy products; RECs alone cannot credibly back claims of 100% renewable energy use.
Most data center sustainability strategies still focus on renewable energy certificates (RECs). RECs are now considered low-quality products because they cannot credibly be used to back claims of 100% renewable energy use.
I now have to choose between having the best, cutting-edge hardware and having the freedom and control to do what I want with the device I spend most of my time on.