Most data centre owners have received the message about reducing their carbon footprint and transitioning to renewable energy. We regularly see new projects announced with net-zero carbon goals. They now need to do something similar for water.
There are currently only two credible estimates for global data center energy consumption, ranging from 196 TWh to 400 TWh for 2020. So why do we see vastly higher figures reported?
It is currently impossible to properly compare how “sustainable” one product is vs another. Pictures of wind farms look nice, but how do you choose which cloud region to deploy (or move) your resources to if there is no way to compare them?
My notes on the paper: Heptonstall, P.J. & Gross, R.J.K. (2021) A systematic review of the costs and impacts of integrating variable renewables into power grids. Nature Energy, 6(1), 72–83.
As more applications are run through a web browser, even the browser itself is beginning to be streamed from the cloud. Is that the best use of our now highly efficient computers? How energy efficient is application streaming?