Data center energy and the importance of efficiency

Simple extrapolation is an easy way to predict what something will do in the future based on past trends. If you assume x is directly proportional to y, then you know exactly how y will change as x increases.

This works for simple relationships. For example, we know that wind speed increases with height above any local obstructions, following a power law. We also know that wind turbine output is very sensitive to wind speed and rotor swept area: power scales with the cube of wind speed and linearly with swept area. If you double wind speed, power output increases x8. If you double rotor diameter, power output increases x4, because swept area scales with the square of the diameter.

Although engineering machines to take advantage of these effects is considerably more complex, the underlying power laws are straightforward. We can rely on these relationships to make predictions, or to extrapolate from known data.
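As an illustration, these scalings fall straight out of the standard turbine power equation, P = ½ρAv³Cp. A minimal sketch (the 100 m diameter, 8 m/s wind speed and 0.4 power coefficient are illustrative assumptions, not figures for any particular turbine):

```python
import math

AIR_DENSITY = 1.225  # kg/m^3, roughly sea level at 15°C

def turbine_power(rotor_diameter_m, wind_speed_ms, power_coefficient=0.4):
    """Ideal turbine power in watts: P = 0.5 * rho * A * v^3 * Cp.
    Cp is the fraction of wind energy captured (the Betz limit caps it at ~0.59)."""
    swept_area = math.pi * (rotor_diameter_m / 2) ** 2
    return 0.5 * AIR_DENSITY * swept_area * wind_speed_ms ** 3 * power_coefficient

base = turbine_power(100, 8)
print(turbine_power(100, 16) / base)  # double the wind speed -> 8.0
print(turbine_power(200, 8) / base)   # double the rotor diameter -> 4.0
```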

Unfortunately, when it comes to energy usage, and data center energy in particular, these extrapolations fail. This is because there is another factor at play.

[Comic: XKCD 605, “Extrapolating”]

Over the past decade, data center usage has undoubtedly changed. The number of servers deployed has grown, but that growth has slowed and been replaced by a x5 increase in “instances” hosted on those servers, i.e. physical servers are being virtualised (Masanet et al, 2020). Those servers are also more likely to be located in “hyperscale” facilities run by the big cloud providers (Forrester, 2019), and total workloads have grown significantly: x6 more compute instances, x10 more network traffic and x25 more storage capacity in 2018 compared to 2010 (Masanet et al, 2020).

We know that data centers consume around 200 TWh of electricity each year, or about 1% of global usage. But as of 2020, that figure had only increased by 6% compared to 2010 levels (Masanet et al, 2020). Energy usage has been decoupled from data center usage. How? Efficiency improvements.
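A rough back-of-envelope using the figures above shows the scale of that decoupling. Treating compute instances as the unit of “usage”, and treating the x6 instance growth and the ~6% energy growth as covering the same period, is a simplification:

```python
# Back-of-envelope from the Masanet et al (2020) figures cited above
compute_growth = 6.0   # compute instances in 2018 relative to 2010
energy_growth = 1.06   # total data center energy relative to 2010

energy_per_instance = energy_growth / compute_growth
print(f"{energy_per_instance:.2f}x the energy per instance, "
      f"i.e. ~{1 / energy_per_instance:.1f}x more work per unit of energy")
# -> 0.18x the energy per instance, i.e. ~5.7x more work per unit of energy
```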

Since 2010, the energy intensity of servers has decreased x4, the watts required per TB of storage has dropped x9, and facility efficiency at the best providers has resulted in very low PUE values (Masanet et al, 2020). For example, Google’s fleet-wide trailing-twelve-month PUE is just 1.10, a record low.
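For context, PUE (power usage effectiveness) is simply the ratio of total facility energy to the energy that reaches the IT equipment, so Google’s 1.10 means only 10% overhead for cooling, power distribution and everything else. A sketch with illustrative numbers:

```python
def pue(total_facility_energy, it_equipment_energy):
    """Power Usage Effectiveness: total facility energy (IT load plus cooling,
    power distribution, lighting, etc.) divided by the IT equipment energy alone.
    1.0 is the theoretical ideal: every watt goes to computing."""
    return total_facility_energy / it_equipment_energy

# Illustrative numbers: a facility drawing 11 GWh to deliver 10 GWh to IT load
print(pue(11.0, 10.0))  # -> 1.1, i.e. 10% overhead on top of the IT load
```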

[Figure: historical energy usage and projected energy usage under doubled computing demand (Masanet et al, 2020)]

Similar trends can be seen elsewhere. Over the last 10 years, UK electricity demand has fallen by 13% despite population growing from 62.2m to 66.9m and GDP growing by almost 30% from £1.7tn to £2.2tn (Staffell and Wilson, 2019).

[Figure: UK electricity demand (Staffell and Wilson, 2019)]

Of course, this is entirely separate from where data centers get their energy; the challenge of reaching 100% renewable energy for data centers remains. Data centers may be doing more with less energy, but they still have to plug into a grid that is nowhere near 100% renewable.

The UK is the world leader in decarbonisation – the amount of electricity generated from fossil fuels has fallen by 54% and renewables now supply over 30% of UK electricity generation (Staffell and Wilson, 2019). However, that is not reflective of the world in general: the majority of global electricity is still generated from non-renewable sources.

[Figure: share of global electricity generation by fuel, percentage (BP Statistical Review of World Energy, 2019)]

The data center energy estimates cited here use bottom-up modelling to calculate energy usage, so changes (improvements) in the underlying assumptions – the power drawn by servers, disks and networking equipment, and data center PUE ratios – are key factors. The challenge is finding accurate numbers.
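To see why those assumptions matter, here is a minimal sketch of a bottom-up model of this kind. Every input here is an illustrative assumption, not a figure from the cited studies:

```python
HOURS_PER_YEAR = 8760

def data_center_energy_twh(servers, watts_per_server,
                           storage_tb, watts_per_tb,
                           network_watts, pue):
    """Annual energy in TWh: sum the IT equipment power draw, scale it by the
    facility PUE, and multiply by hours per year (1 TWh = 1e12 Wh)."""
    it_watts = (servers * watts_per_server
                + storage_tb * watts_per_tb
                + network_watts)
    return it_watts * pue * HOURS_PER_YEAR / 1e12

# Illustrative global inputs only -- not the Masanet et al (2020) figures
print(data_center_energy_twh(servers=30e6, watts_per_server=350,
                             storage_tb=1e9, watts_per_tb=2,
                             network_watts=1.5e9, pue=1.6))
# -> ~196 TWh
```

The output lands near the ~200 TWh figure cited above only because the inputs were chosen to; small changes to any of them move the result significantly, and sourcing credible values for each one is the hard part.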

Google and Microsoft publish aggregated statistics, but not in enough detail to model accurately (Amazon doesn’t release anything useful). As such, the estimates are built from disparate sources, such as vendor equipment reports and assumptions about the characteristics of deployed equipment. This gives the results a different character from the national energy statistics cited above, which come from direct measurements published by government and industry. And although the industry supports the conclusion, it also serves the “cloud is greener” narrative – which may well be accurate, but suffers from a lack of transparency. Organisations such as Greenpeace have made these criticisms in the past.

Further, as the scope of the analysis expands beyond data centers, efficiency becomes just one of several factors behind these trends. In UK energy, for example, milder winters and shifting economic production have reduced demand alongside efficiency improvements.

There are also challenges for the future. Will we see diminishing returns on efficiency improvements, in the same way that Moore’s Law has stalled? How do these patterns change with new hardware types such as GPUs and IoT devices? What will happen as workloads move closer to the user on edge computing platforms? Past predictions have been wrong about massive growth; maybe they will be wrong about flat growth as well.

That’s the problem with extrapolating from complex underlying assumptions: they can very easily change.