Greening data centres

06.23.2022, by Jean-Marc Pierson
Faced with the massive increase in the volume of information processed by data centres, the Datazero research projects have been striving since 2015 to develop algorithms that optimise their energy consumption and accessibility. The computer science researcher Jean-Marc Pierson initiated this programme to develop more environmentally friendly data centres.

The quantity of data stored and exchanged on networks has been rising sharply for years, from 20 zettabytes (ZB)¹ in 2016 to approximately 80 ZB in 2022.² Global network traffic rose from 100 gigabytes per second (GB/s) in 2001 to 26,000 GB/s in 2016.³ It is estimated that in 2025 the Internet of Things will generate 67 ZB of data, a significant portion of which will pass through data centres.

In its 2021 report, the French think tank The Shift Project pointed out that data centres could account for approximately 5% of the world's electricity consumption in 2025. Between 2010 and 2018, their energy efficiency improved by 25%, while the number of servers increased by 30%, storage capacity by a factor of 26, and network traffic by a factor of 11. The Bitcoin cryptocurrency alone accounts for 10% of data centre consumption. The spread of 5G will further amplify this phenomenon by enabling digital devices and connected objects to be used anywhere, at any time and for ever more demanding purposes, not to mention the energy cost of the associated facilities and communications, estimated at two to three times today's level. It is also worth noting that data centres and networks are generally oversized: the former are filled, on average, to a mere 30% of their nominal capacity, which raises the question of whether there are alternatives to this waste.

Solving intermittency problems through electricity storage

Numerous national and international actors are taking a close interest in data centre consumption, for both economic and environmental reasons. Hence the initiatives launched by the web giants, who, after buying "green" energy from specialised producers, began generating their own renewable electricity to supply their data centres. However, this is still not technically sufficient to operate facilities containing tens of thousands of servers.

The Datazero research project and its extension, Datazero2, both financed by the French National Research Agency (ANR), have sought since 2015 to develop data centres that use up to 1 megawatt of electric power without relying on an external electricity supplier. Today, the metric used to characterise a data centre is the amount of power it consumes, regardless of the facilities it contains.

We opted for primary sources such as photovoltaic and wind power, and chose to solve the problems of intermittency by storing electricity in batteries, to offset daily fluctuations in production, and in fuel cells, to compensate for seasonal variations. The advantage is that these centres can be built anywhere on the planet, in close proximity to real needs. The first phase of the project,⁴ based on the development of several algorithms, demonstrated the feasibility of such a data centre.
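
By way of illustration only (this is not the project's actual control software), the division of labour between short-term and long-term storage can be sketched as a simple dispatch rule in Python. The hydrogen pathway, the efficiencies and all the figures below are assumptions made for the example.

```python
# Illustrative dispatch rule only (not Datazero's controller): batteries absorb
# short-term swings, hydrogen stored for the fuel cells absorbs what remains.
def dispatch(surplus_kwh, battery_kwh, batt_capacity_kwh, h2_store_kwh,
             electrolyser_eff=0.7, fuel_cell_eff=0.5):   # assumed efficiencies
    """Route one time step's energy surplus (>0) or deficit (<0)."""
    if surplus_kwh >= 0:
        to_batt = min(surplus_kwh, batt_capacity_kwh - battery_kwh)
        battery_kwh += to_batt                              # charge batteries first
        h2_store_kwh += (surplus_kwh - to_batt) * electrolyser_eff  # then make H2
    else:
        need = -surplus_kwh
        from_batt = min(need, battery_kwh)
        battery_kwh -= from_batt                            # drain batteries first
        h2_store_kwh -= (need - from_batt) / fuel_cell_eff  # then run the fuel cell
        if h2_store_kwh < 0:
            raise RuntimeError("storage exhausted: load must be shed or deferred")
    return battery_kwh, h2_store_kwh

batt, h2 = 50.0, 1000.0                       # state in kWh (invented numbers)
batt, h2 = dispatch(+80.0, batt, 100.0, h2)   # sunny hour: charge, then electrolyse
batt, h2 = dispatch(-120.0, batt, 100.0, h2)  # calm night: batteries, then fuel cell
```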

Improving efficiency

We began by developing an optimisation algorithm capable of sizing the centre based on user needs, both in terms of the number of computer servers and of the electrical equipment required (photovoltaic panels, wind turbines, batteries, fuel cells). The approach has two phases: a dichotomic (binary) search to determine the number of servers, and a linear program to size the electrical equipment. The algorithm is then run on weather data representing several years of operation and on a variable, representative flow of computing services, which allows us to identify the configuration best adapted to a particular use or location.
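
To give a flavour of this two-phase approach, the Python sketch below combines a dichotomic search over the number of servers with a tiny linear program that sizes photovoltaic panels and batteries. It is a toy reconstruction, not the Datazero algorithm: the costs, constraints and workload figures are invented, and wind turbines and fuel cells are left out for brevity.

```python
# Toy two-phase sizing in the spirit described above; NOT the Datazero algorithm.
from scipy.optimize import linprog

def servers_needed(peak_jobs, jobs_per_server, max_servers=100_000):
    """Dichotomic (binary) search for the smallest server count covering the peak."""
    lo, hi = 1, max_servers
    while lo < hi:
        mid = (lo + hi) // 2
        if mid * jobs_per_server >= peak_jobs:
            hi = mid
        else:
            lo = mid + 1
    return lo

def size_electrical(daily_energy_kwh, night_energy_kwh,
                    pv_kwh_per_panel=1.5, batt_kwh_per_unit=10.0,
                    pv_cost=300.0, batt_cost=4000.0):
    """Tiny linear program: minimise equipment cost while covering daily demand
    with photovoltaics and overnight demand with batteries (invented figures)."""
    c = [pv_cost, batt_cost]                  # x = [n_panels, n_batteries]
    A_ub = [[-pv_kwh_per_panel, 0.0],         # pv_kwh_per_panel * n_panels >= daily
            [0.0, -batt_kwh_per_unit]]        # batt_kwh_per_unit * n_batteries >= night
    b_ub = [-daily_energy_kwh, -night_energy_kwh]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    return res.x                              # fractional counts; round up in practice

n_srv = servers_needed(peak_jobs=12_000, jobs_per_server=16)
# assume 0.2 kW drawn per server, 12 hours of it overnight
panels, batteries = size_electrical(daily_energy_kwh=n_srv * 0.2 * 24,
                                    night_energy_kwh=n_srv * 0.2 * 12)
print(n_srv, panels, batteries)
```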

Our second contribution, once the centre is built, involves optimising both its electrical and computing operations in order to improve efficiency. To meet this dual constraint, we developed a negotiation algorithm based on game theory, supplemented by techniques from linear programming and genetic algorithms. This software architecture solves each subproblem separately, rather than modelling the whole system at once, which would be too slow for real-time use in an operational data centre, and it finds solutions that are optimal or close to optimal.
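
The toy loop below conveys the spirit of such a negotiation: the IT module proposes a power profile, the power module offers what the renewable production can cover, and the IT module concedes load (for instance by deferring batch jobs) wherever the offer falls short. It is a deliberate simplification, not the project's game-theoretic algorithm.

```python
# Deliberately simplified negotiation loop between an "IT" module and a "power"
# module; the real project uses game theory, linear programming and genetic
# algorithms, so treat this only as an illustration of the back-and-forth idea.
def negotiate(it_demand_kw, available_kw, max_rounds=50, concession=0.9):
    """The IT module proposes a power profile; the power module offers what it
    can deliver; the IT module concedes where the offer falls short, then
    re-proposes, until both sides agree or the round budget is exhausted."""
    proposal = list(it_demand_kw)
    for _ in range(max_rounds):
        offer = [min(p, a) for p, a in zip(proposal, available_kw)]
        shortfall = [p - o for p, o in zip(proposal, offer)]
        if max(shortfall) < 1e-6:
            return offer                                    # agreement reached
        proposal = [p * concession if s > 0 else p          # concede where short
                    for p, s in zip(proposal, shortfall)]
    return [min(p, a) for p, a in zip(proposal, available_kw)]  # best compromise

# e.g. four time steps (kW): requested IT power vs. forecast renewable production
print(negotiate(it_demand_kw=[300, 320, 350, 400],
                available_kw=[280, 360, 330, 300]))
```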

Google's solar-powered data centre in Saint-Ghislain, Belgium.

In the second phase of the project, which began in 2020, we decided to concentrate on two major areas. We studied new ways of interconnecting electrical equipment in order to limit hardware redundancy, while maintaining maximum service quality and a minimal carbon footprint. Since it is unlikely that all energy sources will be interrupted simultaneously, we developed a dynamic interconnection topology that can redirect the electricity produced by each element depending on the state of the system. The project will now assess the relevance of this approach, in particular by quantifying the resulting decrease in redundancy.
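
A minimal sketch of the idea, assuming a simple greedy reassignment rule rather than the actual topology optimisation developed in Datazero2, might look like this:

```python
# Toy greedy reassignment (an assumption for illustration, not the Datazero2
# topology algorithm): whenever the system state changes, re-decide which
# source feeds each load among those that still have headroom.
def reconfigure(sources_kw, loads_kw):
    """Return {load_index: source_index}, assigning the largest loads first
    to whichever source currently has the most spare capacity."""
    spare = dict(enumerate(sources_kw))            # remaining output per source
    assignment = {}
    for li, need in sorted(enumerate(loads_kw), key=lambda x: -x[1]):
        si = max(spare, key=spare.get)             # source with most headroom
        if spare[si] < need:
            raise RuntimeError(f"not enough production for load {li}")
        spare[si] -= need
        assignment[li] = si
    return assignment

# e.g. after a wind lull, source 1 (wind) drops and its loads shift to the others
print(reconfigure(sources_kw=[500, 50, 200], loads_kw=[120, 90, 300, 60]))
```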

The second focus is to study in greater detail the impact of uncertain input data, such as the weather or variations in the influx of users. By modelling the uncertainty on the input data, the optimisation and negotiation algorithms can better characterise the uncertainty of their output, as well as its potential domino effects.
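
One standard way to model such uncertainty, shown here purely as an illustration with invented figures, is to sample input scenarios around a forecast and propagate them through a production model (a Monte Carlo approach):

```python
# Monte Carlo illustration of uncertainty propagation (invented figures): sample
# the solar forecast error and look at the spread of the resulting production.
import random

def pv_energy_kwh(irradiance_kwh_m2, panel_area_m2=4000, efficiency=0.2):
    """Energy produced for a given daily irradiance (toy model)."""
    return irradiance_kwh_m2 * panel_area_m2 * efficiency

def propagate_uncertainty(forecast_kwh_m2=5.0, rel_std=0.25, n_samples=10_000):
    """Return the mean production and an empirical 5th percentile, a pessimistic
    bound that downstream optimisation could plan against."""
    samples = sorted(
        pv_energy_kwh(max(random.gauss(forecast_kwh_m2, rel_std * forecast_kwh_m2), 0.0))
        for _ in range(n_samples))
    return sum(samples) / n_samples, samples[int(0.05 * n_samples)]

print(propagate_uncertainty())
```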

The points of view, opinions and analyses published in this column are the sole responsibility of their author(s) and do not in any way constitute a statement of position by the CNRS.
