Supercomputers: the Stakes in the Global Race
Funded by the civil company Genci,1 and hosted at l’Institut du développement et des ressources en informatique scientifique (Idris), the CNRS’s large-scale computing centre, the Jean Zay supercomputer takes its place in a headlong global race in which available computing capacity goes hand in hand with the power and competitiveness of nations. But first, what exactly is a supercomputer?
Stéphane Requena: A supercomputer is a very dense assembly of servers stacked on top of one another, and connected by a high-speed network. It can work on problems that are beyond the reach of a single office PC by performing calculations simultaneously in parallel, which is to say by giving each server a little part of the problem to be worked on. Its power is measured in petaflops,2 or a million billion operations per second. In short, it’s an accelerator for science, and a particularly useful tool for all of society.
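The divide-and-conquer principle Requena describes can be sketched in a few lines of Python (a purely illustrative toy, not Jean Zay's actual software stack): the problem is cut into slices, and each simulated "server" is a worker process that computes its share in parallel.

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker handles one slice of the overall problem.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Split the data into 8 slices, one per simulated "server".
    chunks = [data[i::8] for i in range(8)]
    with Pool(processes=8) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same result as the serial sum of squares
```

On a real supercomputer the same idea is applied across thousands of nodes linked by the high-speed network, typically via message-passing libraries rather than a single machine's process pool.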
What uses will now be possible with the new Jean Zay supercomputer?
Jamal Atif:3 Supercomputers are capable of High Performance Computing (HPC), which is to say simulations that focus primarily on major scientific challenges such as climate science, engineering problems in the broader sense, etc. Currently there is a major need for supercomputers dedicated to artificial intelligence (AI), whose applications range across all disciplinary fields, including the autonomous car, medicine, and the Humanities and social sciences. The new Jean Zay supercomputer is a converged machine, dedicated to both HPC and AI.
S.R.: The world is facing an explosion in the volume of data to be processed, produced both by major scientific instruments (satellites, telescopes, sequencers, etc.) and by supercomputers themselves through numerical simulation. Researchers have reached the limits of what they can do with traditional data analysis methods within competitive timeframes, and new AI-based methods must be deployed.
J. A.: Beyond the accelerated computation the machine provides, it notably opens up scientific questions that we could not imagine or address previously. In the traditional approach, researchers create algorithms on their own computers before deploying them on a larger scale. In AI, access to a supercomputer is an indispensable condition for creating new algorithms. So we're changing how we conceive science.
Which major scientific fields will benefit from this?
J. A.: High-precision medicine will be a central aspect in coming years. For instance, omics technologies (which analyse massive quantities of data from the living world on a molecular scale: genes (genomics), RNA (transcriptomics), proteins (proteomics), etc.) generate a great amount of data, and there are expectations that algorithms will discover new links between different health parameters. One of the goals in this field is to tailor treatments and doses to the profiles of individual patients, making them more effective than general medicine prescribed to all. Other fields concerned include the Humanities and social sciences, cybersecurity, Earth sciences and astronomy, and climate science.
S.R.: It will help increase the resolution (mesh size) for climate forecasting models by transitioning to global models that are accurate down to the kilometre, as opposed to today’s accuracy of twenty kilometres. Models will be more predictive, leading to more refined monitoring of extreme events as they develop, such as cyclones. It also opens the way for major technological progress in the field of new materials, autonomous vehicles, health/biology, and industry more generally, which represents 15% of the projects that use France’s existing supercomputers. Jean Zay will also be useful to the emerging domain of decision-making support: how can these supercomputers help public authorities act following natural risks?
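The computational cost behind the finer climate meshes Requena mentions grows quickly. Assuming a uniform two-dimensional horizontal grid (a simplification: real models are three-dimensional and also shrink their time step), refining from a 20 km to a 1 km mesh multiplies the number of surface cells by 20² = 400.

```python
coarse_km, fine_km = 20.0, 1.0           # horizontal mesh sizes
cell_ratio = (coarse_km / fine_km) ** 2  # cells multiply quadratically in 2D
print(cell_ratio)  # 400.0
```

This quadratic (or worse) scaling is one reason kilometre-scale global models are only within reach of machines of Jean Zay's class.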
Will the Jean Zay supercomputer have to evolve? A machine of this kind has a lifespan of approximately 6 to 7 years, and since the field evolves very quickly, machines must keep pace with these developments in order to remain competitive.
S.R.: Deploying such an infrastructure actually requires considerable technological monitoring, and almost two years of preparation. It’s a fairly complex hardware operation, in which each component has to be validated individually. For Jean Zay, we have a plan for the evolution of the machine’s capacities, with the addition of complementary technologies in the next two years, especially for the part dedicated to numerical simulations. But the same will have to be done on the AI side, with additional funding on this part by public authorities. If we stop in the middle of this machine’s development, we will lose the efforts that have been made, along with all of the advantages of the initial investment.
How is France positioned in relation to the rest of the world in the field of supercomputers?
S.R.: French research has access to three large-scale computing centres.4 The one already housed at Idris has a power of 1.5 petaflops. The new machine will have a computing power 10 times greater (14 petaflops at launch and 17 petaflops in late 2019). This new configuration is indispensable in a context marked by strong European and global competition. All of these technologies—AI, HPC, quantum computers—are part of national strategies, which are now investing massively in converged supercomputers. It’s an endless race for computing power, one that’s especially strong between China and the United States, which is the current global leader with its Summit supercomputer, with a power of over 200 petaflops.
J. A.: This merciless struggle should lead to the deployment in 2021 of what are referred to as “Exascale” machines, which can perform a billion billion operations per second. That’s the equivalent of 60 times Jean Zay. The Americans will install two converged AI+HPC machines in two years, with a capacity of 1,500 petaflops. It’s on another level, but the model chosen by France was adopted long ago by the United States, Japan, and China.
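The "equivalent of 60 times Jean Zay" figure can be checked directly: exascale means 10^18 operations per second, against Jean Zay's 17 petaflops (17 × 10^15) in its late-2019 configuration.

```python
exaflop = 10**18           # one billion billion operations per second
jean_zay = 17 * 10**15     # 17 petaflops
print(exaflop / jean_zay)  # ≈ 58.8, i.e. roughly 60 times
```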
What about Europe?
J. A.: When you look at each country’s AI strategy over the last two years, you see 40 flags that bear witness to the mobilization of European states. Each has its plan, but we have not yet succeeded in transforming all of these flags into a single one for the European Union. Independent of the strategy specific to each country, it is important to establish a “super-strategy,” one that is as comprehensive as possible. For all that, Europe still lacks examples of success to build on. A joint AI project is one potential approach for demonstrating that Europe can have an impact, for no European country can go it alone.
S.R.: When Genci was created, each country understood that a European infrastructure was needed to compete with the major powers in the field (the United States, China, Japan). Prace, a European Genci of sorts, was created in 2010 thanks especially to the contributions of four member states (Germany, France, Italy, and Spain), but without the European financial support needed to sustain this ambition. This changed with the launch, in late 2018, of the EuroHPC European initiative, with a budget of one billion euros for its initial phase. It coordinates R&D efforts and the deployment of the first world-class supercomputers dedicated to research. Three "pre-exascale" machines (approximately 200 to 300 petaflops) will be installed in Spain, Italy, and Finland. In a second phase, the program will support the development of Exascale machines (in 2022/2023), with France and Germany volunteering to host them. Now it is up to us to create a French team that includes the entire ecosystem of supercomputers, drawing from Genci, the CNRS, CEA, universities, Inria, and the Ministry of Research, with a view to convincing EuroHPC to entrust us with this machine.
Can Europe take inspiration from the developments implemented by the global leaders in the field?
J. A.: Overall it’s the power of the state that moves things forward the most quickly. That’s why it’s a “strategic war.” People often think that the US AI business was primarily developed by the digital giants. But mindsets change, and there is a collective awareness of sorts emerging with regard to data-related issues. Europe has always occupied a delicate position between the two giants. But it’s possible to create competitive AI that generates business all while remaining ethical. AI is still a field of research. It’s important to measure the progress that has been made, for instance by state-of-the-art algorithms that know how to distinguish a cat from a dog, or how to play Go. Deploying them in critical fields such as health and cybersecurity still requires a lot of research. In particular, there is an economic and societal model that can be developed with respect to the notions of trust and responsibility. Europe can develop machine learning that increases confidence, work on the transparency of algorithms, and prepare ethical AI, as long as it buckles down to it.
S.R.: One of the key issues, however, is to avoid having the evolution of AI imposed on us, as happened with digital technology. We must mobilise all available resources, human and material alike, including supercomputers, together with an agile circuit of political decision-making, so that we help drive these changes rather than endure them.
What does Europe need to become a major player in this race for computing power?
S.R.: The central issue is to combine European technology with powerful infrastructures dedicated to European scientific applications. Currently the only European producer of Exascale-class supercomputers is the French company Atos-Bull. Securing independence from other computing powers requires a strong and independent European industry. In the confrontation between global leaders, the United States imposes, for instance, processor embargoes on China. Europe is not shielded from such a move.
J. A.: A technological war is being waged at a time when digital technology underpins everything; technologies must be mastered from end to end, from hardware to the creation of algorithms. Otherwise Europe will continue to endure these evolutions rather than help drive them. Our sovereignty is at stake, for depending on other powers also means falling behind and being less competitive in innovation, research, and the economy overall.
Are players other than states involved in this race?
J. A.: The supercomputer race is not just between state powers, but also with the private powers represented by America's GAFAM5 and China's BATX.6 These companies have made advances in AI, and possess computing capacities that public research is far from being able to access. It would be an error with regard to our citizens to allow AI, and all of the societal issues connected to it, to develop in a private context. It is crucially important to remain in the algorithm production race.
S.R.: There are also industrial actors such as Total (with its IBM Pangea III machine, ranked 11th in the latest top 500 from June 2019), EDF, Airbus, Safran, Renault, etc., that invest heavily in supercomputing, in terms of resources for computing and applications. Possessing public infrastructures as well as resources operated by industrial actors, not to mention the field of Cloud computing with a player such as OVH,7 is a genuine advantage for our country.
- 1. Grand équipement national pour le calcul intensif (French very large-scale computing centre).
- 2. The prefix peta signifies 10 to the power of 15, or one million billion, and FLOPS is the acronym for FLoating point Operations Per Second.
- 3. Jamal Atif, who is a professor at l’Université Paris-Dauphine, works at the Laboratoire analyses et modélisation de systèmes pour l’aide à la décision (CNRS/Université Paris-Dauphine), as well as l’Institut des sciences de l’information et de leurs interactions of the CNRS.
- 4. Occigen is hosted at the Centre informatique national de l’enseignement supérieur (Cines) in Montpellier (3.5 petaflops), Joliot-Curie is hosted at the Très Grand Centre de calcul du CEA (TGCC, CEA’s Supercomputing Centre) at Bruyères-le-Châtel (Essonne), and Jean Zay is hosted at Idris.
- 5. Acronym for the Internet giants Google, Apple, Facebook, Amazon, and Microsoft.
- 6. Acronym for the Chinese internet giants from the 2010s, Baidu, Alibaba, Tencent and Xiaomi.
- 7. French company specializing in cloud computing services, founded in 1999 by Octave Klaba, which offers storage and exploitation solutions for data.
After a degree in environmental studies at Paul Sabatier University in Toulouse, then in science journalism at Paris-Diderot University in Paris, Anaïs Culot worked in media relations at the CNRS and now collaborates with various magazines, including CNRS Le Journal, I'MTech and Science & Vie.