What organisation – company, university, research institute, city, or state – could afford to waste 90% of its resources? You don’t have to be a Minister of the Economy to know the answer: none. And yet organisations are wasting their data, even though it represents vast potential for value creation. No less than 90% of the value of data remains untapped, because it never undergoes any processing at all.

From the automotive industry to medicine, data is everywhere

It is thanks to the use and modeling of data that iconic platforms like Uber and Airbnb have emerged. It is also thanks to this better exploitation of data that “traditional” professions are gradually reinventing themselves: the automotive industry is now devoted to the autonomous car, medicine is counting on ultra-personalised molecules to treat individuals, and meteorology will soon be able to predict patches of fog near an airport.

But the Big Data revolution is still far from realising its full potential. First, because the field is still maturing, it is rare for organisations to have incorporated data analysts or chief digital officers into their ranks to process their data and extract its value. Second, because of a lack of computing power, it is difficult to process in real time the veritable data deluge generated by our activities each day (prescriptive analytics).

More data than grains of sand on the planet

The exponential increase in data is attributable in particular to smart objects, whose numbers are set to explode, reaching the symbolic threshold of 50 billion connected objects around the world by 2020. By then, 40 zettabytes – that is, 40 thousand billion billion bytes of usable data – will be generated. More than the number of grains of sand on the Earth!
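Those orders of magnitude can be sanity-checked with a few lines of arithmetic (a quick sketch; the byte counts are the article’s round projections, not measurements):

```python
# Order-of-magnitude check of the figures above (round numbers from the text).
zettabyte = 10**21            # one zettabyte in bytes (SI prefix zetta = 10^21)
data_2020 = 40 * zettabyte    # the 40 ZB projected for 2020

# "40 thousand billion billion" = 40 * 10^3 * 10^9 * 10^9
assert data_2020 == 40 * 10**3 * 10**9 * 10**9

print(f"{data_2020:.1e} bytes")  # 4.0e+22 bytes
```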

And not only do we need to count, identify and isolate these grains of sand, we also have to be able to link them to each other at any given moment. To achieve this, we need exceptionally powerful supercomputers – the domain known as high-performance computing (HPC).

Atos/Bull sequana: one billion billion operations per second

Today, a petaflop supercomputer – that is, a computer capable of processing a million billion operations per second – represents the equivalent of 140,000 PCs. The Bull sequana, launched by Atos last month, is being installed at the CEA (the French Alternative Energies and Atomic Energy Commission) and, by 2020, will be as powerful as 20 million PCs. Indeed, it will soon be capable of an exaflop, or a billion billion operations per second!
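The jump described here is a factor of 1,000 in raw operations per second, while the article’s PC-equivalent figures work out to roughly a factor of 150. A quick sketch of the arithmetic, using the text’s round numbers:

```python
# Round figures from the text.
petaflop = 10**15   # operations per second: a million billion
exaflop  = 10**18   # operations per second: a billion billion

pcs_per_petaflop = 140_000      # today's petaflop machine in PC-equivalents
pcs_for_sequana  = 20_000_000   # Bull sequana's target by 2020

print(exaflop // petaflop)                        # 1000  (raw flops ratio)
print(round(pcs_for_sequana / pcs_per_petaflop))  # 143   (PC-equivalent ratio, ~150)
```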

While multiplying the computing capacity by 150 is one challenge, handling the required energy consumption is quite another. At current consumption rates, such a machine would devour 400 megawatts – roughly the consumption of a town of 60,000 inhabitants, or the output of an entire unit of a nuclear power station.
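To put that 400-megawatt figure in perspective, here is a rough sketch of the energy it implies over a year (assuming a continuous draw; the town and power-station comparisons are the article’s):

```python
# Rough energy arithmetic for a continuous 400 MW draw.
power_mw = 400
hours_per_year = 24 * 365

energy_gwh_per_year = power_mw * hours_per_year / 1000  # MWh -> GWh
print(f"{energy_gwh_per_year:,.0f} GWh per year")       # 3,504 GWh per year
```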

A pocket supercomputer?

That is why, in building the world’s most efficient supercomputer, the Atos teams reduced its power consumption by a factor of 20 and its volume by a factor of 10. Where a machine of this class previously needed an entire floor of a building, a Bull sequana now takes up no more space than a cupboard – so much so that, when I presented our supercomputer to the Minister of the Economy, Emmanuel Macron, some playful photographers tried to get him to pose inside the machine!

Complete miniaturization remains a challenge for a machine that, even today, weighs several tons. There is a long way to go before we get to that stage – but don’t forget that the smartphone in your pocket is more powerful than the computer that sent man to the moon. And while physical limits are certainly slowing the pace set by Moore’s law, at Atos we are already working on the quantum computer for the post-2030 era.

Europe in the global HPC race

It is this constant thirst for innovation that motivates us and allows Europe to be fully engaged in the global race for computing power alongside China, Japan, and the U.S. – President Barack Obama has likewise challenged American scientists to build an exaflop supercomputer. Beyond questions of sovereignty, data today goes hand in hand with issues of economic growth, investment, the relocation of activity near production, research, and consumption sites, employment, and above all education.

All reasons that argue for the creation of a powerful ecosystem and a new program of European excellence – a position I put to the President of the European Commission, Jean-Claude Juncker, who has just emphasized the importance of high-performance computing for the digitization of European industry. This is why, now more than ever, I encourage young people to commit firmly to studying science and technology, and more generally to get interested in science!