Why the US and China's brutal supercomputer war matters

Multi-million-dollar projects to eke out an advantage in processing power aren't really about science; they're an exercise in soft power

Thought global arms races are all about ballistic missiles, space or nuclear development? Think again: the new diplomatic frontline is over processing power and computer chips.

A major shift has taken place, with a new claimant to the crown of world’s fastest supercomputer. IBM’s Summit at Oak Ridge National Laboratory in Tennessee uses Power9 CPUs and NVIDIA Tesla V100 GPUs, and its 4,608 servers, backed by ten petabytes of memory, work concurrently to process 200,000 trillion calculations per second – 200 petaflops. That’s a lot of numbers – and here’s one more. Summit’s processing power is 117 petaflops more than that of the previous record-holder, China’s TaihuLight.
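
For a sense of those units: a petaflop is a quadrillion (10^15) floating-point operations per second, so 200,000 trillion calculations per second works out to exactly 200 petaflops. Here is a minimal sketch of that conversion, using only the figures quoted above (the variable names are purely illustrative):

```python
# Illustrative unit conversion only; the figures are the article's published numbers.
TRILLION = 10**12           # 1 trillion operations
PETAFLOP = 10**15           # 1 petaflop = 10^15 floating-point operations per second

summit_ops_per_second = 200_000 * TRILLION   # "200,000 trillion calculations per second"
summit_petaflops = summit_ops_per_second / PETAFLOP

print(f"Summit: {summit_petaflops:.0f} petaflops")   # -> Summit: 200 petaflops
```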

While it may seem significant, it’s actually largely symbolic, says Andrew Jones of the Numerical Algorithms Group, a high-performance computing consultancy. “I put no value on being twice as fast or 20 per cent faster other than bragging rights.”

That’s not to say that supercomputers don't matter. They are “being driven by science”, says Jack Dongarra, a computer science professor at the University of Tennessee and the compiler of the world’s top 500 supercomputer list. And science is driven today by computer simulation, he adds – with high-powered computers crucial to carry out those tests.

Supercomputers can crunch data far faster and more easily than regular computers, making them ideal for handling big data – from cybersecurity to medical informatics to astronomy. “We could quite easily go another four or five orders of magnitude and still find scientific and business reasons to benefit from it,” says Jones.

Oak Ridge, where Summit is housed, is already soliciting bids for a project called Coral II, the successor to the Coral project that produced Summit. Coral II will involve three separate hardware systems, each of which has a price tag of $600 million, says Dongarra. The goal? To build a supercomputer capable of calculating at exaflop speeds – a thousand petaflops, five times faster than Summit.
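
To put the exascale target in context: an exaflop is 10^18 operations per second, or 1,000 petaflops. A quick back-of-the-envelope check against Summit’s 200 petaflops – again a sketch, using only the figures quoted in this article:

```python
# Back-of-the-envelope comparison using the article's figures.
PETAFLOP = 10**15
EXAFLOP = 10**18

summit_flops = 200 * PETAFLOP          # Summit: 200 petaflops
coral_ii_target_flops = 1 * EXAFLOP    # Coral II goal: one exaflop

speedup = coral_ii_target_flops / summit_flops
print(f"Exascale target is {speedup:.0f}x Summit")   # -> Exascale target is 5x Summit
```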

While they are faster and more powerful, supercomputers are actually not much different from the hardware we interact with on a daily basis, says Jones. “The basic components are the same as a standard server,” he says. But because of their scale, and the complexity involved in programming them to process information as a single, co-ordinated unit, supercomputer projects require significant financial outlay to build, and political support to attract that funding.

That political involvement transforms them from a simple computational tool into a way of exercising soft power and stoking intercontinental rivalries.

With Summit, the US has wrested back the title of the world’s most powerful supercomputer for the first time since 2012 – though it still languishes behind China in terms of overall processing power. China is the home of 202 of the 500 most powerful supercomputers, having overtaken the US in November 2017.

“What’s quite striking is that in 2001 there were no Chinese machines that’d be considered a supercomputer, and today they dominate,” explains Dongarra. China’s surge in supercomputing over the past two decades is an indication of significant investment, says Jones. “It’s more a reflection of who’s got their lobbying sorted than anything else,” he adds.

Recently, the Chinese leadership has been drifting away “from an aspirational ‘catch-up with the west’ mentality to aspiring to be world class and to lead,” says Jonathan Sullivan, director of the China Policy Institute at the University of Nottingham. “These achievements like the longest bridge, biggest dam and most powerful supercomputer aren’t just practical solutions, they also have symbolic meaning,” he adds.

Or to put it differently: bragging rights matter enormously to whoever’s on top.

This article was originally published by WIRED UK