LIVERMORE — Suddenly, your smart phone feels really dumb.
Meet Sierra, Lawrence Livermore National Lab’s new supercomputer that can perform 125 quadrillion calculations per second — think 125 followed by 15 zeroes — and will guard our nation’s nuclear stockpile.
To match what Sierra can do in a single second, every person on Earth would have to perform one calculation per second, around the clock, for roughly a year.
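That comparison is easy to sanity-check. A back-of-envelope sketch in Python, assuming a world population of about 7.6 billion (the article does not state the figure it used):

```python
# Sierra's peak rate, per the article: 125 quadrillion calculations per second.
SIERRA_CALCS_PER_SEC = 125e15

# Assumed world population (not given in the article).
POPULATION = 7.6e9

# Seconds of round-the-clock effort, at one calculation per person
# per second, needed to match one second of Sierra's work.
seconds = SIERRA_CALCS_PER_SEC / POPULATION
days = seconds / 86_400  # seconds in a day

print(f"{days:.0f} days")  # prints "190 days"
```

With these round numbers the estimate lands near half a year; the article’s “entire year” presumably reflects coarser rounding or a smaller assumed population.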
Unveiled Friday, the $150 million Sierra gives the United States bragging rights to two of the top three positions in global supercomputing. The new machine ranks behind Oak Ridge National Lab’s Summit and China’s Sunway TaihuLight.
It’s not just powerful; it also has a stunning memory. There’s enough storage space to hold every written work of humanity, in all languages – twice.
“But it is not how big or where it ranks, it’s the science it will support,” said Bronis de Supinski, Livermore Computing’s chief technology officer and head of Livermore Lab’s Advanced Technology Systems.
Sierra was conceived in a hotel room near Chicago’s O’Hare Airport at the end of 2012, in a U.S. Department of Energy collaboration between Livermore, Oak Ridge and Argonne. But over its four years of construction, the project encountered logistical hiccups, technical challenges and one major surprise: the surging cost of memory, tied to global demand for smart phones, which doubled prices in the final three months of 2016, said de Supinski. After negotiations, IBM made changes to its network to compensate, keeping the project on budget.
Despite our advances, the National Security Agency and the Department of Energy have warned that China is poised to outrank America in high-performance computing.
Built by IBM and NVIDIA, Sierra is designed to support the nation’s three nuclear security laboratories: Lawrence Livermore, Sandia National Laboratories and Los Alamos National Laboratory.
And that support is critical. As North Korea continues to pursue nuclear-weapon technologies, the U.S. stockpile is aging. Not even a bicycle is engineered to sit inactive for decades and still spring into action at a moment’s notice. But that’s what is expected of a nuclear weapon.
For example, how would a hairline crack affect the life of a nuclear warhead? Without detonation, Sierra helps us find out. It can process the data needed to create a 3D picture, modeling and simulating a growing fracture in the deadly device.
“It enables simulations 100,000 times more realistic than is possible on a desktop,” said Fred Streitz, director of the Lab’s Institute for Scientific Computing Research.
Bruce Hendrickson, associate director of computation at the Lab, said the project has been a long time coming. He said the new supercomputer will open new scientific possibilities for any project — massive or minuscule.
“Talking about this as a computer is like saying the roof of the Sistine Chapel is covered with paint,” said Hendrickson. “A better analogy would be the Hubble Space Telescope.”
But Sierra is less lovely than its name, with flashing green lights instead of romantic peaks. Cooling fans emit a dull roar from its black refrigerator-sized units, packed into a windowless 6,000 square feet of floor space — the size of two tennis courts — inside a nondescript tan building.
The machine weighs as much as 40 elephants, so the floor beneath it had to be structurally reinforced. It’s also seismically protected, sitting on plates that shift in response to earthquakes. It consumes 12 million watts of power, equivalent to about 9,000 homes — and runs so hot that it must be cooled not just by fans but by 3,500 gallons of circulating water every minute.
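The household equivalence is internally consistent. A quick check, assuming an average U.S. household draw of roughly 1.2 kilowatts (a figure not in the article):

```python
# Sierra's power draw and the article's household comparison.
sierra_watts = 12e6  # 12 million watts
homes = 9_000

# Implied average continuous draw per home.
watts_per_home = sierra_watts / homes
print(f"{watts_per_home:.0f} W per home")  # prints "1333 W per home"
```

About 1.3 kilowatts per home — close to the typical average household draw — so the comparison holds up.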
Sierra’s room sits behind strict security: first the Lab’s notoriously rigorous clearance process, then a secret code punched into the door. Friday’s wide-eyed visitors were politely herded as a group and could only gaze from yards away.
Supercomputing has come a long way since the Kansas City National Security Campus made history in 1964 by installing a brand new disk drive that could store 95,000 punch cards worth of data, or about 7.6 megabytes.
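The punch-card conversion checks out, assuming the standard 80-column card at one byte per column (an assumption the article doesn’t spell out):

```python
# 1964 disk drive capacity, expressed in punch cards.
CARDS = 95_000
BYTES_PER_CARD = 80  # one byte per column on a standard 80-column card

total_mb = CARDS * BYTES_PER_CARD / 1e6
print(f"{total_mb:.1f} MB")  # prints "7.6 MB"
```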
Sierra is powerful because it is a so-called “heterogeneous” supercomputer: its central processing units (CPUs) are paired with graphics processing units (GPUs), with a high-speed connection shuttling data between them.
Supercomputers may seem superfluous in this age of cloud computing and massive data centers. But the toughest computational problems require the speed of giant machines — and the nation’s leaders aren’t willing to trust our most sensitive data to private commercial businesses, no matter how much they promise to keep things safe, said de Supinski.
Supercomputers can also aid science, medicine, energy and climate research. Their simulations allow scientists to pursue research in genetics, cell structure and atmospheric fluctuations that was previously impractical or impossible.
For all its grandeur, Sierra is short-lived, with a top-performing lifespan of five or six years, said de Supinski.
Eventually, it too will seem ploddingly slow, as the next frontier of computers — called “exascale” machines — will exceed a quintillion — a billion-billion — calculations each second.