Traditional supercomputers focused on performing calculations at blazing speeds have fallen behind when it comes to sifting through huge amounts of “Big Data.” That is why IBM has redesigned the next generation of supercomputers to minimize the need to shuffle data between the memory that stores the data and the processors that do the computing work. The tech giant recently earned US $325 million in federal contracts to build the world’s most powerful supercomputers for two U.S. government labs by 2017.
The first of the two new supercomputers, called “Summit,” will help researchers crunch the data needed to simulate a nuclear reactor and model climate change at the U.S. Department of Energy’s Oak Ridge National Laboratory in Tennessee. The second supercomputer, named “Sierra,” will provide the predictive power necessary for researchers at Lawrence Livermore National Laboratory in California to conduct virtual testing of the aging nuclear weapons in the U.S. stockpile. Both supercomputers rely upon what IBM says is a “data centric” architecture that spreads computing power throughout the system to crunch data at many different points, rather than moving data back and forth within the system. But the company did not detail how it would accomplish this.