Communications of the ACM

ACM TechNews

Challenges to Exascale Computing

MIT visiting lecturer Irving Wladawsky-Berger

"Massively parallel architectures . . . have dominated supercomputing over the past twenty years. But they will not get us to exascale," says Irving Wladawsky-Berger, visiting lecturer at MIT's Sloan School.

Credit: HICSS

Former IBM researcher and visiting lecturer at the Massachusetts Institute of Technology Irving Wladawsky-Berger writes that supercomputing and the information technology industry will need to undergo a major technology and architectural transition in order to reap the benefits of exascale computing. Wladawsky-Berger cites the U.S. Defense Advanced Research Projects Agency's recognition of four key technology challenges through its ExaScale Computing Study. Those challenges encompass energy and power, memory and storage, concurrency and locality, and resiliency.

One of the most persuasive arguments for exascale computing is that it could bring predictive science to a tipping point, with a potentially huge impact on massively complex problems. Dealing with such problems, which contain innate uncertainties and unpredictability, entails concurrently running multiple copies of the same application using numerous distinct combinations of parameters. Areas that stand to benefit from this new style of predictive modeling include nuclear reactor design, climate studies, economics, medicine, government, and business, says Wladawsky-Berger.
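The ensemble approach described above can be sketched in a few lines. This is a minimal illustration, not anything from the article: the model function, its parameters, and the toy "prediction" are all hypothetical, and a thread pool stands in for what would be massive node-level parallelism on a real machine.

```python
# Sketch of ensemble-style predictive modeling: run copies of the same
# model concurrently over every distinct combination of parameters.
# run_model and its parameters are hypothetical stand-ins.
from itertools import product
from concurrent.futures import ThreadPoolExecutor

def run_model(params):
    # Hypothetical stand-in for one simulation run; returns the
    # input parameters alongside a toy "prediction" value.
    growth, noise = params
    return {"growth": growth, "noise": noise, "prediction": growth * (1 - noise)}

def ensemble(growth_rates, noise_levels):
    # The Cartesian product yields every distinct parameter combination;
    # the pool runs the model copies concurrently. On a supercomputer
    # this role is played by thousands of nodes, not a thread pool.
    combos = list(product(growth_rates, noise_levels))
    with ThreadPoolExecutor() as pool:
        return list(pool.map(run_model, combos))

if __name__ == "__main__":
    results = ensemble([0.01, 0.02, 0.03], [0.0, 0.1])
    print(len(results))  # one result per parameter combination
```

The spread of predictions across the ensemble, rather than any single run, is what quantifies the uncertainty in such problems.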

From International Science Grid This Week
Abstracts Copyright © 2010 Information Inc., Bethesda, Maryland, USA
