Archive for the ‘Exascale’ Category



China’s Milky Way 2 supercomputer was recently declared the fastest supercomputer in the world by industry scorekeeper Top500, the latest move in the increasingly international race for high-performance computing supremacy. Late last month, CI Senior Fellow Rick Stevens appeared on Science Friday, alongside Top500 editor Horst Simon, to talk about why that competition matters, and what the global push for faster computation will do for medicine, engineering and other sciences.

“These top supercomputers are like time machines,” Stevens said. “They give us access to a capability that won’t be broadly available for five to ten years. So whoever has the time machine is able to do experiments, able to see into the future deeper and more clearly than those that don’t have such machines.”

The same time machine metaphor was picked up by the University of Chicago’s profile of Mira, our local Top500 competitor, which was bumped down to #5 by the Milky Way 2’s top ranking. But there’s no shame in fifth-best, when fifth-best can run 10 quadrillion calculations per second — the equivalent computing power of 58 million iPads. CI Senior Fellow Gregory Voth is quoted about how access to such a world-class resource helps both today’s and tomorrow’s scientists.

“Having access to a computing resource like Mira provides excellent opportunities and experience for educating up-and-coming young scientists as it forces them to think about how to properly utilize such a grand resource very early in their careers,” Voth says. “This gives them a unique perspective on how to solve challenging scientific problems and puts them in an excellent position to utilize computing hardware being imagined now for tomorrow.”
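As a sanity check on the iPad comparison above, the implied per-device rate follows from simple division. The 10-quadrillion and 58-million figures come from the article; the per-iPad number is just their quotient, not a spec quoted anywhere in the post:

```python
# Back-of-the-envelope check of the Mira-to-iPad comparison (figures from the article)
MIRA_RATE = 10e15   # 10 quadrillion calculations per second (~10 petaflops)
N_IPADS = 58e6      # 58 million iPads

per_ipad = MIRA_RATE / N_IPADS
print(f"Implied rate per iPad: {per_ipad:.2e} calculations/s")  # ~1.7e8, a few hundred megaflops
```

That works out to a plausible order of magnitude for a circa-2013 tablet, which is all a comparison like this is meant to convey.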


The Data Science for Social Good fellowship has reached the halfway point, and the website is starting to fill up with interesting content about the projects. Some fellows have already produced tools for the community to use, such as Paul Meinshausen’s interactive tree map of the City of Chicago’s Data Portal. Instead of a cold, no-frills list of the datasets available for download by the public, Meinshausen’s map uses color and shape to guide users quickly to the data they are seeking and to make rapid comparisons of each dataset’s size. The visualization was popular enough that programmers in Boston and San Francisco quickly applied his code to their own cities’ data portals, while another developer built a common map for every city that uses Socrata software to share its data.


Read Full Post »

The exascale — one million trillion calculations per second — is the next landmark in the perpetual race for computing power. Although this speed is roughly 30 times faster than the world’s current leading supercomputer and many technical challenges remain, experts predict that the exascale will likely be reached by 2020. But while the United States is used to being the frontrunner in high-performance computing achievement, this leg of the race will feature intense competition from Japan, China and Europe. In order to pass the exascale barrier first and reap the application rewards in energy, medicine and engineering research, government funding is critical.
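The metric prefixes make the gap concrete. A quick sketch, using figures from the public June 2013 Top500 list (Milky Way 2’s measured 33.86-petaflop Linpack result is an assumption drawn from that list, not from this post):

```python
# Where the exascale sits relative to the June 2013 Top500 leaders (assumed figures)
EXAFLOP = 1e18          # one million trillion calculations per second
MILKY_WAY_2 = 33.86e15  # Milky Way 2 (Tianhe-2) measured Linpack, June 2013 list
MIRA = 10e15            # Mira, ~10 petaflops

print(f"Exascale vs. Milky Way 2: {EXAFLOP / MILKY_WAY_2:.0f}x faster")  # ~30x
print(f"Exascale vs. Mira:        {EXAFLOP / MIRA:.0f}x faster")         # 100x
```

Even a 30-fold jump over the current leader is an enormous engineering gap, which is why the 2020 target carries so many caveats about power, memory and resilience.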

On Capitol Hill yesterday, CI Senior Fellow Rick Stevens testified to this urgency as part of a Congressional Subcommittee on Energy hearing, “America’s Next Generation Supercomputer: The Exascale Challenge.” The hearing was related to the American High-End Computing Leadership Act [pdf], a bill proposed by Rep. Randy Hultgren of Illinois to improve the HPC research program of the Department of Energy and make a renewed push for exascale research in the United States. You can watch archived video of the hearing here, and Stevens’ prepared opening statement is reproduced in full below.


Thank you Chairman Lummis, Ranking Member Swalwell, and Members of the Subcommittee. I appreciate this opportunity to talk to you about the future of high performance computing research and development, and about the importance of U.S. leadership in the development and deployment of Exascale computing.

I am Rick Stevens, the Associate Laboratory Director responsible for Computing, Environment, and Life Sciences research at Argonne National Laboratory. My laboratory operates one of the two Leadership Class computing systems for DOE’s Office of Science. My own research focuses on finding new ways to increase the impact of computation on science – from the development of new more powerful computer systems to the creation of large-scale applications for computational genomics targeting research in energy, the environment and infectious disease. I also am a Professor at the University of Chicago in the Department of Computer Science, where I hold senior fellow appointments in the University’s Computation Institute and the Institute for Genomics and Systems Biology.

I believe that advancing American leadership in high-performance computing is vital to our national interest. High-performance computing is a critical technology for the nation. It is the underlying foundation for advanced modeling and simulation and big data applications.


Read Full Post »