Supercomputing in an Era of Big Data and Big Collaboration
Supercomputing has reached a level of maturity and capability where many areas of science and engineering are not only advancing rapidly because of computing power but cannot progress without it. Detailed simulations of complex astrophysical phenomena, HIV, earthquake events, and industrial engineering processes are yielding major scientific breakthroughs and new products that cannot be achieved any other way. These simulations typically require ever-larger teams, increasingly complex software environments to support them, and real-world data. At the same time, experiments and observation systems are generating unprecedented amounts of data that must also be analyzed via large-scale computation and compared with simulation, so a new type of highly integrated environment is needed in which computing, experiment, and data services are developed together. I will illustrate these trends with examples from NCSA’s Blue Waters supercomputer and from major data-intensive projects, including the Large Synoptic Survey Telescope, and offer thoughts on what will be needed going forward.
Bio
Edward Seidel is the director of the National Center for Supercomputing Applications, a distinguished researcher in high-performance computing and in relativity and astrophysics, a Founder Professor in the University of Illinois Department of Physics, and a professor in the Department of Astronomy. His previous leadership roles include serving as senior vice president for research and innovation at the Skolkovo Institute of Science and Technology in Moscow, directing the Office of Cyberinfrastructure and serving as assistant director for Mathematical and Physical Sciences at the U.S. National Science Foundation, and leading the Center for Computation & Technology at Louisiana State University. His research has been recognized with a number of awards, including the 2006 IEEE Sidney Fernbach Award, the 2001 Gordon Bell Prize, and the 1998 Heinz Billing Award.
Scalability Limits for Scientific Simulation
Current high-performance computing platforms feature millions of processing units, and it is anticipated that exascale architectures with billion-way concurrency will be in place in the early 2020s. The extreme levels of parallelism in these architectures influence many design choices in the development of next-generation algorithms and software for scientific simulation. This talk explores some of the challenges faced by the scientific computing community in the post-frequency-scaling era. To set the stage, we first describe our experiences in the development of scalable codes for computational fluid dynamics that have been deployed on over a million processors. We then explore fundamental computational complexity considerations that are technology drivers for the future of PDE-based simulation. We present performance data from leading-edge platforms over the past three decades and couple this with communication and work models to predict the performance of domain decomposition methods on model exascale architectures. We identify the key performance bottlenecks and expected performance limits at these scales and note a particular need for design considerations that will support strong scaling in the future.
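To give a flavor of the communication and work models mentioned above, the following is a minimal sketch, not taken from the talk: a textbook latency-bandwidth (alpha-beta) model for a 3-D domain decomposition with nearest-neighbor halo exchanges and a global reduction. All constants (t_flop, alpha, beta) and the cubic-subdomain assumption are illustrative placeholders, not measured values from Nek5000 or any specific machine.

import math

def time_per_step(n_global, p,
                  t_flop=1e-10,   # assumed seconds per grid-point update
                  alpha=1e-6,     # assumed message latency (s)
                  beta=1e-9):     # assumed seconds per word transferred
    """Estimated time per solver step on p ranks for a 3-D domain of
    n_global grid points, split into roughly cubic subdomains."""
    n_local = n_global / p                    # points per rank
    work = t_flop * n_local                   # local computation
    face = n_local ** (2.0 / 3.0)             # words on one subdomain face
    halo = 6 * (alpha + beta * face)          # six-face halo exchange
    allreduce = alpha * math.log2(max(p, 2))  # global dot-product reduction
    return work + halo + allreduce

# Strong scaling: fixed problem size, increasing rank count. Past some p,
# the latency terms dominate and speedup stalls.
n = 10**9
t1 = time_per_step(n, 1)
for p in [1, 1024, 32768, 1048576]:
    t = time_per_step(n, p)
    print(f"p = {p:>8}  time/step = {t:.3e} s  speedup = {t1 / t:,.0f}")

Even this crude model reproduces the qualitative bottleneck the talk addresses: the work term shrinks like 1/p, while the latency terms do not shrink at all, which is why strong scaling at exascale demands design attention to latency and global operations.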
Bio
Paul Fischer is a Blue Waters Professor at the University of Illinois at Urbana-Champaign in the departments of Computer Science and Mechanical Science & Engineering. He received his Ph.D. in mechanical engineering from MIT and was a postdoc in applied mathematics at Caltech, where he was the first Center for Research on Parallel Computation fellow. His work is in the area of high-order numerical methods for partial differential equations and scalable linear solvers. He is the architect of the open-source fluid dynamics/heat transfer code Nek5000, which is based on the spectral element method. Nek5000 has scaled beyond a million ranks and has been awarded the Gordon Bell Prize in high-performance computing. It is used by more than 200 researchers for a variety of applications in turbulent and transitional flows.