CUG 2013 – Invited Speakers

The Cray User Group welcomes Horst D. Simon, Deputy Laboratory Director, Lawrence Berkeley National Laboratory, as the keynote speaker on Tuesday, May 7, 2013.

“Why we need Exascale, and why we won’t get there by 2020.”

It may come as a surprise to many who are currently deeply engaged in the research and development activities that could lead us to exascale computing that it has already been exactly six years since the first community town hall meetings were convened in the U.S. to discuss the challenges of the next level of computing in science. It was in April and May 2007 that three meetings were held in Berkeley, Argonne, and Oak Ridge, forming the basis for the first comprehensive look at exascale [1].

What is even more surprising is that, in spite of the numerous national and international initiatives created in the last five years, the community has not made any significant progress toward the goal of an exaflops system. Looking back at early projections, for example in 2010, it still seemed possible to build at least a prototype of an exascale computer by 2020, a view expressed in documents such as [2] and [3]. I believe that the lack of progress in the intervening years has made it all but impossible to see a working exaflops system by 2020. Specifically, I do not expect a working exaflops system to appear in the #1 spot of the TOP500 list, with an Rmax performance exceeding 1 exaflop/s, by November 2019. In this talk I will explain why this lack of progress is regrettable and what the major barriers are.
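To put the November 2019 target in perspective, here is a back-of-envelope sketch (our illustration; the starting figure is Titan’s roughly 17.6 petaflop/s Rmax from the November 2012 TOP500 list, not a number from the abstract):

```python
# Sustained annual growth the TOP500 #1 system would need to reach
# 1 exaflop/s by November 2019, starting from Titan's ~17.6 petaflop/s
# Rmax on the November 2012 list (assumed starting point).

start_pflops = 17.6      # Titan, #1 on the Nov 2012 TOP500 (Rmax)
target_pflops = 1_000.0  # 1 exaflop/s = 1,000 petaflop/s
years = 7                # Nov 2012 -> Nov 2019

required_annual_growth = (target_pflops / start_pflops) ** (1 / years)
print(f"~{required_annual_growth:.2f}x per year")  # ~1.78x, every year, for 7 years
```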

References

1. Simon, H., Zacharia, T., and Stevens, R.: Modeling and Simulation at the Exascale for Energy and Environment. Berkeley, Oak Ridge, Argonne (2007). http://science.energy.gov/ascr/news-and-resources/program-documents/
2. Stevens, R. and White, A.: Crosscutting Technologies for Computing at Exaflops. San Diego (2009). http://science.energy.gov/ascr/news-and-resources/workshops-and-conferences/grand-challenges/
3. Shalf, J., Dosanjh, S., and Morrison, J.: Exascale Computing Technology Challenges. VECPAR 2010: 1–25.

Bio

Horst Simon, an internationally recognized expert in computer science and applied mathematics, was named Berkeley Lab’s Deputy Director on September 13, 2010. Simon joined the Lab in early 1996 as director of the newly formed National Energy Research Scientific Computing Center (NERSC), and was one of the key architects in establishing NERSC at its new location in Berkeley. Under his leadership NERSC enabled important discoveries for research in fields ranging from global climate modeling to astrophysics. Simon was also the founding director of Berkeley Lab’s Computational Research Division, which conducts applied research and development in computer science, computational science, and applied mathematics.

In his prior role as Associate Lab Director for Computing Sciences, Simon helped to establish Berkeley Lab as a world leader in providing supercomputing resources to support research across a wide spectrum of scientific disciplines. He has also served as an adjunct professor in the College of Engineering at the University of California, Berkeley, working to bring the Lab and the campus closer together and developing a designated graduate emphasis in computational science and engineering. In addition, he has worked with project managers from the Department of Energy, the National Institutes of Health, the Department of Defense and other agencies, helping researchers define their project requirements and solve technical challenges.

Simon’s research interests are in the development of sparse matrix algorithms, algorithms for large-scale eigenvalue problems, and domain decomposition algorithms for unstructured domains for parallel processing. His algorithm research efforts were honored with the 1988 and 2009 Gordon Bell Prizes for parallel processing research. He was also a member of the NASA team that developed the NAS Parallel Benchmarks, a widely used standard for evaluating the performance of massively parallel systems. He is co-editor of the twice-yearly TOP500 list, which tracks the most powerful supercomputers worldwide, as well as related architecture and technology trends.

He holds an undergraduate degree in mathematics from the Technische Universität Berlin in Germany and a Ph.D. in mathematics from the University of California, Berkeley.

For more information about Horst Simon, visit his website at http://www.lbl.gov/Publications/Deputy-Director/bio.html.


Wednesday, May 8th – Julian Borrill, Senior Staff Scientist, Computational Cosmology Center, Berkeley Lab & Senior Research Physicist, Space Sciences Laboratory, UC Berkeley

“Big Bang, Big Data, Big Iron – Analyzing Data From the Planck Satellite Mission”

On March 21st, 2013, the European Space Agency announced the first cosmology results from its billion-dollar Planck satellite mission. The culmination of 20 years of work, Planck’s observations of the Cosmic Microwave Background – the faint echo of the Big Bang itself – provide profound insights into the foundations of cosmology and fundamental physics.

Planck has been making 10,000 observations of the CMB every second since mid-2009, providing a dataset of unprecedented richness and precision; however, the analysis of these data is an equally unprecedented computational challenge. For the last decade we have been developing the high performance computing tools needed to make this analysis tractable, and deploying them on Cray systems at supercomputing centers in the US and Europe.
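As a rough sense of scale (a sketch assuming continuous scanning at the quoted rate from mid-2009 through early 2013; the mission duration here is an approximation, not a figure from the abstract):

```python
# Back-of-envelope estimate of Planck's cumulative CMB sample count.
# Assumption: continuous scanning at the quoted 10,000 samples/second
# for roughly 3.75 years (mid-2009 to early 2013).

SAMPLES_PER_SECOND = 10_000
SECONDS_PER_YEAR = 365.25 * 24 * 3600
YEARS_OF_SCANNING = 3.75  # approximate

total_samples = SAMPLES_PER_SECOND * SECONDS_PER_YEAR * YEARS_OF_SCANNING
print(f"~{total_samples:.2e} samples")  # ~1.18e+12, about a trillion observations
```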

This first Planck data release required tens of millions of CPU-hours on the National Energy Research Scientific Computing Center’s Hopper system. This included generating the largest Monte Carlo simulation set ever fielded in support of a CMB experiment, comprising 1,000 realizations of the mission reduced to 250,000 maps of the Planck sky. However, our work is far from done; future Planck data releases will require ten times as many simulations, and next-generation CMB experiments will gather up to a thousand times as much data as Planck.
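Taken at face value, the quoted Monte Carlo totals imply about 250 maps per realization, which sets the scale for the ten-fold larger simulation sets mentioned above (the per-realization figure is our inference from the two totals, not a number from the abstract):

```python
# Illustrative arithmetic from the quoted Monte Carlo totals.
# Inferred: 250,000 maps / 1,000 realizations = 250 maps per realization.

realizations = 1_000
total_maps = 250_000
maps_per_realization = total_maps // realizations  # 250

# Future releases are quoted as needing ten times as many simulations;
# assuming the same per-realization reduction (our assumption):
future_realizations = 10 * realizations
future_maps = future_realizations * maps_per_realization
print(maps_per_realization, future_realizations, future_maps)  # 250 10000 2500000
```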

Bio

Julian Borrill is a Senior Staff Scientist in the Computational Research Division at Berkeley Lab and a Senior Research Physicist at the Space Sciences Laboratory at UC Berkeley. In 2007 he co-founded the Computational Cosmology Center – a joint enterprise of Berkeley Lab’s Physics and Computational Research Divisions.

His work is focused on developing and deploying the high performance computing tools needed to analyze the massive datasets being gathered by cosmology experiments, and in particular from observations of the Cosmic Microwave Background – the faint echo of the Big Bang itself.

He is a member of the international Planck satellite collaboration, computational systems architect for the US Planck team, manager of the CMB community’s HPC resources at the DOE’s National Energy Research Scientific Computing Center, and executive committee member for High Energy Physics on the NERSC User Group.

Prior to moving to Berkeley in 1997, he held postdoctoral positions at Dartmouth College in New Hampshire and at Imperial College, London. He has an M.A. in Mathematics and Political Science from the University of Cambridge, an M.Sc. in Astrophysics from the University of London, an M.Sc. in Information Technology, also from the University of London, and a D.Phil. in Theoretical Physics from the University of Sussex.

For more information about Julian Borrill, visit his website at http://crd.lbl.gov/borrill.