David Keyes
Abstract: An oft-quoted motivation for extreme computing is to tackle critical energy and environmental simulations, improving prediction by relaxing the decoupling between physical models, improving the fidelity of the physics, increasing the resolution, and quantifying the uncertainty. Meeting these objectives will indeed justify the daunting development, acquisition, and operation costs of the new hardware. New hardware is, however, only part of the challenge, and carries perhaps neither the highest risk nor the highest payoff. Much mathematics and software appear to be missing if the hardware is to be used near its potential, since our existing scientific computing code base has been assembled under pressure to squeeze out as many floating-point operations as possible and to improve the execution rate of those that remain. Instead, for reasons of energy and acquisition cost, algorithms must now focus on squeezing out synchronizations, memory, and memory transfers, including copies. For instance, the promise of multiphysics simulation will not be realized in extreme-scale computational environments through the primary manner in which codes are coupled today: divide-and-conquer operator splitting, in which entire groups of interdependent fields are processed sequentially, each waiting for the completion of the previous and each being copied back for reading by the next after being processed. High concurrency and the power-efficient design of individual cores put opposing pressures on algorithms, requiring both more data locality and more freedom to redistribute data and computation. Besides porting current algorithmic functionality to emerging architectures, new functionality needs to be developed to exploit the possibilities of the merger of the third and fourth paradigms, large-scale simulation and large-scale data, in areas such as data assimilation, inversion, optimization and design, sensitivity analysis, and uncertainty quantification. After decades of algorithmic refinement during a period of programming-model stability, new programming models and new capabilities must now be developed simultaneously. We extrapolate current trends and speculate on new directions.
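As a minimal sketch of the operator-splitting cost mentioned above (assuming, for illustration only, two linear, time-independent physics operators A and B acting on a shared state u; the abstract does not specify a particular splitting scheme), one step of first-order Lie splitting compares to the fully coupled advance of u_t = (A + B)u as follows:

% One step of sequential (Lie) operator splitting versus the fully coupled solve.
% A and B are the two physics operators; \Delta t is the time step.
\begin{align*}
\text{coupled:} \quad u^{n+1} &= e^{\Delta t\,(A+B)}\, u^{n},\\
\text{split:}   \quad u^{n+1} &= e^{\Delta t\,B}\, e^{\Delta t\,A}\, u^{n},\\
\text{error:}   \quad e^{\Delta t\,(A+B)} - e^{\Delta t\,B}\, e^{\Delta t\,A}
  &= \tfrac{\Delta t^{2}}{2}\,[A,B] + O(\Delta t^{3}),
  \qquad [A,B] := AB - BA.
\end{align*}
% The splitting error vanishes only when A and B commute; each split substep is
% also a global synchronization point, with the updated field copied out and
% read back in by the next solver.

The O(Δt²) commutator term is the price of decoupling, and each substep forms exactly the kind of synchronization-and-copy barrier the abstract argues extreme-scale algorithms must squeeze out.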
Biography: David Keyes is the inaugural Dean of Computer, Electrical and Mathematical Sciences and Engineering at KAUST and an adjunct professor of Applied Mathematics at Columbia University. Keyes graduated in Aerospace and Mechanical Sciences from Princeton and earned a doctorate in Applied Mathematics from Harvard. With backgrounds in engineering, applied mathematics, and computer science, he works at the algorithmic interface between parallel computing and the numerical analysis of partial differential equations. For his algorithmic influence in simulation, Keyes has been named a Fellow of the Society for Industrial and Applied Mathematics (SIAM) and of the American Mathematical Society (AMS). The IEEE Computer Society gave him the Sidney Fernbach Award in 2007, and he was a co-winner of ACM’s Gordon Bell Prize in 1999. Editor or co-author of many U.S. federal agency reports on computational science and engineering, Keyes received the SIAM Prize for Distinguished Service to the Profession in 2011.