Workshop Program

Overview

Driving scientific AI performance on HPC systems with MLPerf benchmarks

Abstract AI methods are powerful tools that promise to dramatically change the way we do science on high-performance computing resources. Adoption of these techniques is growing across many domains, with important use cases including analysis of experimental data, acceleration of expensive simulations, and the control or design of experiments. Meanwhile, the computational costs, particularly for training deep neural network models, are growing dramatically as we adopt increasingly large and complex models, tasks, and datasets.
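To make the training-cost trend concrete, here is a minimal, hypothetical sketch of how a benchmark harness might measure training throughput in samples per second; the model, batch size, and step count are illustrative assumptions, not part of any MLPerf benchmark:

    # Hypothetical sketch: timing a training loop to report throughput,
    # the kind of raw measurement behind time-to-train style metrics.
    import time
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(256, 512), nn.ReLU(), nn.Linear(512, 10))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    batch_size, n_steps = 64, 100
    x = torch.randn(batch_size, 256)          # synthetic input batch
    y = torch.randint(0, 10, (batch_size,))   # synthetic labels

    start = time.perf_counter()
    for _ in range(n_steps):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    elapsed = time.perf_counter() - start
    print(f"throughput: {batch_size * n_steps / elapsed:.1f} samples/s")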

Working with Proxy-Applications: Interesting Findings, Lessons Learned, and Future Directions

Abstract A considerable amount of research and engineering went into designing proxy applications, which represent common high-performance computing workloads, to co-design and evaluate the current generation of supercomputers, e.g., RIKEN's Supercomputer Fugaku, ANL's Aurora, and ORNL's Frontier. These scaled-down versions of important scientific applications have helped researchers around the world identify bottlenecks and performance characteristics, and fine-tune architectures to their needs. We present some of our findings, which will also influence the next generation of HPC systems.
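As a rough illustration of the idea, the sketch below is a hypothetical, drastically scaled-down example in the spirit of a proxy kernel: a STREAM-like triad used to estimate sustained memory bandwidth, one of the bottlenecks proxy applications are designed to expose. The array size is an arbitrary assumption:

    # Hypothetical proxy-style micro-kernel: triad a = b + s*c.
    import time
    import numpy as np

    n = 50_000_000
    b = np.random.rand(n)
    c = np.random.rand(n)
    s = 3.0

    start = time.perf_counter()
    a = b + s * c                      # 2 reads + 1 write per element
    elapsed = time.perf_counter() - start

    bytes_moved = 3 * n * 8            # three 8-byte doubles per element
    print(f"effective bandwidth: {bytes_moved / elapsed / 1e9:.2f} GB/s")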

Lightweight Requirements Engineering for Exascale Co-design

Abstract Given the tremendous cost of an exascale system, its architecture must match the requirements of the applications it is supposed to run as precisely as possible. Conversely, applications must be designed such that building an appropriate system becomes feasible, motivating the idea of co-design. In this process, a fundamental aspect of the application requirements is the rates at which the demands for different resources grow as a code is scaled to a larger machine.
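For example, one lightweight way to estimate such a growth rate is to fit a power law r(p) ≈ c · p^α to measured resource demands; the sketch below uses invented data points for illustration and is not the method presented in the talk:

    # Hypothetical sketch: fitting a resource-demand growth exponent.
    import numpy as np

    procs = np.array([64, 128, 256, 512, 1024])      # process counts
    mem_gb = np.array([1.9, 3.7, 7.6, 15.1, 30.5])   # invented memory demands

    # Linear regression in log-log space: log r = log c + alpha * log p
    alpha, log_c = np.polyfit(np.log(procs), np.log(mem_gb), 1)
    print(f"growth exponent alpha = {alpha:.2f}")    # ~1: demand doubles with scale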

Modeling and Simulating Future Computer Architectures for Data-intensive Applications

Abstract IARPA has initiated a new program, the Advanced Graphic Intelligence Logical Computing Environment (AGILE), to develop innovative, efficient, and scalable computer architecture designs capable of executing next-generation large-scale data-analytic applications. This effort will require flexible, scalable, and detailed simulation of the candidate architecture designs to assess their performance, efficiency, and validity. Data-intensive applications transform massive, unstructured, heterogeneous data streams into actionable knowledge, and today data is increasing exponentially in volume, velocity, variety, and complexity.
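As a rough illustration of what such simulation involves, the sketch below implements a toy discrete-event loop of the kind many architecture simulators build on; the event types and the 120 ns DRAM latency are invented assumptions and are not part of AGILE:

    # Hypothetical sketch: a minimal discrete-event simulation loop.
    # Events are processed in timestamp order from a priority queue.
    import heapq

    DRAM_LATENCY = 120  # ns, an illustrative assumption

    events = [(0, "load", "core0"), (4, "load", "core1")]
    heapq.heapify(events)

    while events:
        t, kind, src = heapq.heappop(events)
        if kind == "load":
            print(f"t={t:4d} ns: {src} issues load")
            # schedule the memory response at a fixed latency
            heapq.heappush(events, (t + DRAM_LATENCY, "response", src))
        else:
            print(f"t={t:4d} ns: DRAM response to {src}")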