IESP AND APPLICATIONS. IESP BOF, SC09, Portland, Oregon. Paul Messina. November 18, 2009

Outline
- Scientific Challenges workshops
- Applications involvement in IESP workshops
- Applications role in IESP

Purpose of DOE workshops
To identify grand challenge scientific problems in [research area] that can exploit computing at extreme scales to bring about dramatic progress toward their resolution. The goals of the workshops are to:
- identify grand challenge scientific problems [ ] that could be aided by computing at the extreme scale over the next decade;
- identify associated specifics of how and why new high performance computing capability will address issues at the frontiers of [ ]; and
- provide a forum for exchange of ideas among application scientists, computer scientists, and applied mathematicians to maximize the use of extreme scale computing for enabling advances and discovery in [ ].

Science Workshop Series
- Climate, November 6-7, 2008
- HEP, December 9-11, 2008
- Nuclear Physics, January 26-28, 2009
- Fusion Energy Sciences, March 18-20, 2009
- Nuclear Energy, May 11-12, 2009
- BES, August 13-15, 2009
- Biology, August 17-19, 2009
- NNSA, October 6-8, 2009

Process used
- Workshops were organized jointly by the US DOE Office of Advanced Scientific Computing Research and other DOE program offices.
- Workshop chair(s) worked with the relevant DOE program offices and colleagues to identify key areas to cover.
- Four to six panels were defined, and panel co-chairs recruited.
- White papers for each panel were drafted and posted in advance of the workshop.
- Priority Research Directions (PRDs) were identified by each panel.
- Panels were populated by domain science experts as well as mathematicians and computer scientists, including some international participants.
- Observers from other agencies and from the math and CS community, including some international observers, were invited to each workshop.

Priority Research Direction (use one slide for each)
- Scientific and computational challenges: a brief overview of the underlying scientific and computational challenges.
- Summary of research direction: what will you do to address the challenges?
- Potential scientific impact: what new scientific discoveries will result? What new methods and techniques will be developed?
- Potential impact on SCIENCE DOMAIN: how will this impact key open issues in SCIENCE DOMAIN? What is the timescale in which that impact may be felt?

Climate PRDs for Model Development and Integrated Assessment
- How do the carbon, methane, and nitrogen cycles interact with climate change?
- How will local and regional water, ice, and clouds change with global warming?
- How will the distribution of weather events, particularly extreme events, that determine regional climate change with global warming?
- What are the future sea level and ocean circulation changes?

Climate PRDs for Algorithms and Computational Environment
- Develop numerical algorithms that efficiently use upcoming petascale and exascale architectures.
- Form an international consortium for parallel input/output, metadata, analysis, and modeling tools for regional and decadal multimodel ensembles.
- Develop multicore and deep-memory languages to support parallel software infrastructure.
- Train scientists in the use of high-performance computers.

Exa-scale Computational Resources (slide courtesy Martin Savage, June 28, 2009)
Meeting structured around present Nuclear Physics areas of effort:
- Nuclear Astrophysics
- Cold QCD and Nuclear Forces
- Nuclear Structure and Reactions
- Hot and Dense QCD
- Accelerator Physics
Exa-scale computing is REQUIRED to accomplish the Nuclear Physics mission in each area. Staging to Exa-flops is crucial: 1 Pflop-yr to 10 Pflop-yrs to 100 Pflop-yrs to 1 Exa-flop-yr (sustained).
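
To put the staging units in concrete terms, a "Pflop-yr" is one petaflop/s sustained for a full year. A minimal sketch of the conversion; the arithmetic is illustrative, not taken from the slide:

```python
# Total operations implied by the slide's staging units.
# A "Pflop-yr" = 1e15 flop/s sustained for one year (illustrative arithmetic).
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 s

def total_ops(sustained_flops, years=1.0):
    """Floating-point operations delivered at a sustained rate over `years`."""
    return sustained_flops * years * SECONDS_PER_YEAR

for label, rate in [("1 Pflop-yr", 1e15), ("10 Pflop-yrs", 1e16),
                    ("100 Pflop-yrs", 1e17), ("1 Exa-flop-yr", 1e18)]:
    print(f"{label}: ~{total_ops(rate):.1e} operations")
# 1 Pflop-yr ~ 3.2e22 operations; 1 Exa-flop-yr ~ 3.2e25 operations.
```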

Nuclear Energy materials modeling
Applications of high-performance (peta-scale and exa-scale) computing carry both the burden and the opportunity of improved uncertainty evaluations, margin quantification, and reliable prediction of materials behavior. Exascale computing will enable simulations of trillions of atoms over seconds or days, as well as simulations of the complex, coupled physics and chemistry of reactor materials.
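
A rough cost model shows why trillion-atom molecular dynamics is an exascale problem. Every constant below is an assumption chosen for illustration, not a figure from the slide:

```python
# Back-of-envelope wall time for trillion-atom molecular dynamics (MD).
# All constants here are assumptions for the sketch.
N_ATOMS        = 1e12    # "trillions of atoms"
FLOPS_PER_ATOM = 1e3     # assumed force-evaluation cost per atom per step
TIMESTEP_S     = 1e-15   # typical MD timestep (~1 fs)
SUSTAINED      = 1e18    # assumed sustained exaflop/s

def wall_time_s(simulated_s):
    steps = simulated_s / TIMESTEP_S
    return steps * N_ATOMS * FLOPS_PER_ATOM / SUSTAINED

print(f"1 ns simulated: ~{wall_time_s(1e-9):.0e} s of wall time")   # ~1e3 s
print(f"1 us simulated: ~{wall_time_s(1e-6):.0e} s of wall time")   # ~1e6 s (~12 days)
```

Under these assumptions even a simulated microsecond of brute-force MD takes days at an exaflop, which suggests that reaching "seconds or days" of physical time also depends on accelerated, coarse-grained, or coupled multiscale methods rather than raw atomistics alone.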

Recurring Topics in the Workshops -- Applications
- Exascale is needed: the science case is clear.
- Predictive simulations: develop experimentally validated, predictive capabilities that can produce accurate and robust simulations.
- Ensemble simulations for V&V and UQ.
- Multiphysics simulations.
- Data volumes will be huge, for both observations and simulation results.

Recurring Topics in the Workshops -- CS & Applied Mathematics
- Multiphysics and multiscale algorithm and software coupling
- Algorithms and software that deal with millions of cores and even more threads
- Data handling
- Interoperability
- Workflow issues
- Fault tolerance

For more information: http://extremecomputing.labworks.org/

Role of Applications in IESP
- Identify applications needs in algorithms, software tools, and programming frameworks.
- Work with selected applications to co-design the software environment and architectures (but not as part of IESP).

Applications considered (not comprehensive, not ordered)
Climate change, meteorology, materials science, biology, plasma physics/fusion, geophysics, fluid dynamics, structural mechanics, electromagnetics, aerodynamics, combustion, lattice quantum chromodynamics, biophysics, astronomy/cosmology, molecular dynamics, video processing, chemistry, nuclear engineering/fission, epidemiology, nanotechnology/microelectronics, and emergent sciences (e.g., social sciences, networks).

Applications serve as Co-Design Vehicles
- Technology drivers: advanced architectures with greater capability but with formidable software development challenges.
- Alternative R&D strategies: choosing architectural platform(s) capable of addressing the PRDs of Co-Design Vehicles on the path to exploiting Exascale.
- Recommended research agenda: an effective collaborative alliance between Co-Design Vehicles, CS, and Applied Math, with an associated strong V&V effort.
- Crosscutting considerations: identifying possible common areas of software development need among the Apps that serve as co-design vehicles; addressing the common need to attract, train, and assimilate young talent into this general research arena.

4.3.1 Co-design Vehicles: Priority Research Directions
Criteria for consideration: (1) demonstrated need for Exascale; (2) significant scientific impact in basic physics, environment, engineering, life sciences, or materials; (3) a realistic, productive pathway (over 10 years) to exploitation of Exascale.
Summary of barriers & gaps: what will co-design vehicles do to address the barriers and gaps in the associated Priority Research Directions (PRDs)?
Potential impact on software: what new software capabilities will result? What new methods and tools will be developed?
Potential impact on user community (usability, capability, etc.): how will this realistically impact the research advances targeted by co-design vehicles that may benefit from exascale systems? What is the timescale in which that impact may be felt?

Computational challenges at the exascale: Model Complexity
Cloud feedback is the largest source of uncertainty in climate sensitivity estimates. A Cloud Resolving Model (CRM) replaces the conventional convective and stratiform cloud parameterizations. A Global Cloud Resolving Model (GCRM, the integration of a Global Circulation Model and a CRM) represents a global atmospheric circulation model with a grid-cell spacing of approximately 3 km, capable of simulating the circulations associated with large convective clouds. Its major limitation is its high computational cost; exascale architectures provide a solution to that issue.
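
A quick count of grid cells shows why a ~3 km global grid is so expensive. The 3 km spacing comes from the slide; the vertical level count below is an assumption:

```python
import math

# Approximate size of a 3 km global cloud-resolving grid.
EARTH_RADIUS_KM = 6371.0
SPACING_KM      = 3.0      # from the slide
LEVELS          = 100      # assumed vertical levels

surface_km2 = 4 * math.pi * EARTH_RADIUS_KM**2     # ~5.1e8 km^2
columns     = surface_km2 / SPACING_KM**2          # ~5.7e7 columns
cells       = columns * LEVELS                     # ~5.7e9 grid cells
print(f"columns ~{columns:.1e}, cells ~{cells:.1e}")
```

Halving the horizontal spacing roughly quadruples the cell count and, because the stable timestep shrinks with the grid, multiplies total cost by about eight, which is why cloud-resolving resolution pushes toward exascale.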

Industrial challenges in the Oil & Gas industry: Depth Imaging roadmap
[Chart: algorithmic complexity vs. corresponding computing power (HPC Power PAU, TF), 1995-2020, on a log scale from 0.1 to 1000 TF (10^15 flops). The progression runs from asymptotic approximation imaging and paraxial isotropic/anisotropic imaging (~0.1 TF), through isotropic/anisotropic RTM and modeling (~0.5 TF), isotropic/anisotropic FWI with elastic modeling/RTM at 3-18 Hz (~56 TF), elastic FWI with visco-elastic modeling at 3-35 Hz (~900 TF), to visco-elastic FWI and petro-elastic inversion at 3-55 Hz (~9.5 PF). Figures are sustained performance for the given frequency content over an 8-day processing duration.]
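
A common rule of thumb for 3D wave-equation methods is that cost grows roughly as the fourth power of the maximum frequency (grid points scale as f^3, timestep count as f). A sketch checking the roadmap's figures against that rule; the exponent model, and pairing each performance figure with the top of its frequency band, are assumptions:

```python
import math

# Implied cost-vs-frequency exponent between successive roadmap points.
# (max frequency in Hz, sustained flop/s) -- the pairing is an assumption.
points = [(18.0, 56e12), (35.0, 900e12), (55.0, 9.5e15)]

for (f1, c1), (f2, c2) in zip(points, points[1:]):
    exponent = math.log(c2 / c1) / math.log(f2 / f1)
    print(f"{f1:.0f} Hz -> {f2:.0f} Hz: implied exponent ~{exponent:.1f}")
# ~4.2 and ~5.2: broadly consistent with f^4 growth, with the excess
# attributable to the jump in physics (acoustic -> elastic -> visco-elastic).
```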

High Performance Computing Capacity (courtesy AIRBUS France)
[Chart, 1980-2030: available computational capacity (1 Giga to 1 Zeta flop/s) plotted against the number of overnight load cases run (10^2 to 10^6), with CFD fidelity as the enabler (low-speed RANS, high-speed RANS, unsteady RANS, LES). Capability milestones achieved during one night batch: HS design; CFD-based loads & HQ; aero optimisation & CFD-CSM; full MDO; CFD-based noise simulation; real-time CFD-based in-flight simulation. Smart use of HPC power: algorithms, data mining, knowledge.]
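
The chart's two axes are linked by simple throughput arithmetic: the number of cases that fit in one night scales with sustained machine speed divided by per-case cost. A sketch with assumed, illustrative numbers (the chart itself gives no per-case costs):

```python
# Overnight-batch throughput for the Airbus chart's axes (illustrative).
OVERNIGHT_S = 8 * 3600                  # assumed one-night batch window

def cases_per_night(sustained_flops, flops_per_case):
    return sustained_flops * OVERNIGHT_S / flops_per_case

RANS_CASE = 1e17                        # assumed cost of one steady RANS case
print(f"1 Pflop/s: ~{cases_per_night(1e15, RANS_CASE):.0f} cases/night")   # ~288
print(f"1 Eflop/s: ~{cases_per_night(1e18, RANS_CASE):.0f} cases/night")   # ~288,000
```

Moving up a fidelity level (RANS to unsteady RANS to LES) raises the per-case cost by orders of magnitude, which is consistent with each new capability on the chart waiting for the next large jump in capacity.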

Computational Challenges and Needs for Academic and Industrial Applications Communities
Consecutive thermal fatigue event: computations with an LES approach to turbulence modelling and a mesh refined near the wall made it possible to better understand the wall thermal loading in an injection; knowing the root causes of the event, a new design could be defined to avoid the problem.
Fuel-assembly flows (from part of a fuel assembly up to the whole reactor vessel), for which no experimental approach exists so far: simulation will enable the study of side effects implied by the flow around neighbouring fuel assemblies, and a better understanding of vibration phenomena and wear-out of the rods.
Computations at smaller and smaller scales in larger and larger geometries bring a better understanding of physical phenomena, more effective help for decision making, and better optimisation of production (margin benefits).

Year        2003                2006               2007               2010             2015
Case        thermal fatigue     part of a fuel     3 grid             9 fuel           whole reactor
            event               assembly           assemblies         assemblies       vessel
Mesh        10^6 cells          10^7 cells         10^8 cells         10^9 cells       10^10 cells
Operations  3x10^13             6x10^14            10^16              3x10^17          5x10^18
Computer    Fujitsu VPP 5000    cluster, IBM       IBM Blue Gene/L,   600 Tflops       10 Pflops
            (1 of 4 vector      Power5, 400        20 Tflops during   during 1 month   during 1 month
            processors),        processors,        1 month
            2-month run         9-day run
Storage     ~1 Gb               ~15 Gb             ~200 Gb            ~1 Tb            ~10 Tb
Memory      2 Gb                25 Gb              250 Gb             2.5 Tb           25 Tb

Limiting factors across the roadmap: power of the computer; pre-processing not parallelized; mesh generation; scalability of the solver; visualisation.
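
The table's own mesh and operation counts show that work grows faster than the mesh. A small sketch of that arithmetic, using only the numbers above:

```python
# Work per mesh cell implied by the roadmap table (its own numbers).
roadmap = {2003: (1e6, 3e13), 2006: (1e7, 6e14), 2007: (1e8, 1e16),
           2010: (1e9, 3e17), 2015: (1e10, 5e18)}   # year: (cells, operations)

for year, (cells, ops) in roadmap.items():
    print(f"{year}: {ops / cells:.1e} operations per cell")
# Grows from ~3e7 (2003) to ~5e8 (2015): each refinement multiplies not just
# the cell count but also the work per cell (more timesteps, more coupled
# physics), which is why the operation count outpaces the mesh size.
```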

From sequences to structures: HPC Roadmap (Grand Challenge GENCI/CCRT; Proteins 69 (2007) 415)
Goal: identification of protein structure and function, via computations using increasingly sophisticated bio-informatics and physical modelling approaches.
- 2009: identify all protein sequences using public resources and metagenomics data, and systematically model the proteins belonging to the family (Modeller software). 1 family; 5x10^3 CPUs for ~1 week; ~25 Gb of storage; 500 Gb of memory.
- 2011: improve the prediction of protein structure by coupling new bio-informatics algorithms with massive molecular dynamics simulation approaches. 1 family; 5x10^4 CPUs for ~1 week; ~5 Tb of storage; 5 Tb of memory.
- 2015 and beyond: systematic identification of the biological partners of proteins. 1 family; ~10^4 x KP CPUs for ~1 week; ~5 x CSP Tb of storage; 5 x CSP Tb of memory. (CSP: proteins structurally characterized, ~10^4.)
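
The roadmap's resource figures translate directly into CPU-hours and storage totals. A small sketch using the slide's own numbers; the week-to-hours conversion and plugging CSP ~ 10^4 into the 2015 storage formula are the only assumptions:

```python
# CPU-hour and storage arithmetic for the protein roadmap above.
HOURS_PER_WEEK = 7 * 24
CSP = 1e4          # structurally characterized proteins (order given on slide)

for year, cpus in [(2009, 5e3), (2011, 5e4)]:
    print(f"{year}: ~{cpus * HOURS_PER_WEEK:.1e} CPU-hours per family")
# 2009 ~8.4e5 CPU-hours; 2011 ~8.4e6 CPU-hours per family.

# 2015+: storage and memory scale with CSP, i.e. 5 * CSP Tb.
print(f"2015+ storage/memory: ~{5 * CSP:.0e} Tb (~50 Pb)")
```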

Summary
- Many application domains will benefit from usable exascale systems.
- IESP is involving representatives from a number of those application areas.
- Some applications teams are eager to serve as co-design vehicles.