DARPATech, DARPA's 25th Systems and Technology Symposium
August 8, 2007, Anaheim, California
Teleprompter Script for Dr. Chuck Morefield, Deputy Director, Information Processing Technology Office
Extreme Computing

» CHUCK MOREFIELD: In 1956 the early thinkers in artificial intelligence, including Oliver Selfridge, Marvin Minsky, and others, met at Dartmouth. This influential group made a large bet, turning their backs on analog cybernetics and jumping on the digital train in earnest as they started a long quest for human levels of machine reasoning. From our vantage point 50 years later, it appears they placed a pretty good bet, since not many years from now small and cheap digital systems will have the memory and processing capacity of a human brain. A few decades after that, we will likely find the processing capacity of all of humanity contained in a single computer.

IPTO is following a number of threads as we wend our way up this steep ramp in raw performance. We like to refer to our thoughts in this area as Extreme Computing. As I give you a glimpse of what we mean by Extreme Computing, I also want to solicit your help. This will be a bit like an Easter egg hunt: my job is to place some eggs along the path, and your job is to find them. Don't worry, it won't be that hard! Along the way, we will touch on productive computing, Moore's law, the memory wall, novel computing architectures, AI, and software complexity. As you find things of interest, please check in with Bill Harrod, our manager for Extreme Computing, and others of us from IPTO to discuss your technical thoughts.

As Charlie mentioned, DARPA is vitally interested in the hardware aspects of computing performance. DARPA's largest foundational hardware program is High-Productivity Computing Systems (HPCS for short). HPCS leverages switched architecture, linking many homogeneous multicore components into a single integrated computing bundle. The goal of this program is to bring online usable general-purpose systems by the year 2010. To ensure that these high-end systems meet critical national security needs, the HPCS program has established a strong collaboration with key government user agencies. Nonetheless, an important requirement for HPCS is to achieve commercial viability. Custom small-production stovepipes were not allowed, and current performers have received a lot of encouragement to develop scalable systems that serve a spectrum of commercial markets, markets that range from the Ito calculus of computational finance to the vast compute farms of Web 2.0.

HPCS is now in its final phase, which will culminate in prototype petascale systems that will undergo significant acceptance testing by stressing applications. If successful, the vendors will begin selling their machines to the DoD and in commercial marketplaces. IPTO is also looking beyond petascale, toward the exascale horizon. The HPCS systems almost in the grasp of the user community will not scale directly to the next level.

Several technical roadblocks stand in our way, including power, size, and especially programmability. So moving from petascale to exascale requires some more innovation. At exascale, we are focused less on immediate commercial or military transition, and much more on enabling technology, particularly in the areas of power and programmability.

As Moore's paradigm hits the power wall, we can no longer continue to turn to increasing clock rates for more cycles. Our sister office MTO is looking for solutions to this problem at the chip level. One way around the power wall can be found in variable-precision arithmetic units: components that require the application itself to select run-time precision. We are also looking at techniques that kick off computation before all data is received, and at stochastic processing that relaxes constraints on precision.
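To make application-selected precision concrete, here is a minimal sketch; the interface, the two-level precision enum, and the accumulate routine are illustrative assumptions, not part of any IPTO or MTO program.

```c
#include <stddef.h>

/* Hypothetical precision knob exposed to the application. */
typedef enum { PRECISION_LOW, PRECISION_HIGH } precision_t;

/* Sum an array at whichever precision the caller requests.  In a real
 * variable-precision unit the hardware would change operand width; here
 * the choice between float and double accumulation simply stands in for
 * that trade of accuracy against power and cycles. */
double accumulate(const float *x, size_t n, precision_t p)
{
    if (p == PRECISION_LOW) {
        float s = 0.0f;                 /* cheaper, lower-power path */
        for (size_t i = 0; i < n; i++)
            s += x[i];
        return (double)s;
    }
    double s = 0.0;                     /* higher-fidelity path */
    for (size_t i = 0; i < n; i++)
        s += x[i];
    return s;
}
```

The point of the sketch is only that the application, not the toolchain, decides how much precision each kernel actually needs.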

Meanwhile, data movement is not keeping up with processing speed, so we are also hitting a memory wall. It will be important to find new techniques that make access to data rapid and transparent, and that minimize data movement within and without the chip.
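As a reminder of why this matters, here is a minimal sketch of loop tiling, a long-standing technique for reusing data while it is still on chip; the block size and the routine itself are illustrative assumptions, not anything proposed in the talk.

```c
/* Classic cache blocking for C += A * B on n x n row-major matrices.
 * Each BLOCK x BLOCK tile is reused while it sits in cache, so every
 * element crosses the memory wall far fewer times than in the naive
 * triple loop. */
#define BLOCK 64   /* tunable; an assumed, machine-dependent value */

void matmul_tiled(int n, const double *A, const double *B, double *C)
{
    for (int ii = 0; ii < n; ii += BLOCK)
        for (int kk = 0; kk < n; kk += BLOCK)
            for (int jj = 0; jj < n; jj += BLOCK)
                for (int i = ii; i < ii + BLOCK && i < n; i++)
                    for (int k = kk; k < kk + BLOCK && k < n; k++) {
                        double a = A[i * n + k];   /* reused across the j loop */
                        for (int j = jj; j < jj + BLOCK && j < n; j++)
                            C[i * n + j] += a * B[k * n + j];
                    }
}
```

Today this kind of reuse is hand-coded by expert developers; the goal described above is to make it rapid and transparent.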

IPTO continues to explore novel microprocessor architectures. Our most recent program in this area is called Polymorphous Computing Architectures, PCA for short. This program aims at deployed devices with very limited power and space, for example embedded in the webbing of a soldier on patrol. PCA architectures can morph in milliseconds, their architectural resources changing configuration at run time, adapting the hardware to the mission as it unfolds. Our goal is high performance in small packages that give users in the field immediate local access to powerful computing. These fractional forms of extreme computing will be in great demand as technology such as IPTO's handheld translators moves into wide-scale use.

Beyond the current PCA program, revolutionary processing architectures and approaches are sought that address key exascale hardware issues. Candidate technologies that will be considered include, for example, novel approaches to input/output and storage, or non-von Neumann processing architectures that greatly increase key metrics such as FLOPS per watt.

Of course, increased raw power constantly entices us to build more complex software. The power users that initially motivated the DARPA petascale program are concerned with domains like weather forecasting, biomolecular modeling, or the simulation of large physical platforms. But other power users are emerging, many of whom are concerned with the architecture of the brain. As Dan Oblinger will tell you momentarily, we want to explore AI programming applied from the bottom up, in addition to building good old-fashioned expert systems from the top down. We are particularly interested in modeling the sub-symbolic "instruction set" of the brain. Adding multiple layers of learning to these low-level models, we can move progressively upward to complex abilities such as prediction and planning for robotics or command and control. Success here would provide alternate approaches to perception, reasoning, and language that complement those of IPTO's symbolic AI work under Dave Gunning. All of this work consumes increasing amounts of computing power.

Even more computationally intense modeling problems lie just ahead. The world is watching as we face an elusive foe in concurrent, multifaceted conflicts. As retired General Barry McCaffrey recently noted, for these conflicts the political and economic struggle for power has become the actual field of battle. For this reason, as you will hear later in Sean's talk, we want culturally sensitive models of large asymmetric nets of individual actors and the macroeconomic and political forces through which they interact. These models will involve huge numbers of reasoning agents, scaling up Dan's individual AI models by many orders of magnitude.

Beneath all these complex applications, whether big physics or models of the human terrain, sit very large multicore bundles of computing power. The evolution of the ARPAnet has ensured that large numbers of these multicore systems will be attached to wide-area nets. And today's commercial technology already allows each core of a netted multicore system to itself contain multiple virtual machines (which are, of course, netted themselves!). As we move along this continuum, our computing fabric has already started to exhibit the classic features of mathematical complexity.

Managing this complexity is a key issue for IPTO. For lack of a more exciting word, we often use the word productivity to describe our work in managing complexity. Complexity (or productivity, if you will) is the major bottleneck to exploiting the full capacity of machines already in use. For future machines, it will become an even more dominant issue. Removing the bottlenecks associated with complexity will require new scalable and adaptive software tools with enough internal intelligence to hide vast quantities of complex software development dog work beneath their surface.

Extreme Computing must face head-on the problem of how to program this vast increase in computing power. We need to reduce the cost, time, and especially the expertise required to build software for large multicores and specialized computing devices. It is no exaggeration to say that new software advances are key to our children's future, given the economic and military competition we face, and given the likely size of our skilled programmer base. Extreme Computing must focus on cognitive support to software developers, as our systems evolve toward rich combinations of processing elements from the chip to the system level. Today, using these tremendous architectural riches requires highly sophisticated developers. But even sophisticated developers face challenges, and underachieve on such systems, which as a consequence have become increasingly expensive to end users.

This calls for us to re-think our development technology, for example with new adaptive compilers. Today, production-quality compilers are expensive and are unique to their target platform. They are usually designed to encompass a huge portfolio of applications and system resources, and as a result impose needless overhead on development. This prevents developers from extracting all the available cycles from the system, and requires extensive developer expertise to ensure run-time modules of varying provenance can coexist in the same universe without crashing. IPTO is interested in your thoughts on new compilers such as these, compilers that transparently adapt to the underlying platform and hide complexity from less-sophisticated developers.
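To illustrate the kind of platform adaptation that developers currently hand-roll, here is a minimal sketch using the x86 CPU-feature builtins available in GCC and Clang; the kernel names and the placeholder "tuned" path are assumptions for illustration, not anything from the talk.

```c
#include <stddef.h>

/* Portable baseline kernel: y += a * x. */
static void saxpy_scalar(size_t n, float a, const float *x, float *y)
{
    for (size_t i = 0; i < n; i++)
        y[i] += a * x[i];
}

/* Stand-in for a kernel tuned to a wider vector unit; in a real library
 * this would be a separately compiled, AVX2-specialized version. */
static void saxpy_tuned(size_t n, float a, const float *x, float *y)
{
    saxpy_scalar(n, a, x, y);
}

/* Hand-written dispatch on the underlying platform.  An adaptive
 * compiler would generate and select such variants transparently,
 * instead of leaving the choice to an expert developer. */
void saxpy(size_t n, float a, const float *x, float *y)
{
    if (__builtin_cpu_supports("avx2"))
        saxpy_tuned(n, a, x, y);
    else
        saxpy_scalar(n, a, x, y);
}
```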

So, as petascale computing technology becomes a reality with HPCS, IPTO has reached another critical crossroads, and we want to kick it up a notch with Extreme Computing. Thanks for your attention to our basket of Easter eggs. We want your best ideas as we ramp up performance and deal with software complexity. I've given you a top-down view of some IPTO thoughts, but we need your fresh ideas... from the bottom up... to start the next revolution. We need your creativity, and we want to hear from your best minds. Please: don't be shy.

Now I would like to introduce Dan Oblinger, who will speak to you about IPTO's programs in Learning and Reasoning.