Closing the Knowledge-Deficit in the Defense Acquisition System: A Case Study


Luis A. Cortes
Michael J. Harman

19 March 2014

The goal of the STAT T&E COE is to assist in developing rigorous, defensible test strategies to more effectively quantify and characterize system performance and provide information that reduces risk. This and other COE products are available at www.afit.edu/stat.

STAT T&E Center of Excellence
2950 Hobson Way
Wright-Patterson AFB, OH 45433


Abstract

The GAO Report to Congressional Committees, Defense Acquisitions: Assessments of Selected Weapon Programs (GAO-13-294SP), March 2013, offers observations on the performance of the 86 programs that make up DOD's $1.6 trillion Major Defense Acquisition Program portfolio for 2012. The report correlates development cost growth of $271B with a "knowledge deficit" early in each program. That knowledge deficit ripples through the acquisition system, feeding upon itself and leaving decision makers with less and less knowledge to make informed decisions on how and when to proceed to the next acquisition phases, which require commitments of additional funding. The cost growth reduces the funding available for other priorities, including solutions to reduce the knowledge deficit itself. The Scientific Test and Analysis Techniques in Test & Evaluation Center of Excellence (STAT COE) became the DOD resource for enabling a knowledge-based approach for 27 Major Defense Acquisition Programs, of which 16 were flagged in the GAO report. STAT COE helped institutionalize the use of STAT in T&E, an approach that assists programs in transforming test data into knowledge, and knowledge into acquisition-decision-quality information. STAT COE is completing its proof-of-concept phase in September 2014 and faces termination. Even though STAT COE has made substantial contributions to the programs under engagement during the proof-of-concept phase, there is a need to continue providing similar contributions to those programs and to additional programs. This paper outlines STAT COE contributions toward closing the knowledge deficit in the defense acquisition system.

Key words: scientific test and analysis techniques, design of experiments, DOD test and evaluation


Identifying the knowledge-deficit

The Department of Defense (DOD) makes significant investment decisions in the acquisition of weapons systems, defense business systems, national security systems, and joint systems. For instance, the investment for just 86 weapon system programs in 2012 was $1.6T [1]. Because the imperative is to deploy cost-effective, safe, survivable, operationally effective, and operationally suitable systems that meet warfighter needs, every investment decision must be well informed. Figure 1 [2] outlines the wealth of decision-quality information required at key decision points of the defense acquisition system.

Figure 1. Improving milestone process effectiveness. (The figure lays out the Materiel Solution Analysis, Technology Development, Engineering & Manufacturing Development, and Production & Deployment phases, the Milestone A, B, and C information requirements per DoD Instruction 5000.02, and the associated decision reviews, source selections, contract awards, and technical reviews.)

[1] GAO Report to Congressional Committees, Defense Acquisitions: Assessments of Selected Weapon Programs (GAO-13-294SP), March 2013.
[2] Improving Milestone Process Effectiveness, Principal Deputy Under Secretary of Defense (Acquisition, Technology, and Logistics), 23 June 2011, as presented by Mr. Skip Hawthorne, OUSD(AT&L) DPAP/AP, in the DoDI 5000.02 Re-Issuance Brief, 5 Dec 2013.

In 2013, the Government Accountability Office (GAO) [3] correlated development cost growth of $271B across 86 weapon system programs with a "knowledge deficit" early in the programs. This knowledge deficit ripples through the acquisition system, feeding upon itself and leaving decision makers with less and less knowledge to make informed decisions on how and when to proceed to the next acquisition phases, which require commitments of additional funding. The cost growth reduces the funding available for other priorities, including solutions to reduce the knowledge deficit itself. A knowledge deficit means the program is proceeding through the defense acquisition system without sufficient knowledge about its technologies, design, or manufacturing processes, and faces unresolved risks that could lead to cost increases and schedule delays. GAO also correlated the cost growth rate with the maturity of the technologies: 86% for programs that start with immature technologies and about 43% for those that start with nearing-maturity technologies. GAO assessed each program's attainment of product knowledge by scoring eight knowledge-based acquisition practices at three critical points in the acquisition system, as shown in Figure 2: Milestone B, Milestone C, and halfway between the milestones at the transition from Integration to Demonstration.

Figure 2. DOD operation of the acquisition system milestones and GAO knowledge points.

[3] GAO Report to Congressional Committees, Defense Acquisitions: Assessments of Selected Weapon Programs (GAO-13-294SP), March 2013.

The three knowledge points are: (1) resources and requirements match; (2) design stability; and (3) manufacturing process maturity. Figure 3 is an example of the scorecard.

Figure 3. Examples of knowledge scorecards.

The knowledge-based shortfalls include demonstration of critical technologies in relevant or realistic environments and testing of integrated or representative prototypes. GAO stated that the likelihood that a weapon system will be delivered within its estimated cost and schedule is a function of the knowledge the program has attained by each of the three key decision points:

"Positive acquisition outcomes require the use of a knowledge-based approach to product development that demonstrates high levels of knowledge before significant commitments are made."

One of the most effective and efficient means of gaining knowledge about a system and reducing the knowledge gaps in technology, design, and production is Test & Evaluation (T&E):

DOD Directive 5000.01, The Defense Acquisition System [4]:

"Test and evaluation shall be integrated throughout the defense acquisition process. Test and evaluation shall be structured to provide essential information to decision-makers, assess attainment of technical performance parameters, and determine whether systems are operationally effective, suitable, survivable, and safe for intended use. The conduct of test and evaluation, integrated with modeling and simulation, shall facilitate learning, assess technology maturity and interoperability, facilitate integration into fielded forces, and confirm performance against documented capability needs and adversary capabilities as described in the system threat assessment."

At its core, T&E is the disciplined process of subjecting a system to pre-established conditions and then deliberately and methodically changing them to gain knowledge about how the system reacts to those changes. An efficient and effective T&E strategy includes:

(1) well-defined, end-to-end, mission-oriented objectives within the scope of the decisions to be informed;
(2) mission-oriented, traceable measures of capability and readiness;
(3) complete coverage of the test space;
(4) test designs with good properties and structures;
(5) disciplined test protocols;
(6) statistical data analysis and assessment techniques and standardized evaluation criteria; and
(7) clear products to inform decisions.

However, there are instances in which this strategy falls short.

[4] DOD Directive 5000.01, The Defense Acquisition System (20 Nov 07).

Addressing the knowledge-deficit

As early as 1998, the National Research Council concluded that T&E did not take full advantage of the benefits afforded by statistics. Statistics is the mathematical science dealing with the planning, collection, analysis, organization, explanation, and presentation of data; it seeks to investigate and establish relationships between events. Scientific test and analysis techniques (STAT) exploit the power gained by grafting statistical methods onto testing to acquire a deeper knowledge of a system's capabilities and to maximize the utility of the information. DOD Instruction 5000.02, Operation of the Defense Acquisition System [5], recognizes the value of STAT:

DOD Instruction 5000.02, Operation of the Defense Acquisition System:

"Use scientific test and analysis techniques to design an effective and efficient test program that will produce the required data to characterize system behavior across an appropriately selected set of factors and conditions. Ensure that each major developmental test phase or event in the planned test program has a well-defined description of the event, specific objectives, scope, appropriate use of modeling and simulation, and an evaluation methodology. Describe an evaluation methodology in the TEMP starting at Milestone A that will provide essential information on programmatic and technical risks as well as information for major programmatic decisions. Starting at Milestone B, the evaluation methodology will include an evaluation framework to identify key data that will contribute to assessing progress toward achieving: key performance parameters, critical technical parameters, key system attributes, interoperability requirements, cybersecurity requirements, reliability growth, maintainability attributes, developmental test objectives, and others as needed. In addition, the evaluation framework will show the correlation/mapping between test events, key resources, and the decisions supported. The evaluation methodology will support a Milestone B assessment of planning, schedule, and resources and a Milestone C assessment of performance, reliability, interoperability, and cybersecurity."

[5] Interim DOD Instruction 5000.02, Operation of the Defense Acquisition System (25 Nov 13).

STAT are knowledge-based scientific and statistical methods used to enable the development of efficient, rigorous test strategies that yield defensible results. STAT enable better interpretation of test data, which helps inform decision makers of the true state of system capabilities across the entire technical and operational requirements space and the risks associated with the decisions to be made. The use of STAT can help generate test efficiencies and, ultimately, enable fielding a more effective, suitable, and survivable system. STAT encompass methods such as design of experiments (DOE) and reliability growth.

DOE is the cornerstone statistical method embedded in the STAT portfolio. DOE is the systematic integration of well-defined and structured scientific strategies for gathering empirical knowledge about a system or process, using statistical methods for planning, designing, executing, and analyzing an experiment. The core principle is to make purposeful changes to the input variables of a process or system in order to observe and exploit the changes in the output response. DOE adds rigor and discipline to T&E and facilitates a comprehensive understanding of the tradeoffs in the techno-programmatic domains: risk, cost, and utility of information. The use of DOE in T&E allows for:

- Statistically identifying the performance drivers and their interactions.
- Characterizing system performance over the entire battle space.
- Illuminating the tradeoff between the number of tests, the risks, and the amount and quality of the information obtained.
- Developing empirical models that can be useful for tactical decision making and performance assessment.
- Providing a means for optimizing system performance.
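To make the DOE idea concrete, the sketch below builds a 2^3 full-factorial test matrix for three notional factors and estimates main effects and two-factor interactions by ordinary least squares. It is an illustration only, not a STAT COE product: the factor names, the simulated response model, and the run count are assumptions introduced here for the example.

```python
# Minimal DOE sketch: 2^3 full factorial with main effects and
# two-factor interactions, fit by ordinary least squares (numpy only).
# Factors, responses, and effect sizes are notional.
import itertools
import numpy as np

factors = ["altitude", "speed", "countermeasures"]

# Coded design matrix: every combination of low (-1) and high (+1).
runs = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))

# Simulated test responses (e.g., detection range); in practice these
# would come from the actual test events.
rng = np.random.default_rng(1)
y = (50.0
     + 8.0 * runs[:, 0]                      # altitude main effect
     - 3.0 * runs[:, 1]                      # speed main effect
     + 5.0 * runs[:, 0] * runs[:, 2]         # altitude x countermeasures interaction
     + rng.normal(0.0, 1.0, len(runs)))      # test noise

# Model matrix: intercept, main effects, and all two-factor interactions.
pairs = list(itertools.combinations(range(len(factors)), 2))
X = np.column_stack(
    [np.ones(len(runs))]
    + [runs[:, i] for i in range(len(factors))]
    + [runs[:, i] * runs[:, j] for i, j in pairs]
)
names = (["intercept"] + factors
         + [f"{factors[i]}*{factors[j]}" for i, j in pairs])

# Least-squares coefficient estimates (half-effects in coded units).
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, c in sorted(zip(names, coef), key=lambda t: -abs(t[1])):
    print(f"{name:30s} {c:+.2f}")
```

Sorting the estimated coefficients by magnitude is one simple way to surface the dominant performance drivers and interactions described in the list above; a real test design would also weigh statistical power, confounding, and run constraints.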

In 2010, the Director, Operational Test & Evaluation (DOT&E) issued the memorandum "Guidance on the Use of Design of Experiments (DOE) in Operational Test and Evaluation," requesting an increase in the use of scientific and statistical methods in developing rigorous, defensible test plans and in evaluating their results. The memorandum stated the importance of understanding "how much testing is enough" and specifically cited design of experiments as a method that adds substantive test design content and justification.

The challenge issued by DOT&E was answered in 2012 when, in collaboration with DOT&E and the service test components, DASD(DT&E) created the Scientific Test and Analysis Techniques in Test & Evaluation Center of Excellence (STAT COE). STAT COE became the DOD resource to enable a knowledge-based T&E approach for 27 Major Defense Acquisition Programs by helping to illuminate: (1) defensible test strategies to acquire knowledge more effectively; (2) the transformation of knowledge into acquisition-decision-quality information; and (3) the risks associated with the decisions to be made. Sixteen of those 27 programs were flagged in the GAO report and are under STAT COE engagement during the proof-of-concept phase. Only seven of them entered development with technologies fully mature or nearing maturity, and four are among the 10 largest programs. Table 1 shows the scorecard for 11 of the 16 programs under STAT COE engagement.

The implementation plan for STAT COE calls for three phases. The first phase is a three-year proof-of-concept program that will culminate with a transition to sustainment in September 2014. Over the last two years, STAT COE has made substantial contributions to closing the programs' knowledge deficit by:

- Helping institutionalize STAT Working Groups, which allow T&E working-level personnel to engage in practicing STAT.
- Becoming a chartered member of the T&E Working-Level Integrated Process Team (WIPT), which enables mid-level personnel to own the STAT strategy.
- Authoring segments of the Test & Evaluation Master Plan (TEMP), including a DOE appendix, which incorporates STAT into the test strategy.
- Providing specialized DOE training to the programs' T&E personnel.

- Collaborating with Operational Test Agencies on developing comprehensive test designs that span the DT/OT spectrum, which permits the practice of a more coherent STAT program.

Table 1. GAO scorecard for 11 Major Defense Acquisition Programs

Columns: Requirements and Resources Match, Design Stability, Manufacturing Process Maturity (knowledge-based practices A-H; see key below).

Programs: CVN-78; GPS Generation III (GPS III); Integrated Air and Missile Defense; Joint Light Tactical Vehicle; KC-46; LHA(R); Next Generation Operational Control System (OCX); P-8; Ship to Shore Connector; Space Based Infrared System (SBIRS); Space Fence.

Key: knowledge attained / knowledge not attained / not applicable
A - Demonstrate all critical technologies in a relevant environment
B - Demonstrate all critical technologies in a realistic environment
C - Complete preliminary design review
D - Release at least 90% of design drawings / complete three-dimensional product model
E - Test a system-level integrated prototype
F - Demonstrate critical processes are in control
G - Demonstrate critical processes on a pilot production line
H - Test a production-representative prototype
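A scorecard of this kind is straightforward to encode and roll up by knowledge point. The sketch below is hypothetical: the grouping of practices A-H under the three knowledge points is an assumption based on the GAO practice descriptions, and the attainment values are invented for illustration, not the GAO-13-294SP results.

```python
# Hypothetical encoding of a GAO-style knowledge scorecard: eight
# knowledge-based practices (A-H) grouped under three knowledge points.
from dataclasses import dataclass

# Assumed grouping of practices under the three knowledge points.
KNOWLEDGE_POINTS = {
    "Resources and requirements match": ["A", "B", "C"],
    "Design stability": ["D", "E"],
    "Manufacturing process maturity": ["F", "G", "H"],
}

@dataclass
class Scorecard:
    program: str
    attained: dict  # practice letter -> True / False / None (not applicable)

    def summary(self):
        """Fraction of applicable practices attained per knowledge point."""
        out = {}
        for kp, letters in KNOWLEDGE_POINTS.items():
            scores = [self.attained.get(p) for p in letters]
            applicable = [s for s in scores if s is not None]
            out[kp] = sum(applicable) / len(applicable) if applicable else None
        return out

# Example with notional (not actual) attainment values.
example = Scorecard("Notional Program X",
                    {"A": True, "B": False, "C": True,
                     "D": True, "E": False,
                     "F": None, "G": None, "H": None})
print(example.summary())
```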

Figure 4. Defense acquisition system timelines.

Even though STAT COE has made substantial contributions to the programs under engagement during the proof of concept, there is a need to continue providing similar contributions. Figure 4 shows synchronized timelines for three of the Major Defense Acquisition Programs supported by the COE and flagged in the GAO report. Clearly, there are numerous opportunities to improve the outcomes of significant decisions by continuing to foster an approach that demonstrates high levels of knowledge before significant commitments are made. Every investment decision must be well informed. STAT COE is the catalyst for transforming test data into knowledge, and knowledge into decision-quality information.

The STAT COE return on investment

As mentioned previously, program cost growth reduces the funding available for other priorities, including solutions to reduce the knowledge deficit itself. The STAT COE is a solution focused on reducing this knowledge gap. But, in light of an apparent lack of funding, where will future COE funding come from, and what is the expected return on investment? Consider these relative comparisons (the arithmetic is spot-checked in the short sketch at the end of this paper):

- The current COE annual operating budget is $1.5M. Five years of COE operations ($7.5M) amount to only about 0.0027% of the $271B cost growth.
- COE design efficiency has reduced test runs by 15-75%. Even a conservative 1% reduction equates to $2.71B, sufficient to pay for the COE many times over.
- If the COE were to expand to cover all 86 programs (4 programs per expert), operating costs would roughly triple to $23.8M over 5 years, about 0.0088% of the $271B.
- The COE could assign a STAT expert full time to every program. This would cost $107.5M ($250K per expert per year for 5 years), only 0.04% of the $271B.
- Intangible benefits: informed decisions.

The COE focus is rigorous design and quantifying risk, not cost reduction. However, the by-product of rigor is knowledge, and as long as the COE works to close the knowledge deficit, its actions will pay for themselves many times over.

Summary

Integrating STAT into programs is not an option; it is an investment for the DOD that serves to reduce cost growth and close the knowledge gap. Failure to directly address this challenge encourages more of the same cycle of unplanned cost growth, loss of flexibility, and implementation delays.
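As a quick sanity check on the relative comparisons in the return-on-investment section, the short sketch below recomputes the ratios from the figures quoted in the text ($1.5M annual budget, $271B cost growth, $23.8M five-year expansion cost, $250K per expert per year across 86 programs). The figures come from the text; the code is only a worked arithmetic illustration.

```python
# Spot-check of the ROI ratios quoted above (all inputs taken from the text).
cost_growth = 271e9                 # development cost growth across the portfolio ($)

coe_5yr = 1.5e6 * 5                 # five years of COE operations at $1.5M/year
expanded_5yr = 23.8e6               # five-year cost if expanded to all 86 programs
full_time_5yr = 86 * 250e3 * 5      # one dedicated expert per program for 5 years
one_pct_saving = 0.01 * cost_growth # a conservative 1% reduction

print(f"COE (5 yr) / cost growth:        {coe_5yr / cost_growth:.4%}")        # ~0.003%
print(f"Expanded COE / cost growth:      {expanded_5yr / cost_growth:.4%}")   # ~0.009%
print(f"Full-time experts / cost growth: {full_time_5yr / cost_growth:.4%}")  # ~0.04%
print(f"1% test-cost reduction:          ${one_pct_saving / 1e9:.2f}B")       # $2.71B
```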