M&S Requirements and VV&A: What's the Relationship?


M&S Requirements and VV&A: What's the Relationship? Dr. James Elele - NAVAIR; David Hall, Mark Davis, David Turner, Allie Farid, Dr. John Madry - SURVICE Engineering

Outline
Verification, Validation and Accreditation (VV&A) activities improve upon and refine Model and Simulation (M&S) requirements:
- How should M&S requirements be developed?
- Why aren't they developed properly?
- How does VV&A relate to M&S requirements?
- What are some examples of VV&A improving M&S requirements?
- Conclusions and recommendations

How Should M&S Requirements Be Developed?
The DOD M&S Book of Knowledge says to identify, validate and scope the requirement (needs assessment, technical review, specific use), but it provides little practical help.
IEEE standards: IEEE Std 830-1998, Recommended Practice for Software Requirements Specifications.
Basic tenet: the need to establish agreement between customers and suppliers.
"It's better to have usable and easily updated documentation than a static referenceable report."*
* Robert Japenga, How to Write a Software Requirements Specification

Why Are M&S Requirements Not Developed Properly?
Our experience in DOD: many DOD M&S are widely reused and have evolved from older M&S, but may have been drastically changed in the process, and their requirements are not well defined, updated or documented.
Example: an Effective Time-on-Station (ETOS) code had the same name as a previous code that had been used in the past, but had almost no code in common with it, let alone any documented requirements!
Outside of DOD, similar issues arise due to emergent requirements:*
- Requirements change throughout the life of, and over multiple uses of, an M&S.
- It's impossible to imagine the perfect design in every detail ahead of time.
- Defining requirements is a process of ongoing discussion throughout the course of development.
- User stories are important: who are the users, what do they want to do, and why?
* P. J. Srivastava, Thoughts on Project Management, Leadership & Life

Examples of Flawed Requirements*
- Titanic (1912): not enough lifeboats; rivets failed
- Edsel (1958): looked like it was sucking on a lemon
- Apollo 13 (1970): voltage spec not verified
- IBM PCjr (1983): people have fat fingers
- Challenger (1986): faulty design; launched in the cold (outside requirements)
- Mars Orbiter (1999): failures in all three areas below; mismatched measurement units sub-to-prime; insufficient testing; mismatched models satellite-to-ground
Each example exhibited a failure in one or more of:
- RD = Requirements Development
- VER = Requirements Verification: proving each requirement has been satisfied
- VAL = Requirements Validation, ensuring that the set of requirements is correct, complete and consistent; that you can create a model that satisfies the requirements; and that you can test it
* Bahill & Henderson, Requirements Development, Verification and Validation Exhibited in Famous Failures

WHAT DO WE MEAN BY VV&A?
DOD definitions:
- VERIFICATION: the process of determining that a model implementation and its associated data accurately represent the developer's conceptual description and specifications. (Does the model do what the originator intended, and is it relatively error free?) Did we build the model right?
- VALIDATION: the process of determining the degree to which a model and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model. (Do model results match real-world data well enough for your needs?) Did we build the right model?
- ACCREDITATION: the official certification [determination] that a model, simulation, or federation of models and simulations and its associated data are acceptable for use for a specific purpose. (Does the accreditation authority have adequate evidence to be confident that a model and its input data are credible and suitable for a particular use?) Is it fit for this purpose?
General software development community definitions:*
- VERIFICATION: ensuring that the computer program of the computerized model and its implementation are correct.
- VALIDATION: substantiation that a model within its domain of applicability possesses a satisfactory range of accuracy consistent with the intended application of the model.
* R. Sargent, Verification and Validation of Simulation Models, Journal of Simulation, 2013

How Does M&S VV&A Relate to M&S Requirements?
V&V is a process; accreditation is the decision; risk is the metric.
V&V is a rheostat, gradually shining light on the M&S: how much light you need depends on the risks of using M&S results.
VV&A also shines a light on the M&S requirements, identifies any deficiencies, and recommends improvements.
The ultimate goal of V&V efforts is to form a foundation for making good program decisions, but in order to do that the program must have good M&S requirements.
VV&A efforts, if done properly, improve the M&S requirements for:
- Capability (what functionality and fidelity does it need?)
- Accuracy (how accurate do software, data and outputs need to be?)
- Usability (what processes need to be in place to ensure it is not misused?)

V&V and M&S Requirements
Software development is an iterative process:* requirements development, conceptual model development and validation, model specification, and V&V are all required and iterative.
Proper application of those processes should result in complete requirements, but only by including V&V steps all along the way.
V&V focuses on determining:
- What questions do the users need to answer?
- What M&S outputs will be used to help answer those questions?
- What characteristics must the M&S have to provide those outputs? (Capability, accuracy, usability)
- What information is needed to show the M&S has those characteristics? (V&V results, CM artifacts, documentation, pedigree, etc.)
- What information is missing, and how can we best develop it?
- What are the risks of not obtaining that information?
* R. Sargent, Verification and Validation of Simulation Models, Journal of Simulation, 2013

VV&A Focuses and Enhances M&S Requirements
VV&A is tied to intended uses through requirements. The VV&A team may need to help the user derive:
- Detailed intended use statements
- Requirements tied to those uses
Ultimate goal: reduce the risk of using the M&S to an acceptable level for the intended use.
As part of the VV&A effort, we create and populate the table on the following slide for all programs we support:
- M&S requirements for capability, accuracy and usability
- Acceptability criteria
- Metrics
- Measurement methods

M&S Requirements, Criteria and Metrics
Capability: functional and fidelity characteristics required
- Acceptability criteria: documented specific details of requirements for design and data, and appropriate output parameters
- Metrics/measures: review of requirements and design; complete documentation; outputs are appropriate to the need
Software accuracy: S/W is adequately tested
- Acceptability criteria: appropriate and documented S/W environment, testing and verification
- Metrics/measures: review of verification and testing results and the S/W development environment
Data accuracy: input and embedded data are appropriate and documented
- Acceptability criteria: authoritative input data sources; documented data V&V; verified data transformations
- Metrics/measures: review and acceptance of documented data V&V and sources
Output accuracy: outputs are of sufficient accuracy for the application
- Acceptability criteria: dynamic behaviors are appropriate; compare to benchmarking, SME expectation and/or test data
- Metrics/measures: review and acceptance of validation results important to the intended use
Usability: processes and documentation are in place to ensure proper operation and interpretation of outputs
- Acceptability criteria: CM is adequate and demonstrated; users are appropriately trained and supported; documentation is adequate for use
- Metrics/measures: review and acceptance of documented processes, and demonstration that they are being followed
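As an illustration (not part of the briefing), the requirements/criteria/metrics table above can be represented as a simple checklist data structure that a VV&A team might populate as evidence is reviewed; all names here are hypothetical:

```python
# Hypothetical sketch of the requirements/criteria/metrics table as a
# checklist a VV&A team could populate; names are illustrative only.
from dataclasses import dataclass

@dataclass
class AcceptabilityRow:
    requirement: str   # M&S requirement (capability, accuracy or usability)
    criterion: str     # acceptability criterion
    metric: str        # metric / measurement method
    satisfied: bool = False  # set True once reviewed evidence is accepted

table = [
    AcceptabilityRow(
        "Capability: functional and fidelity characteristics required",
        "Documented requirements for design, data and output parameters",
        "Review of requirements, design and documentation"),
    AcceptabilityRow(
        "Software accuracy: S/W is adequately tested",
        "Appropriate, documented S/W environment, testing and verification",
        "Review of verification/testing results and development environment"),
]

def open_items(rows):
    """Rows without accepted evidence: residual risk for the accreditor."""
    return [r for r in rows if not r.satisfied]
```

Until every row is closed out with accepted evidence, `open_items` flags the residual risk the accreditation authority must weigh.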

Examples of VV&A Improving M&S Requirements
- Missile endgame simulation: a long-established and widely used M&S
- A distributed simulation: a live/virtual/constructive M&S under development (distributed nodes connected through a cross domain solution)
- Effective time-on-station (ETOS) simulation: a re-write of an existing M&S used previously

Missile End-Game Simulation
Intended uses:
- Support to ordnance system design
- Generate inputs to mission and campaign simulations
The intended uses had similar M&S capability requirements but different credibility requirements: design support required higher fidelity and credibility for specific systems, while inputs to higher-level sims required less fidelity but modeling of all system effects.
The M&S had been developed and modified over 30+ years, and no documented M&S requirements existed.
The VV&A team and the developer came up with requirements, acceptability criteria, and metrics/measures. V&V activities were only performed if they provided data to support those metrics; in many cases SME review of sensitivity analyses was adequate.
This provided the customer with detailed requirements to support their specific intended uses, and V&V was conducted only when it supported demonstrating those requirements.

Distributed Simulation
V&V tasking for a live-virtual-constructive simulation of unmanned aircraft operation in the National Airspace, designed to demonstrate successful airspace integration of unmanned aircraft.
The high-risk application (collision avoidance) made V&V a very important part of the process.
The intended use and M&S requirements were only defined by the developer in a general sense, so the V&V team defined the specific intended use, M&S requirements, acceptability criteria and metrics (precisely what did it need to do, and how well did it need to do it?).
In this case the V&V team actually defined the M&S requirements for the ultimate user. This often occurs in the early stages of M&S development (at least for DOD M&S).

ETOS Simulation*
"It's easy to look like you're making progress if you don't know where you're going."
Issue: the program had an intended use statement but no software requirements, which created many issues with verification as well as with the software itself:
- The program office was changing requirements during M&S development.
- The developer and the program office had no consensus on M&S requirements; a software requirements document serves as an agreement between the program office and the developer.
- With no software design requirements document, there were no testable requirements for verification.
Solution: we worked with the developer to create software design requirements. These were used as testable parameters to create an implementation test procedure, and each requirement was matched with corresponding test(s).
* ETOS is defined as the total time the mission area is covered by an aircraft on station, divided by the total coverage time required.
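The ETOS measure defined in the footnote reduces to a one-line computation; the function and parameter names below are illustrative sketches, not taken from the actual ETOS M&S:

```python
# ETOS per the footnote: time on station divided by required coverage time.
# Function/parameter names are illustrative, not from the actual ETOS code.
def effective_time_on_station(covered_hours: float, required_hours: float) -> float:
    """Fraction of the required coverage time that aircraft were on station."""
    if required_hours <= 0:
        raise ValueError("required coverage time must be positive")
    return covered_hours / required_hours

# e.g. 18 hours on station against a 24-hour coverage requirement:
etos = effective_time_on_station(18.0, 24.0)  # 0.75
```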

ETOS Example
Independent V&V began with ETOS M&S Version 1:
- Verification test procedures were developed using the software requirements document we created; multiple errors were discovered and documented.
- Software Quality Assessment (SQA) was performed via manual code review: the code was relatively small, and the biggest issue was a lack of objects.
- Subject Matter Expert (SME) review was performed: verification errors were confirmed and sensitivity analyses were reviewed. In general, SMEs agreed that the M&S was realistic enough for the intended use if known errors were corrected.
- The developer addressed multiple errors (bugs). A developer can disposition software issues using the following justifications: user error, software test error, software requirements change, no fix, and software update. All major bugs should be fixed through a software update; non-major bugs can become new assumptions, limitations or known issues.
Independent V&V continued with ETOS M&S Version 2:
- The verification test procedure was used again; multiple new errors were discovered and corrected in Version 3.
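The requirement-to-test matching behind the verification procedure can be sketched as a minimal traceability check; the requirement and test identifiers here are invented for illustration:

```python
# Hypothetical traceability matrix: each testable software requirement
# should map to at least one verification test (identifiers are invented).
requirements_to_tests = {
    "REQ-001: compute ETOS from the mission timeline": ["TEST-01", "TEST-02"],
    "REQ-002: report coverage gaps per sortie": ["TEST-03"],
    "REQ-003: export results for SME review": [],  # no test yet: a gap
}

def untested(trace):
    """Requirements with no matching verification test."""
    return [req for req, tests in trace.items() if not tests]

gaps = untested(requirements_to_tests)  # the one requirement lacking a test
```

Re-running such a check each time the requirements document is updated (as happened across ETOS Versions 1 through 4) keeps the test procedure aligned with the requirements it must verify.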

ETOS Example (cont.)
Independent V&V continued with ETOS M&S Version 3:
- The verification test procedure was used a third time, using updated software requirements.
- New errors were discovered and some older errors remained; the developer addressed the most important errors.
Version 4 consisted of fixing the errors newly discovered in Version 3:
- The verification test procedure was used again.
- The software requirements document and test procedures were updated with new requirements, and the V&V report was updated.
- The results were determined to be adequate for the customer's use of Version 4; remaining issues were identified as limitations.

SYSTEMS ENGINEERING PROCESSES APPLY TO M&S DEVELOPMENT
There is a proper order to good M&S development: determine requirements, develop, test, update/manage. It helps if the process is followed in the right order.
Many of the errors discovered during ETOS M&S verification testing could have been avoided by creating the design requirements and having them approved by the program office before building the model. This would help avoid the issues created when the program office and the developer have different final products in mind.
That is easy to say but hard to do in DOD, especially for continued use of legacy M&S for a variety of purposes. V&V is best done during M&S development, but in DOD it's usually done after the fact.

Conclusions & Recommendations
M&S requirements tend to focus on system functional representations and how the system works. VV&A focuses on:
- Are the functional representations adequate for the intended use (conceptual validation)?
- Is the implementation adequate (verification)?
- Are the functional representations and implementation representative of reality (validation)?
- Are processes in place to ensure that the M&S won't be misused?
Acceptability criteria are the critical difference: M&S requirements focus on conceptual representation completeness and traceability, while VV&A focuses on requirements for the accuracy of those representations and adds specificity to flesh out the requirements.
Recommendation: VV&A teams should be brought in early in the development process to help refine M&S requirements, particularly in the case of emerging requirements during phased development. This also enhances the V&V approach and better leverages S/W developer V&V activities; developers do a lot of work to convince themselves their M&S works right, but they tend not to document that work so they can convince someone else later on.

Supplemental Material

The Simulation Credibility Equation
M&S Credibility = f(Capability, Accuracy, Usability), with V&V providing the evidence for each factor:
- Capability: functional and fidelity characteristics
- Accuracy: software, data, outputs
- Usability: training, documentation, CM, user support

The Essence of Accreditation
M&S requirements (capability, accuracy, usability) are defined by the user, formally or implied. M&S information (data quality, M&S documentation, design documentation, configuration management, V&V results, etc.) is provided by the model developer or model proponent.
To prove the M&S is fit for purpose requires an objective comparison of the M&S requirements with the M&S information, within the context of the problem, to assess the residual risk of using the M&S: identify M&S deficiencies; identify work-arounds, usage constraints, required improvements and risks; and then make the accreditation decision.