
A Prescriptive Adaptive Test Framework (PATFrame) for Unmanned and Autonomous Systems: A Collaboration Between MIT, USC, UT Arlington and Softstar Systems
Dr. Ricardo Valerdi, Massachusetts Institute of Technology
March 10, 2010
Outline
• The Challenge
• PATFrame team
• PATFrame objectives & scope
• PATFrame features
• Next steps

"Anything that gives us new knowledge gives us an opportunity to be more rational" - Herb Simon
Sponsors Transition Partners
PATFrame kickoff meeting, Fort Hood, TX - Aug 2009
The Challenge: Science Fiction to Reality
"You will be trying to apply international law written for the Second World War to Star Trek technology."
Singer, P. W., Wired for War: The Robotics Revolution and Conflict in the 21st Century (Penguin, 2009)
PATFrame Objective
To provide a decision support tool encompassing a prescriptive and adaptive framework for UAS SoS testing
• PATFrame will be implemented as a software dashboard that will enable improved decision making for the UAS T&E community
• Focused on addressing BAA topics TTE-6, Prescribed System of Systems Environments, and MA-6, Adaptive Architectural Frameworks
• The three-university team (MIT-USC-UTA) draws on experts in test & evaluation, decision theory, systems engineering, software architectures, robotics and modeling
http://mit.edu/patframe
Prescriptive Adaptive Test Framework
[Diagram: the framework is prescriptive and adaptive with respect to both the test strategy/test infrastructure and the system under test]
Time Scale for Testing Decisions (minutes → hours → days → months → years)
• Test execution (real time)
• Data collection, test analysis & reprioritization
• Test planning (in the SoS environment)
• Long-term development planning (i.e., design for testability) decisions/investments
[Figure: PATFrame's scope spans these testing-decision time scales]
PATFrame Initial Focus: Testing an Autonomous System in an SoS Environment
[Figure: testing difficulty grows along two axes - net-centricity of the environment (none → SoS) and complexity/autonomy of the system under test (human-operated → AI). The quadrants range from testing a single system, to testing a system in an SoS environment (net-centric focus), to testing an SoS in an SoS environment (the ultimate goal). Reference points include the DARPA Urban Grand Challenge, the UAST focus, and the PATFrame use case (UAS in an SoS).]
Prescribed System of Systems Environment
Goal: a synthetic framework for SoS testing at the single- and multi-program level
• Normative - construct the theoretical best "SoS" test: the metric set and its "best" levels
• Prescriptive - define "success": metrics and state-of-the-practice levels, where "Successful SoS Test" = f(metricA, metricB, …, metricN)
• Descriptive - capture actual SoS tests, which include metricA', metricC, etc.
[Figure: per-metric comparison of normative (best), descriptive (state of the practice), and descriptive (actual) levels against a metric limit, showing a potential new test A's contribution to the state of the practice]
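To make the comparison concrete, here is a minimal sketch (not PATFrame itself) of scoring an actual SoS test against normative and state-of-the-practice (SoP) metric levels; the metric names and numbers are hypothetical.

```python
# Hypothetical metric levels: normative ("best") and state of the practice (SoP).
NORMATIVE = {"coverage": 0.95, "fault_detection": 0.90, "interop_checks": 40}
SOP       = {"coverage": 0.70, "fault_detection": 0.60, "interop_checks": 25}

def test_success(actual: dict) -> float:
    """'Successful SoS Test' = f(metricA, metricB, ...): here, the mean of each
    metric normalized against its normative (best) level, capped at 1.0."""
    return sum(min(actual.get(m, 0.0) / best, 1.0) for m, best in NORMATIVE.items()) / len(NORMATIVE)

def contribution_to_sop(actual: dict) -> dict:
    """Per-metric contribution of a new test beyond the state of the practice."""
    return {m: round(actual.get(m, 0.0) - sop, 3) for m, sop in SOP.items() if actual.get(m, 0.0) > sop}

# A hypothetical "test A" drawn from the descriptive (actual) data.
test_a = {"coverage": 0.80, "fault_detection": 0.55, "interop_checks": 30}
print(f"success score = {test_success(test_a):.2f}")
print("contribution beyond SoP:", contribution_to_sop(test_a))
```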
Integrated Test Management
Real Options as a Prescriptive and Adaptive Framework for SoS Testing
• Use case:
  • Question: what to test? (i.e., what SoS test scenarios to prescribe?)
  • Inputs: models of the autonomous/net-centric system of systems, major uncertainties
  • Outputs: enablers and types of real options for responding to uncertainties, as candidate test scenarios (i.e., identification of how the SoS can adapt)
• Example - identification of real options from a joint SoS model (an ontology and a coupled dependency structure matrix linking three vehicles from different services, mission objectives, and uncertainties). Objective: maintain Vehicle 1-Vehicle 2 comm; uncertainty: proximity of vehicles 1 and 2. A sketch of this identification step follows below.
  1. Real option to adjust comm range using flexible-range comm systems on vehicles 1 and 2
  2. Real option to use Vehicle 3 as a relay
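A minimal sketch, under assumed link data, of the real-option identification described above; the comm-link table, the flexible-range option, and the relay rule are illustrative stand-ins, not PATFrame's actual model.

```python
# Coupled dependencies represented as a comm-link adjacency map between vehicles.
comm_links = {
    ("Vehicle 1", "Vehicle 2"): False,  # uncertainty: proximity has broken the direct link
    ("Vehicle 1", "Vehicle 3"): True,
    ("Vehicle 2", "Vehicle 3"): True,
}

def linked(a: str, b: str) -> bool:
    return bool(comm_links.get((a, b)) or comm_links.get((b, a)))

def real_options(objective: tuple) -> list:
    """Enumerate candidate adaptations (real options) that could restore the objective."""
    a, b = objective
    options = []
    if not linked(a, b):
        options.append(f"Adjust comm range with flexible-range radios on {a} and {b}")
        # Any third vehicle linked to both endpoints can serve as a relay.
        vehicles = {v for pair in comm_links for v in pair}
        for relay in sorted(vehicles - {a, b}):
            if linked(a, relay) and linked(b, relay):
                options.append(f"Use {relay} as a relay between {a} and {b}")
    return options

# Objective: maintain Vehicle 1 <-> Vehicle 2 comm despite the proximity uncertainty.
for option in real_options(("Vehicle 1", "Vehicle 2")):
    print("candidate test scenario:", option)
```

Each candidate becomes a test scenario to prescribe: the flexible-range option and the Vehicle 3 relay correspond to the two real options listed on the slide.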
Testing to Reduce SoS Risks vs. the Risks of Performing Tests on an SoS
Risks
• What are the unique risks for UASs? For UASs operating in an SoS environment?
• How do you use testing to mitigate these risks?
• What metrics are you using to measure the level of risk?
Risks of Testing an SoS
• What are the unique programmatic risks that impact your ability to test UASs? To test UASs operating in an SoS environment?
• What methods do you use to mitigate these risks?
• What metrics are you using to measure the level of programmatic risk in testing?
Example PATFrame Tool Concept
• Question: When am I done testing?
• Technology: defect estimation model; trade quality for delivery schedule
• Inputs: defects discovered
• Outputs: defects remaining, cost to quit, cost to continue
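A minimal sketch of the tool concept above, assuming an exponential (Goel-Okumoto style) defect discovery curve and illustrative cost parameters; the weekly defect counts, cost figures, and function names are hypothetical, not PATFrame's model.

```python
import numpy as np
from scipy.optimize import curve_fit

def cumulative_defects(t, a, b):
    """Expected cumulative defects found by time t: a * (1 - exp(-b * t))."""
    return a * (1.0 - np.exp(-b * t))

# Input: defects discovered per week of testing (hypothetical data).
weeks = np.arange(1, 11)
found_per_week = np.array([30, 24, 19, 15, 12, 10, 8, 6, 5, 4])
cumulative = np.cumsum(found_per_week)

# Fit the discovery curve to estimate the total latent defect count a.
(a_hat, b_hat), _ = curve_fit(cumulative_defects, weeks, cumulative, p0=[200.0, 0.1])
defects_remaining = a_hat - cumulative[-1]

# Trade quality for delivery schedule: compare continuing to test
# against quitting now and fixing escaped defects in the field.
COST_PER_TEST_WEEK = 50_000      # hypothetical
COST_PER_FIELD_DEFECT = 20_000   # hypothetical
extra_weeks = 4

found_if_continue = cumulative_defects(weeks[-1] + extra_weeks, a_hat, b_hat) - cumulative[-1]
cost_to_continue = extra_weeks * COST_PER_TEST_WEEK + \
    (defects_remaining - found_if_continue) * COST_PER_FIELD_DEFECT
cost_to_quit = defects_remaining * COST_PER_FIELD_DEFECT

print(f"Estimated total defects:     {a_hat:.0f}")
print(f"Estimated defects remaining: {defects_remaining:.0f}")
print(f"Cost to quit now:            ${cost_to_quit:,.0f}")
print(f"Cost to continue {extra_weeks} weeks:    ${cost_to_continue:,.0f}")
```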
When am I done testing?
PATFrame
Primary inputs to PATFrame:
• Models of the system under test
• Model of the SoS environment
• Mission needs/goals
• Test requirements
• Available resources and time
Core components: a knowledge base, an ontology, a reasoning engine, and analysis techniques (risk & uncertainty analysis with leading indicators; process analysis with system dynamics) supporting the plan, analyze and design, execute, and evaluate test cycle.
Primary outputs for test and evaluation:
• Which test do I run, and in what order?
• How should my test strategy change?
• What realizable options are available?
• When am I done testing?
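A minimal sketch of how the inputs and outputs listed above might be structured in a decision-support dashboard; the class names, field names, and the stub ranking rule are hypothetical illustrations of the interface shape, not PATFrame's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class PATFrameInputs:
    system_models: list            # models of the system under test
    sos_environment_model: str     # model of the SoS environment
    mission_goals: list            # mission needs/goals
    test_requirements: list
    available_resources: float     # e.g., budgeted test hours
    available_time_days: int

@dataclass
class PATFrameOutputs:
    prioritized_tests: list = field(default_factory=list)   # which tests, in what order
    strategy_changes: list = field(default_factory=list)    # how the test strategy should change
    realizable_options: list = field(default_factory=list)  # adaptations available to the SoS
    done_testing: bool = False                               # stopping recommendation

def recommend(inputs: PATFrameInputs) -> PATFrameOutputs:
    """Stub reasoning step: rank tests by how many mission goals they mention."""
    ranked = sorted(
        inputs.test_requirements,
        key=lambda req: -sum(goal.lower() in req.lower() for goal in inputs.mission_goals),
    )
    return PATFrameOutputs(prioritized_tests=ranked,
                           done_testing=inputs.available_time_days <= 0)
```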