

  • Number of slides: 33

ISPRAS Experience in Model Based Testing

Alexander K. Petrenko, Institute for System Programming of the Russian Academy of Sciences (ISPRAS), http://www.ispras.ru

19.09.2002, Intel Academic Forum, Budapest, September 2002

ISPRAS Experience in Industrial Model Based Testing

Why Model Based Testing?

• Exhaustive testing that covers all implementation paths is impossible.
• Exhaustive implementation-based ("white box") testing does not guarantee correct functionality.
• White box testing lengthens development, because test development can start only once the implementation is complete.
• Nevertheless, we want to conduct systematic testing.

Formal models provide:
• a basis for systematic testing; from the models we derive
  - test coverage metrics,
  - input stimuli,
  - correctness criteria for results;
• test development ahead of the implementation schedule.

Model Checking vs. Model Based Testing

• Answers the question:
  - Model Checking: Is the model correct?
  - Model Based Testing: Does the implementation behavior conform to the model behavior?
• Expected result:
  - Model Checking: a correct model.
  - Model Based Testing: a test suite for implementation testing, and a proper implementation.
• Complexity of the models:
  - Model Checking: simpler than the implementation, because of the restrictions of analytic analysis.
  - Model Based Testing: close to the complexity of the implementation under test.
• Relation between model and implementation:
  - Model Checking: very complicated.
  - Model Based Testing: simple.

Synonyms

• Models
• (Formal) Specifications

We consider behavior/functional models. The models provide a simplified, abstract view of the target software/hardware. Processing the models requires their formal description (specification).

Model Based Testing Approach

• Generate exhaustive test suites for a model of the implementation.
• Translate the test suites to the implementation level.
• Apply the tests to the implementation under test.
• (Optionally) Interpret the testing results in terms of the model.

Related Works

• IBM Research Laboratory (Haifa, Israel)
• Microsoft Research (Redmond, US)

Examples of Model Based Testing Applications

• IBM Research Laboratory (Haifa, Israel)
  - Store Date Unit – digital signal processor
  - APIs of file systems, telephony and Internet protocols, etc.
• Microsoft Research (Redmond, US)
  - Universal PnP interface
• ISPRAS (Moscow, Russia)
  - Kernel of an operating system (Nortel Networks)
  - IPv6 protocol (Microsoft)
  - Compiler optimization units (Intel)
  - Massive parallel compiler testing (RFBR, Russia)

Origin of ISPRAS Methods

• 1987–1994: Test suite for the compiler of a real-time programming language for the "Buran" space shuttle.
• 1994–1996: ISP RAS – Nortel Networks contract on functional test suite development for a switch operating system kernel.
  - A few hundred bugs found in the OS kernel, which had been in use for 10 years.
• KVEST technology: about 600 K lines of Nortel code tested by 2000.

ISPRAS Model Based Testing: Two Approaches

• UniTesK: testing of Application Program Interfaces (APIs) based on software contracts.
• Lama: compiler testing based on LAnguage Model Application.

UniTesK: Testing of Application Program Interfaces (APIs)

What is an API?

[Diagram: an application exposes a user interface to people and an Application Program Interface (API) to other programs.]

Functional Testing

The UniTesK method deals with functional testing. To automate testing we provide a formal representation of requirements:

Requirements → Formal Specifications → Tests

UniTesK Process

Phases and techniques:
• Interface specification — pre- and post-conditions, invariants.
• Test scenario description — implicit Finite State Machines (FSMs), data iterators; test coverage metrics based on specification structure.
• Test execution.
• Test result analysis.

Decomposition of Testing Tasks

• The entire test is a test sequence intended to achieve specified coverage.
• From the specification we can generate oracles and define test coverage metrics.

[Diagram: test sequence construction → test oracles → system under test.]

Test Suite Architecture

Components: test scenario, scenario driver, data model, specification, mediator (Java/C/C++/C#), test engine, test oracle, test coverage tracker, and the system under test.

[Diagram legend: each component is either manual, pre-built, generated, or obtained by automatic derivation.]

Test Oracle for the Method f

Specification of method f:

  integer f(a : float)
  post { post_f(a, f_result) }

The oracle runs the implementation and checks the post-condition:

  f_result = f(x)
  if post_f(x, f_result) then verdict = true else verdict = false
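As a sketch, the oracle scheme above can be written in plain C. Everything here is illustrative rather than UniTesK code: `isqrt` stands in for the method f, and `post_isqrt` encodes an assumed post-condition (the result is the largest integer whose square does not exceed the argument).

```c
#include <assert.h>
#include <stdbool.h>

/* Implementation under test (plays the role of f). */
static int isqrt(int a) {
    int r = 0;
    while ((r + 1) * (r + 1) <= a)
        r++;
    return r;
}

/* Post-condition post_f(x, f_result): states the requirement
 * declaratively, without re-implementing the computation. */
static bool post_isqrt(int x, int result) {
    return result >= 0
        && result * result <= x
        && (result + 1) * (result + 1) > x;
}

/* The oracle calls the implementation and checks the
 * post-condition; the boolean it returns is the verdict. */
static bool run_oracle(int x) {
    int f_result = isqrt(x);        /* f_result = f(x)        */
    return post_isqrt(x, f_result); /* verdict = post_f(...)  */
}
```

Note that the oracle never needs a reference implementation: any `isqrt` replacement that violates the post-condition on some input yields a false verdict on that input.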

Test Coverage Metrics Based on Specification Structure

Specification:

  post {
    if (a || b || c || d && e) {
      branch "OK"; . . .
    } else {
      branch "Bad parameters"; . . .
    }
  }

Partition (derivation of branches and logical terms):

  BRANCH "OK"
    - a               -- op 1
    - !a && b         -- op 2
    - !a && !b && c   -- op 3
    - . . .
  BRANCH "Bad parameters"
    - !a && !b && !c && !d
    - !a && !b && !c && d && !e
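A minimal C sketch of the partition above, assuming a, b, c, d, e are boolean conditions evaluated in short-circuit order. The enum names and the `classify` function are invented for illustration; they just make each disjunct of the branch guard its own equivalence class.

```c
#include <assert.h>
#include <stdbool.h>

/* One coverage class per disjunct of the "OK" guard
 * (a || b || c || d && e), plus the two "Bad parameters" classes. */
typedef enum {
    OK_A,        /* a                          */
    OK_B,        /* !a && b                    */
    OK_C,        /* !a && !b && c              */
    OK_DE,       /* !a && !b && !c && d && e   */
    BAD_NO_D,    /* !a && !b && !c && !d       */
    BAD_NO_E     /* !a && !b && !c && d && !e  */
} coverage_class;

/* Decide which equivalence class a stimulus falls into,
 * mirroring the short-circuit evaluation of the guard. */
static coverage_class classify(bool a, bool b, bool c, bool d, bool e) {
    if (a) return OK_A;
    if (b) return OK_B;
    if (c) return OK_C;
    if (d && e) return OK_DE;
    return d ? BAD_NO_E : BAD_NO_D;
}
```

A coverage tracker can then count how many of the six classes the executed stimuli have hit, which is exactly the metric the slide derives from the post-condition's structure.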

Test Sequence Generation

We use an FSM to generate test sequences that traverse all equivalence classes defined by the partition analysis.

[Diagram: a four-state FSM (S1–S4) with transitions labeled op1, op2, op3.]

But writing a full FSM description is labor-consuming and tedious work.

FSM Construction: Statics

First step of FSM construction: partition the states and transitions based on pre- and post-condition structure (FSM factorization), together with test input iterators.

Partition (branches and logical terms):

  BRANCH "OK"
    - a               -- op 1
    - !a && b         -- op 2
    - !a && !b && c   -- op 3
    - . . .
  BRANCH "Bad parameters"
    - !a && !b && !c && !d        -- op i
    - !a && !b && !c && d && !e   -- op i+1

[Diagram: equivalence classes of states SC1–SC4, connected by transitions op1–op3.]

FSM Construction: Dynamics

Second step of FSM construction: the FSM is explored on the fly during test execution.

[Diagram: the statically predicted FSM over classes SC1–SC4, next to the FSM actually observed as a result of test execution.]
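The two steps can be illustrated with a toy on-the-fly traversal in C. The 4-state transition table plays the role of the system under test, and the walk stops once every (state, operation) pair has been exercised; the machine and the greedy strategy are invented for this sketch, and UniTesK's real traversal algorithms are more sophisticated.

```c
#include <assert.h>
#include <stdbool.h>

#define NSTATES 4
#define NOPS    3

/* Hypothetical system under test: applying operation op in
 * state s moves the system to next[s][op]. */
static int apply_op(int s, int op) {
    static const int next[NSTATES][NOPS] = {
        {1, 2, 3},
        {1, 3, 0},
        {0, 2, 1},
        {3, 0, 2},
    };
    return next[s][op];
}

/* Walk the machine, discovering transitions as they are taken,
 * until every (state, op) pair has been covered.
 * Returns the number of steps taken, or -1 on failure. */
static int traverse(void) {
    bool covered[NSTATES][NOPS] = {{false}};
    int s = 0, steps = 0;
    while (steps < 1000) {
        int op = -1;
        for (int k = 0; k < NOPS; k++)   /* prefer an uncovered op here */
            if (!covered[s][k]) { op = k; break; }
        if (op < 0) op = steps % NOPS;   /* else keep moving, hunting coverage */
        covered[s][op] = true;
        s = apply_op(s, op);
        steps++;
        bool done = true;                /* stop when all pairs are covered */
        for (int i = 0; i < NSTATES; i++)
            for (int k = 0; k < NOPS; k++)
                if (!covered[i][k]) done = false;
        if (done) return steps;
    }
    return -1;
}
```

Because the next state is read back from the system rather than from a prewritten model, the same loop works even when the static prediction of the FSM is incomplete, which is the point of the "dynamics" step.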

Model Based Testing: Problems of Deployment

• 1994–1996: ISP RAS – Nortel Networks contract on functional test suite development for a switch operating system kernel.
  - A few hundred bugs found in the OS kernel, which had been in use for 10 years.
• KVEST technology: about 600 K lines of Nortel code tested by 2000.
• Yet KVEST was deployed only in Nortel's regression testing process. Why?
• In general, only a few formal techniques are used in real-life practice. Why?

Problems of Model Based Testing Deployment

Problem → UniTesK solution:

• Formal models suitable for analytical verification are too simple for test generation; the ratio of model (specification) size to implementation size is about 1:5–10. → Mediators provide a bridge between abstract models and the implementation.
• Executable models cannot provide test oracles in the general case, because of dependence on the implementation and nondeterminism. → Implicit specifications (pre- and post-conditions) provide the test oracles.
• Test sequence generation needs very large models (for example, FSMs). → Implicit FSMs; the usual number of states is about 5–20.
• How can test quality be estimated without implementation test coverage? → The structure of pre- and post-conditions is informative, yet simple enough to serve as a test coverage metric.
• There is a gap between formal techniques and software/hardware development practice. → Usual programming languages are extended for specification purposes.

UniTesK Tools and Applications

• CTesK – C testing tool (alpha version)
  - Microsoft IPv6 implementation
• J@T – Java testing tool (beta version)
  - Partially tested by itself
  - API of the parallel debugger of the mpC IDE (mpC is a parallel extension of C)
  - POSIX/Win32 file I/O subsystem
• VDM++TesK – free

Further steps: C#TesK and C++TesK, and conceivably VHDLTesK.

Lama: Compiler Testing Based on LAnguage Models Application

Pilot Project under Contract with Intel: Model Based Testing of Compiler Optimization Units

Automate optimization unit test generation:
• Improve test coverage of the units.
• Automate the test oracle problem.

Lama Approach

Lama stands for compiler testing based on LAnguage Models Application. Lama process steps, given a programming language (PL):
• Invent a model (simplified) language (ML) of the PL.
• Generate a set of test "programs" in the ML.
• Map the ML test "programs" into PL test programs.
• Run the compiler (or a compiler unit) on each PL test program and analyze the correctness of the compiler's results.
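The iterator and mapper steps can be sketched in C under strong simplifying assumptions: here the "model language" is just a choice of comparison operator, and the mapper renders each model program as a C expression in the style of the translation example later in the deck. The function names are hypothetical, not Lama code.

```c
#include <assert.h>
#include <stdio.h>

/* The model language: four comparison "programs". */
static const char *ops[] = { ">", "<", ">=", "<=" };

/* Mapper: render one ML program as compilable C text. */
static void map_to_c(int op, char *buf, size_t n) {
    snprintf(buf, n,
             "(('c' - 'a') + (('c' - 'a') %s ('c' - 'a')))",
             ops[op]);
}

/* Iterator: enumerate every program of the model language,
 * mapping each one to C. Returns how many were generated. */
static int generate_all(void) {
    int count = 0;
    char buf[128];
    for (int op = 0; op < 4; op++) {
        map_to_c(op, buf, sizeof buf);
        count++;
    }
    return count;
}
```

A real Lama iterator walks a much richer grammar (basic blocks, branches, shared subexpressions), but the division of labor is the same: the iterator enumerates ML terms exhaustively, and the mapper alone knows the concrete PL syntax.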

Process of Optimization Unit Testing

• Step 1: From the optimization background and the programming language (PL) specification, design the model language and its building blocks.
• Step 2: Develop an iterator that produces test "programs" in the ML.
• Step 3: Develop a mapper that turns them into test programs in the PL.
• Step 4: Execute the tests and analyze the results, producing fault and test coverage reports.

An Example: Common Subexpression Elimination Optimization

Step 1: the model language describes an IF instruction (a condition, a then-block, and an else-block) built from basic blocks, where a basic block is a label, a sequence of instructions, and a transition to a label, with a common subexpression shared across the branches.

Result of Translation into C (Step 3)

  if ( (('c' - 'a') + (('c' - 'a') > ('c' - 'a'))) ) {
      (('c' - 'a') + (('c' - 'a') < ('c' - 'a')));
  } else {
      (('c' - 'a') + (('c' - 'a') >= ('c' - 'a')));
  }
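The intent of such a generated test can be shown directly in C: the original form recomputes ('c' - 'a') repeatedly, while a hand-optimized form computes it once, and a correct common-subexpression-elimination pass must leave the observable result unchanged. The functions `original` and `optimized` are illustrative wrappers, not generated code.

```c
#include <assert.h>

/* The generated test as written: ('c' - 'a') appears six times. */
static int original(void) {
    if ((('c' - 'a') + (('c' - 'a') > ('c' - 'a'))))
        return (('c' - 'a') + (('c' - 'a') < ('c' - 'a')));
    else
        return (('c' - 'a') + (('c' - 'a') >= ('c' - 'a')));
}

/* Common subexpression elimination applied by hand. */
static int optimized(void) {
    int t = 'c' - 'a';   /* the common subexpression, computed once */
    if (t + (t > t))
        return t + (t < t);
    else
        return t + (t >= t);
}
```

Comparing the two forms on execution is exactly the oracle check of Step 4: if the optimizer miscompiles the shared subexpression, the optimized result diverges from the unoptimized one.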

Conclusion

Conclusion on UniTesK & Lama

• Both UniTesK and Lama follow the model based testing approach.
• Base idea: test complex software by means of exhaustive coverage of relatively simple models.
• Area of applicability: any software or hardware components with well-defined interfaces or functional properties.

References

1. A. K. Petrenko, I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. UniTesK Test Suite Architecture. In: Proceedings of FME 2002, Copenhagen, Denmark, LNCS 2391, 2002, pp. 77–88.
2. A. Petrenko. Specification Based Testing: Towards Practice. In: Proceedings of the VI Ershov Conference, LNCS 2244, 2001.
3. A. K. Petrenko, I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. Experiences in Using Testing Tools and Technology in Real-Life Applications. In: Proceedings of SETT'01, Pune, India, 2001.
4. I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. Using Finite State Machines in Program Testing. Programming and Computer Software, Vol. 26, No. 2, 2000, pp. 61–73 (English version).
5. I. Bourdonov, A. Kossatchev, A. Petrenko, D. Galter. KVEST: Automated Generation of Test Suites from Formal Specifications. In: Proceedings of the World Congress on Formal Methods, Toulouse, France, LNCS 1708, 1999, pp. 608–621.
6. I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin, A. V. Maximov. Testing Programs Modeled by Nondeterministic Finite State Machine. (www.ispras.ru/~RedVerst/, white papers).