

  • Number of slides: 49

T-76.(4/5)115 Software Project Quality Practices in Course Projects
18.10.2003
Juha Itkonen
SoberIT, HUT
HELSINKI UNIVERSITY OF TECHNOLOGY

Contents
• Testing as a part of incremental development
• Exploratory peer testing approach
• Test planning
• Test reporting
• Designing and managing test cases

Quality practices as part of incremental development

Quality practices are an integral part of software development
• Often, testing is seen as a separate, last phase of the software development process
  - that can be outsourced to a separate testing team
  - that only needs to be done just before the release, if there is any time
• Quality practices cannot be separated from the rest of software development
  - Testing has to be involved from the beginning
  - Testers can, and should, contribute in each phase of the software development life-cycle
  - QA is much more than the final acceptance testing phase

The V-model of testing
• The V-model is an extension of the waterfall model
• You can imagine a little V-model inside each iteration
  - However, you might want to be more iterative at the iteration level, too
• Do not take the V-model as a process for the whole project

Two extremes of organizing testing
[Figure: waterfall model with coders and a separate tester on one side, agile models (XP) with customer, coder and testers working together on the other]
• Leading idea: testing in collaboration

Strive for a more agile approach on this course
• You have fixed
  - schedule
  - resources
• Flexibility is in scope and quality
  - Quality won’t appear without planning and explicit actions
• You don’t have separate testing resources
• You probably don’t have comprehensive documentation
• You probably have more or less ambiguity and instability in your requirements
• You don’t have too much effort to spend
• You have big risks

Execute tests incrementally
• Each iteration delivers tested software
• Don’t plan test execution as a separate phase after development
• Unit tests are executed as a part of the coding activity
  - Test-driven development
• Functional system tests can be designed and executed simultaneously with implementation
  - Enables fast feedback
• Remember tracking
  - What was tested
  - What version and environment
  - When it was tested
  - By whom
  - What were the results
  - How and for what purpose do you use the results?
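
The idea of unit tests as part of the coding activity can be sketched with a minimal test-first example in Python; `parse_version` and its behaviour are invented here purely for illustration and are not part of the course material.

```python
# A minimal test-driven development sketch: the assertions below were
# written first, then the implementation was coded to make them pass.
# `parse_version` is a hypothetical example function, not course material.

def parse_version(tag: str) -> tuple:
    """Parse a release tag like 'v1.2.3' into a comparable tuple of ints."""
    return tuple(int(part) for part in tag.lstrip("v").split("."))

def test_parse_version():
    assert parse_version("v1.2.3") == (1, 2, 3)
    # Tuple comparison gets numeric ordering right where string comparison fails.
    assert parse_version("0.9") < parse_version("0.10")

if __name__ == "__main__":
    test_parse_version()
    print("unit tests passed")
```

Run as part of every build, tests like this give the fast feedback the slide asks for, without a separate testing phase.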

Involve the customer early
• Take the customer with you to specify or review the test cases
  - The customer plays the oracle role
• Give the customer the opportunity to
  - Execute pre-specified or exploratory tests
  - Play around with the system
  - Before the FD phase

Peer testing with exploratory approach

Exploratory Testing (ET) is
• Testing without predefined test cases
  - Manual testing
  - Based on the experience, knowledge and skills of the tester
  - Without pre-documented test steps (detailed test cases)
• Exploring the software or system
  - Goal is to expose quality-related information
  - Continually adjusting plans, re-focusing on the most promising risk areas
  - Following hunches
• Minimizing time spent on (pre)documentation

Exploratory Testing is not a technique
• It is an approach
• Many testing techniques can be used in an exploratory way
• Exploratory testing vs. scripted testing are the ends of a continuum:
  Pure scripted (automated) testing – vague scripts – fragmentary test cases – chartered exploratory testing – freestyle exploratory “bug hunting”

Definition of Exploratory Testing
1. Tests are not defined in advance as detailed test scripts or test cases.
   - Instead, exploratory testing is exploration with a general mission without specific step-by-step instructions on how to accomplish the mission.
2. Exploratory testing is guided by the results of previously performed tests and the knowledge gained from them.
   - An exploratory tester uses any available information on the target of testing, for example a requirements document, a user’s manual, or even a marketing brochure.
3. The focus in exploratory testing is on finding defects by exploration
   - instead of systematically producing a comprehensive set of test cases for later use.
4. Exploratory testing is simultaneous learning of the system under test, test design, and test execution.
5. The effectiveness of the testing relies on the tester’s knowledge, skills, and experience.

Scripted vs. Exploratory Testing
• In scripted testing, tests are first designed and recorded. Then they may be executed at some later time or by a different tester.
• In exploratory testing, tests are designed and executed at the same time, and they often are not recorded.
  - You build a mental model of the product while you test it. This model includes what the product is, how it behaves, and how it’s supposed to behave.
(James Bach, Rapid Software Testing, 2002)

Exploratory Function Testing
• Use a list of functions to give structure and a high-level guide to your testing
  - Requirements specification
  - Functional specification
  - User manual
• Explore creatively each individual function and the interactions of functions
  - Cover side paths, interesting and suspicious areas
  - Exceptional inputs, error situations
• Utilize the information gained during the testing
  - Simultaneous learning
  - Tests are designed simultaneously with test execution
  - Use the list of functions to get back on track
• Coverage and progress are planned and tracked by functions
  - Not by test cases

Session Based Test Management
• A method for managing ET
  - Charter
  - Time box
  - Reviewable result
  - Debriefing

Charter
• Architecting the charters is test planning
• Brief information / guidelines on:
  - What should be tested? (areas, components, features, …)
  - Why do we test this? (goals)
  - How to test (approach)? (specific techniques or tactics to be used, test data)
  - What problems to look for?
• Might include guidelines on:
  - Tools to use
  - What risks are involved
  - Documents to examine
  - Desired output from the testing

Time Box
• Focused test effort of fixed duration
  - Short: 60 minutes (±15)
  - Normal: 90 minutes (±15)
  - Long: 120 minutes (±15)
• Brief enough for accurate reporting
• Brief enough to allow flexible scheduling
• Brief enough to allow course correction
• Long enough to get solid testing done
• Long enough for efficient debriefings
• Beware of overly precise timing

Reviewable results
• Charter
• Effort breakdown
  - Duration (hours:minutes)
  - Test design and execution (percent)
  - Bug investigation and reporting (percent)
  - Session setup (percent)
  - Charter / opportunity (percent/percent)
• Data files
• Test notes
• Bugs
• Issues
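
The reviewable result above can be sketched as a simple data structure; the field names follow this slide, but the class itself is a hypothetical illustration, not part of any SBTM tool.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an SBTM session sheet as a data structure.
# Field names mirror the slide's effort breakdown; everything else is invented.

@dataclass
class SessionSheet:
    charter: str
    duration_minutes: int
    test_design_pct: int   # test design and execution
    bug_pct: int           # bug investigation and reporting
    setup_pct: int         # session setup
    on_charter_pct: int    # charter vs. opportunity split (charter share)
    test_notes: list = field(default_factory=list)
    bugs: list = field(default_factory=list)
    issues: list = field(default_factory=list)

    def effort_ok(self) -> bool:
        # The three activity percentages should account for the whole session.
        return self.test_design_pct + self.bug_pct + self.setup_pct == 100

sheet = SessionSheet("Explore the GUI editor's undo stack", 90, 70, 20, 10, 80)
print(sheet.effort_ok())  # → True
```

A debriefing can then start from a quick mechanical check like `effort_ok()` before discussing the notes and bugs themselves.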

Debriefing
• The test lead reviews the session sheet to assure that he understands it and that it follows the protocol
• The tester answers any questions
• Session metrics are checked
• Charter may be adjusted
• Session may be extended
• New sessions may be chartered
• Coaching / mentoring happens

Peer Testing in the I2 iteration
• Peer group pairs are on the course web pages
• Plan and prepare for peer testing already before I2
  - Delivering and installing the system
  - Meetings (preparation and debriefing)
  - Agreeing on total effort
• 17.2.2005 – Hand off the system to the peer group
  - The system under test
  - All relevant documentation
    - User and installation manual
    - Known bugs, bug reporting guidelines
  - Test charter (at least 2 charters)
    - one general charter, provided by the course
    - and at least 1 from the group whose system is tested
• Peer testing execution
  - Agree this with your peer group
• 21.2.2005 – Peer testing reports delivered to the other group

Peer test reporting
• Iteration I2 peer test deliverables
  - Peer test reports and session logs x 2 (own and peer group’s report)
  - Defect reports directly into the bug tracking system
    - Peer testing defect reports into the other group’s system
  - Bug summary listing as an appendix in the test report
• In the final report you should assess the peer group’s testing efforts and results

Test Planning

Checklist for test planning
• Overall test objectives (why)
• What will and won’t be tested (what)
• Test approach (how)
  - Test phases
  - Test strategy, methods, techniques, …
  - Metrics and statistics
• Resource requirements (who)
  - Tester assignments and responsibilities
  - Test environments
• Test tasks and schedule (when)
• Risks and issues

Overall test objectives (why)
• The quality goals of the project
  - What is to be achieved by the quality practices? Why are we testing?
  - What are the most important qualities and risks for this product?
• This course
  - Plan and document your quality goals in project plan chapter 5.2.1
  - Metrics that are used to evaluate the quality of the results at the end of each iteration
    - Plan and document in project plan chapter 5.2.1
    - Should be visible in project plan chapter 6

What will and won’t be tested (scope)
• Identify components and features of the software under test
  - High-enough abstraction level
  - Prioritize
• Both functional and non-functional aspects
• Consider time, resources and risks
  - Everything can’t be tested, and everything that is tested can’t be tested thoroughly
• Identify separately components and features that are not tested
• This course
  - Document in project plan chapter 5.2.2
  - For each iteration

Test case organization and tracking
• Prioritizing tests
  - The most severe failures
  - The most likely faults
  - Priorities of use cases
    - End-user prioritizing the requirements
  - Most faults in the past
  - Most complex or critical
  - Positive / negative
  - …
• Create test suites
  - Test-to-pass (positive testing)
  - Test-to-fail (negative testing)
  - Smoke test suite
  - Regression test suite
  - Functional suites
  - Different platforms
  - Priorities
  - …
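
One lightweight way to realise the suite-and-priority organization above is a plain data structure that can be filtered per test round; the suite names follow the slide, but the case records and the `select` helper are illustrative assumptions, not a prescribed course format.

```python
# Sketch of organizing test cases into suites with priorities.
# The case IDs, titles and the selection rule are invented for illustration.

test_cases = [
    {"id": "TC-1", "suite": "smoke",      "priority": 1, "title": "Login works"},
    {"id": "TC-2", "suite": "regression", "priority": 2, "title": "Saved file reopens"},
    {"id": "TC-3", "suite": "smoke",      "priority": 1, "title": "Main window opens"},
    {"id": "TC-4", "suite": "regression", "priority": 3, "title": "Tooltip text"},
]

def select(suite: str, max_priority: int):
    """Pick the cases to run this round: one suite, highest priorities first."""
    picked = [tc for tc in test_cases
              if tc["suite"] == suite and tc["priority"] <= max_priority]
    return sorted(picked, key=lambda tc: tc["priority"])

smoke = select("smoke", 1)
print([tc["id"] for tc in smoke])  # → ['TC-1', 'TC-3']
```

After each round, the same records can carry pass/fail fields, which answers the "what was tested" tracking questions from the earlier slide.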

Test approach (how)
• How testing is performed in general and in each iteration
  - Levels of testing
  - Functional, non-functional
  - Test techniques
    - Methods and techniques
    - Tools and automation
    - Exploratory testing: used in peer testing (use also to supplement the planned tests)
• What other QA activities are used and how
  - document/code reviews or inspections
  - coding standard
  - collecting continuous feedback from the customer
• Reporting and defect management procedures
  - how the testing results are utilized and the feedback provided to steering the project
• Scope of test documentation
  - On what level and how test cases are documented
  - What other test documentation is produced
• This course
  - Plan the approach and document it in the project plan
  - General approach in chapter 5.2.1
  - Details for each iteration in chapters 5.2.2

Resource requirements (who)
• People
  - How many, what expertise
  - Responsibilities
    - Define responsibilities
    - Identify limited / critical resources
• Equipment
  - Computers, test hardware, printers, tools
• Office and lab space
  - Where will they be located? How big will they be? How will they be arranged?
• Tools and documents
  - Word processors, databases, custom tools. What will be purchased, what needs to be written?
  - Location and availability
• Miscellaneous supplies
  - Disks, phones, reference books, training material. Whatever else might be needed over the course of the project.
• This course
  - Document in the project plan

Test environments
• Identification of test environments
  - Hardware, software, OS, network, …
• Prioritization and focusing test suites on each
  - Number of combinations can be huge
  - Regression testing in different environments
  - Scheduling implications
• Test lab
  - Different hardware and software platforms
  - Cleaning the machines
  - Setting up the test data
  - Moving from one platform to another
  - People vs. hardware needs
• This course
  - Plan carefully what is a realistic goal for testing in different environments
    - Quality goals of the project
    - Prioritize
  - Document your choices in the test plan

Testing tasks and schedule (when)
• Work Breakdown Structure (WBS)
  - Areas of the software
  - Testable features
  - Assigning responsibilities
• Mapping testing to the overall project schedule
  - Both duration and effort
• Build schedule
  - Number of test cycles
  - Regression tests
• Releases
  - External links, i.e. beta testing
• Consider using relative dates
• This course
  - Document in the project plan
  - If you are going to do e.g. usability testing and performance testing or code reviews, there should be corresponding tasks in the project schedule

QA planning during iteration I1 planning, DL 31.10.
• Project level
  - Identify quality goals
    - For what purpose
  - Plan QA approach (strategy)
    - What QA practices are used
    - How practices are used
    - How to achieve the goals
    - Deliverables and metrics
    - How the results are used
    - Document in project plan chapter 5.2
• Iteration level
  - What will be tested
    - Features, quality attributes
    - What won’t be tested
    - Priorities of testing
    - Document in project plan chapter 5.3
  - Details of the QA approach
  - Plan test case organization and tracking
  - Plan test environments and tools
  - Testing rounds
    - i.e., how many times and when certain tests are executed
  - Tasks and schedule
    - Resources
    - Responsibilities
    - Test deliverables
    - Document in the project plan chapter 6
• You have less than 2 weeks to do project-level and I1 QA planning!

Test Reporting

Defect tracking and reporting
• Why defect tracking
  - You don’t forget found defects
  - You get metrics
• Think what bugs are reported and when
  - During coding?
  - After inspection?
  - Not before system testing?
• Bug lifecycle
  - When and how bugs are managed
  - When and what bugs are fixed
    - Who decides, when and how
• Use Bugzilla or some other defect tracking system
  - Bugzilla provided by the course
• Document your choices in project plan chapter 5.2

Bug metrics

              Reported   Closed   Open
  I1                10        5       5
  I2                75       45      35
  Total             85       50      35

                            Block   Critical   Major   Minor   Trivial   Total
  Total open                    1          2       5      10        17      35
  This iteration reported       0          1      10      15        49      75

• Description of severe bugs found and open
• Other QA metrics
  - unit test coverage
  - code reviews
  - source code metrics
  - …
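
The totals in the tables above follow directly from the per-iteration counts, and it is worth recomputing them from whatever the tracking system exports rather than maintaining them by hand; this sketch assumes a simple dictionary export, which is an illustrative format, not Bugzilla's.

```python
# Recomputing the slide's bug metrics from raw per-iteration counts.
# The input format is an assumption for illustration, not a Bugzilla export.

iterations = {
    "I1": {"reported": 10, "closed": 5},
    "I2": {"reported": 75, "closed": 45},
}

def totals(data):
    """Total reported and closed bugs; open = reported - closed."""
    reported = sum(it["reported"] for it in data.values())
    closed = sum(it["closed"] for it in data.values())
    return {"reported": reported, "closed": closed, "open": reported - closed}

print(totals(iterations))  # → {'reported': 85, 'closed': 50, 'open': 35}
```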

Quality assessment 1/2

  Functional area    Coverage   Quality    Comments
  File conversions          2   good       Only a few minor defects found, very efficient implementation
  GUI editor                0   –          Not started
  Encoder                   3   bad        2 critical bugs found during last test round, lots of small problems
  Admin tools               1   not sure   Nothing serious yet

  Legend – Coverage: 0 = nothing, 1 = we looked at it, 2 = we checked all functions, 3 = it’s tested
  Legend – Quality: good / not sure / bad (smiley symbols on the original slide)

• Max 10–20 functional areas
• Testers’ assessment of the current quality status of the system
• You can plan your own qualitative scales

Quality assessment 2/2
• Evaluate the quality of the different functional areas of the system
  - how much effort has been put on test execution
  - what is the coverage of testing
  - what can you say about the quality of the particular component based on your test results and ’gut feeling’ during testing
  - e.g. is the number of reported bugs low because of lack of testing, or high because of intensive testing
• Assess the quality status of the system against the quality goals of the project

Test report and log
• Test report template provided
  - Summary of testing tasks and results
    - No detailed lists of passed and failed test cases
  - Includes evaluation of the quality
• Test log
  - Provides a chronological record of relevant details about the execution of tests
    - Who tested, when and what (version, revision, environment, etc.)
  - Lists all executed test cases
    - Results, remarks, bugs and issues of each test case
    - Execution date & time, used data files, etc.
  - See TestCaseMatrix.xls, for example

Test Case Design

Deriving test cases from use cases
• If the functional requirements are modelled as use cases, it is sensible to utilize them in functional testing
• Use case != test case
  - Testing is interested in the uncommon and abnormal scenarios
  - One use case leads to several test cases
• Prioritize use cases and use this prioritization when prioritizing tests
  - Prioritization in testing is the distribution of efforts
  - (Not the order of execution)
• Maintain traceability between use cases and test cases
• Use cases are not complete specifications
  - Testing only the conditions that are mentioned in the use case is usually not enough
• See Robert V. Binder’s “Extended Use Case Test Design Pattern”, http://www.rbsc.com/docs/TestPatternXUC.pdf

Use case example: access control use case
1. User slides a card through the card-reader
2. Card-reader scans employee ID from card
   - Exception 1: Card can’t be read → log event, use case ends
3. System validates employee access
   - Exception 2: Employee ID is invalid → log event, use case ends
4. System unlocks door for configured time period
   - Exception 3: System unable to unlock door → log event, set alarm condition, use case ends
5. User opens door
   - Exception 4: Door is not opened → system waits for timeout, system locks door, use case ends
6. User enters and door shuts
   - Exception 5: Door is not shut → system waits for timeout, system attempts to lock door, log event, set alarm condition, use case ends
7. System locks door
   - Exception 6: Door fails to lock → log event, set alarm condition, use case ends

Use case example: test cases
• Test Case 1: Valid employee card is used
  - Slide the card through the reader
  - Verify door is unlocked
  - Enter building
  - Verify door is locked
• Test Case 2: Card can’t be read
  - Swipe a card that is not valid
  - Verify event is logged
• Test Case 3: Invalid employee ID
  - Swipe card with invalid employee ID
  - Verify door is not unlocked
  - Verify event is logged
• Test Case 4: System unable to unlock door
  - Swipe card
  - “Injected” failure of unlocking mechanism
  - Verify event is logged
• Test Case 5: Door is not opened
  - Swipe card
  - Verify door is unlocked
  - Don’t open the door and wait until timeout is exceeded
  - Verify door is locked
• Test Case 6: Door is not shut after entry
  - Swipe card
  - Enter building
  - Hold door open until timeout is exceeded
  - Verify alarm is sounded
  - Verify event is logged
• Test Case 7: Door fails to lock
  - Swipe card
  - Enter building
  - “Injected” failure of locking mechanism
  - Verify alarm is sounded
  - Verify event is logged
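
Some of these scenarios can also be automated against a fake door controller; the `FakeDoorSystem` API below is invented here purely to show how use-case steps become executable checks, and is not part of the course example.

```python
# Hypothetical sketch: Test Cases 1-3 and 5 from the slide, automated
# against a fake door controller. The whole API is invented for illustration.

class FakeDoorSystem:
    def __init__(self, valid_ids):
        self.valid_ids = valid_ids
        self.locked = True
        self.log = []

    def swipe(self, employee_id):
        if employee_id in self.valid_ids:
            self.locked = False           # unlock for configured time period
        else:
            self.log.append(("invalid-id", employee_id))  # exception path: log event

    def timeout(self):
        self.locked = True                # exception 4: door never opened

door = FakeDoorSystem(valid_ids={"E42"})
door.swipe("E42")
assert not door.locked                    # verify door is unlocked
door.timeout()
assert door.locked                        # verify door is locked again
door.swipe("E99")
assert door.log == [("invalid-id", "E99")]  # verify event is logged
print("use-case tests passed")
```

The alarm and lock-failure scenarios (Test Cases 4, 6 and 7) would need injected failures, which is exactly what the "injected" steps on the slide describe.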

Error-guessing and ad hoc testing
• Always worth including
• After systematic techniques have been used
• Can find some faults that systematic techniques can miss
• Supplements systematic techniques
• Consider
  - Past failures
  - Intuition
  - Experience
  - Brainstorming: ”What is the craziest thing we can do?”
  - Lists in literature, error catalogs

Test Case Specification (IEEE Std 829)
1. Test-case-specification identifier
2. Test items: describes the detailed feature, code module and so on to be tested.
3. Input specifications: specifies each input required to execute the test case (by value with tolerances or by name).
4. Output specifications: describes the result expected from executing the test case. Results may be outputs and features (for example, response time) required of the test items.
5. Environmental needs: the hardware, software, test tools, facilities, staff, and so on needed to run the test case.
6. Special procedural requirements: describes any special constraints on the test procedures which execute this test case (special set-up, operator intervention, …).
7. Intercase dependencies: lists the identifiers of test cases which must be executed prior to this test case, and describes the nature of the dependencies.

A simple approach to test cases

  Test case ID:     TC-12.34.5
  Priority:         2
  Test case title:  Indent functionality
  Description:      • Indenting the current line to the right, and left.
                    • Indenting the selected lines to the right, and left.
                      - Moves the indentation, no aligning.
  Notes:            Req 12.34 …

• The common details of a test suite and test procedure (like test environment) are documented elsewhere
  - Avoiding copy-paste
• Test catalogs are utilized to describe common details of test cases
  - test all available ways of performing the function (menu, keyboard, GUI buttons, menu short-cut, short-cut keys, …)
  - test settings or preferences that affect this function
• This may leave too much space for an inexperienced tester

Test Catalogs
• A test catalog is a list of typical tests for a certain situation
• Based on experience of typical errors that developers make
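
A catalog can be applied mechanically: combine each short test case with every entry for its situation. The catalog entries and the `expand` helper below are invented for illustration, loosely following the "all ways of performing the function" example from the previous slide.

```python
# Sketch of applying a test catalog: each short test case is expanded with
# every catalog entry for its situation. Catalog contents are illustrative.

INVOCATION_CATALOG = [
    "via the menu",
    "via the keyboard shortcut",
    "via the toolbar button",
    "via the context menu",
]

def expand(case_title: str, catalog: list) -> list:
    """Combine one short test case with each catalog entry."""
    return [f"{case_title}: {entry}" for entry in catalog]

tests = expand("TC-12.34.5 Indent functionality", INVOCATION_CATALOG)
print(len(tests))  # → 4
```

This keeps individual test cases short while still covering the variations the catalog encodes.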

Common pitfalls in test case definition
• Poor test case organization
  - One big pile of test cases
  - Don’t know what a certain set of test cases actually tests, or which cases test a certain functionality
  - Don’t know what was tested after testing
• Testing the wrong things
  - Prioritize and select the most important tests
  - Consider the test case’s probability to reveal an important fault
• Writing too detailed step-by-step scripts
  - Not enough time for detailed scripting
  - Few detailed, but irrelevant test cases designed and executed -> bad quality of testing, no major defects found
  - Don’t program people

Example – how to manage test cases
• TestCaseMatrix.xls
• When do you write these test cases? (Hint: not at the end of the project)

Well-timed test design
• Early test design
  - test design finds faults
  - faults found early are cheaper to fix
  - most significant faults found first
  - faults prevented, not built in
  - no additional effort
  - test design causes requirement changes
• Not too early test case design
  - Design tests in implementation order
    - Start test design from the most completed and probable features
  - Test cases are designed during or after implementation, but incrementally
  - Avoiding anticipatory test design and deprecated, incorrect test cases that are not based on the actual features
  - Re-schedule test design if things change or specifications are not detailed enough for testing
• Test planning must begin early, test case design not necessarily