Effective Methods for Software Testing, Third Edition
William E. Perry
WILEY
Wiley Publishing, Inc.
Contents
Introduction
Part I  Assessing Testing Capabilities and Competencies
Chapter 1  Assessing Capabilities, Staff Competency, and User Satisfaction
  The Three-Step Process to Becoming a World-Class Testing Organization
  Step 1: Define a World-Class Software Testing Model
    Customizing the World-Class Model for Your Organization
  Step 2: Develop Baselines for Your Organization
    Assessment 1: Assessing the Test Environment
      Implementation Procedures
      Verifying the Assessment
    Assessment 2: Assessing the Capabilities of Your Existing Test Processes
    Assessment 3: Assessing the Competency of Your Testers
      Implementation Procedures
      Verifying the Assessment
  Step 3: Develop an Improvement Plan
  Summary
Part II  Building a Software Testing Environment
Chapter 2  Creating an Environment Supportive of Software Testing
  Minimizing Risks
    Risk Appetite for Software Quality
    Risks Associated with Implementing Specifications
    Faulty Software Design
    Data Problems
  Risks Associated with Not Meeting Customer Needs
  Developing a Role for Software Testers
  Writing a Policy for Software Testing
    Criteria for a Testing Policy
    Methods for Establishing a Testing Policy
  Economics of Testing
  Testing—An Organizational Issue
    Management Support for Software Testing
  Building a Structured Approach to Software Testing
    Requirements
    Design
    Program
    Test
    Installation
    Maintenance
  Developing a Test Strategy
    Use Work Paper 2-1
    Use Work Paper 2-2
  Summary
Chapter 3  Building the Software Testing Process
  Software Testing Guidelines
    Guideline #1: Testing Should Reduce Software Development Risk
    Guideline #2: Testing Should Be Performed Effectively
    Guideline #3: Testing Should Uncover Defects
      Defects Versus Failures
      Why Are Defects Hard to Find?
    Guideline #4: Testing Should Be Performed Using Business Logic
    Guideline #5: Testing Should Occur Throughout the Development Life Cycle
    Guideline #6: Testing Should Test Both Function and Structure
      Why Use Both Testing Methods?
      Structural and Functional Tests
  Using Verification and Validation Techniques
  Workbench Concept
    Testing That Parallels the Software Development Process
  Customizing the Software-Testing Process
    Determining the Test Strategy Objectives
    Determining the Type of Development Project
    Determining the Type of Software System
    Determining the Project Scope
    Identifying the Software Risks
    Determining When Testing Should Occur
  Defining the System Test Plan Standard
  Defining the Unit Test Plan Standard
  Converting Testing Strategy to Testing Tactics
    Process Preparation Checklist
  Summary
Chapter 4  Selecting and Installing Software Testing Tools
  Integrating Tools into the Tester's Work Processes
  Tools Available for Testing Software
  Selecting and Using Test Tools
    Matching the Tool to Its Use
    Selecting a Tool Appropriate to Its Life Cycle Phase
    Matching the Tool to the Tester's Skill Level
    Selecting an Affordable Tool
  Training Testers in Tool Usage
  Appointing Tool Managers
    Prerequisites to Creating a Tool Manager Position
    Selecting a Tool Manager
    Assigning the Tool Manager Duties
    Limiting the Tool Manager's Tenure
  Summary
Chapter 5  Building Software Tester Competency
  What Is a Common Body of Knowledge?
  Who Is Responsible for the Software Tester's Competency?
  How Is Personal Competency Used in Job Performance?
  Using the 2006 CSTE CBOK
    Developing a Training Curriculum
    Using the CBOK to Build an Effective Testing Team
  Summary
Part III  The Seven-Step Testing Process
Chapter 6  Overview of the Software Testing Process
  Advantages of Following a Process
  The Cost of Computer Testing
    Quantifying the Cost of Removing Defects
    Reducing the Cost of Testing
  The Seven-Step Software Testing Process
    Objectives of the Seven-Step Process
    Customizing the Seven-Step Process
    Managing the Seven-Step Process
  Using the Tester's Workbench with the Seven-Step Process
    Workbench Skills
  Summary
Chapter 7  Step 1: Organizing for Testing
  Objective
  Workbench
  Input
  Do Procedures
    Task 1: Appoint the Test Manager
    Task 2: Define the Scope of Testing
    Task 3: Appoint the Test Team
      Internal Team Approach
      External Team Approach
      Non-IT Team Approach
      Combination Team Approach
    Task 4: Verify the Development Documentation
      Development Phases
      Measuring Project Documentation Needs
      Determining What Documents Must Be Produced
      Determining the Completeness of Individual Documents
      Determining Documentation Timeliness
    Task 5: Validate the Test Estimate and Project Status Reporting Process
      Validating the Test Estimate
      Testing the Validity of the Software Cost Estimate
      Calculating the Project Status Using a Point System
  Check Procedures
  Output
  Summary
Chapter 8  Step 2: Developing the Test Plan
  Overview
  Objective
  Concerns
  Workbench
  Input
  Do Procedures
    Task 1: Profile the Software Project
      Conducting a Walkthrough of the Customer/User Area
      Developing a Profile of the Software Project
    Task 2: Understand the Project Risks
    Task 3: Select a Testing Technique
      Structural System Testing Techniques
      Functional System Testing Techniques
    Task 4: Plan Unit Testing and Analysis
      Functional Testing and Analysis
      Structural Testing and Analysis
      Error-Oriented Testing and Analysis
      Managerial Aspects of Unit Testing and Analysis
    Task 5: Build the Test Plan
      Setting Test Objectives
      Developing a Test Matrix
      Defining Test Administration
      Writing the Test Plan
Chapter 9  Step 3: Verification Testing
  Input
    The Requirements Phase
    The Design Phase
    The Programming Phase
  Do Procedures
    Task 1: Test During the Requirements Phase
      Requirements Phase Test Factors
      Preparing a Risk Matrix
      Performing a Test Factor Analysis
      Conducting a Requirements Walkthrough
      Performing Requirements Tracing
      Ensuring Requirements Are Testable
    Task 2: Test During the Design Phase
      Scoring Success Factors
      Analyzing Test Factors
      Conducting a Design Review
      Inspecting Design Deliverables
    Task 3: Test During the Programming Phase
      Desk Debugging the Program
      Performing Programming Phase Test Factor Analysis
      Conducting a Peer Review
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 10  Step 4: Validation Testing
  Do Procedures
    Task 1: Build the Test Data
      Sources of Test Data/Test Scripts
      Testing File Design
      Defining Design Goals
      Entering Test Data
      Applying Test Files Against Programs That Update Master Records
      Creating and Using Test Data
      Payroll Application Example
      Creating Test Data for Stress/Load Testing
      Creating Test Scripts
    Task 2: Execute Tests
    Task 3: Record Test Results
      Documenting the Deviation
      Documenting the Effect
      Documenting the Cause
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 11  Step 5: Analyzing and Reporting Test Results
  Overview
  Concerns
  Workbench
  Input
    Test Plan and Project Plan
    Expected Processing Results
    Data Collected During Testing
      Test Results Data
      Test Transactions, Test Suites, and Test Events
      Defects
      Efficiency
    Storing Data Collected During Testing
  Do Procedures
    Task 1: Report Software Status
      Establishing a Measurement Team
      Creating an Inventory of Existing Project Measurements
      Developing a Consistent Set of Project Metrics
      Defining Process Requirements
      Developing and Implementing the Process
      Monitoring the Process
    Task 2: Report Interim Test Results
      Function/Test Matrix
      Functional Testing Status Report
      Functions Working Timeline Report
      Expected Versus Actual Defects Uncovered Timeline Report
      Defects Uncovered Versus Corrected Gap Timeline Report
      Average Age of Uncorrected Defects by Type Report
      Defect Distribution Report
      Normalized Defect Distribution Report
      Testing Action Report
      Interim Test Report
    Task 3: Report Final Test Results
      Individual Project Test Report
      Integration Test Report
      System Test Report
      Acceptance Test Report
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 12  Step 6: Acceptance and Operational Testing
  Overview
  Objective
  Concerns
  Workbench
  Input
  Procedures
    Task 1: Acceptance Testing
      Defining the Acceptance Criteria
      Developing an Acceptance Plan
      Executing the Acceptance Plan
      Developing Test Cases (Use Cases) Based on How Software Will Be Used
    Task 2: Pre-Operational Testing
      Testing New Software Installation
      Testing the Changed Software Version
      Monitoring Production
      Documenting Problems
    Task 3: Post-Operational Testing
      Developing and Updating the Test Plan
      Developing and Updating the Test Data
      Testing the Control Change Process
      Conducting Testing
      Developing and Updating Training Material
  Check Procedures
  Output
    Is the Automated Application Acceptable?
    Automated Application Segment Failure Notification
    Is the Manual Segment Acceptable?
    Training Failure Notification Form
  Guidelines
  Summary
  How Much Testing Is Enough?
Chapter 14  Software Development Methodologies
  Overview
    Methodology Types
    Software Development Life Cycle
  Defining Requirements
    Categories
    Attributes
  Methodology Maturity
  Competencies Required
    Staff Experience
  Configuration-Management Controls
    Basic CM Requirements
    Planning
    Data Distribution and Access
    CM Administration
    Configuration Identification
    Configuration Control
  Measuring the Impact of the Software Development Process
  Summary
Chapter 15  Testing Client/Server Systems
  Overview
  Concerns
  Workbench
  Input
  Do Procedures
    Task 1: Assess Readiness
      Software Development Process Maturity Levels
      Conducting the Client/Server Readiness Assessment
      Preparing a Client/Server Readiness Footprint Chart
    Task 2: Assess Key Components
    Task 3: Assess Client Needs
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 16  Rapid Application Development Testing
  Overview
  Objective
  Concerns
    Testing Iterations
    Testing Components
    Testing Performance
    Recording Test Information
  Workbench
  Input
  Do Procedures
    Testing Within Iterative RAD
    Spiral Testing
    Task 1: Determine Appropriateness of RAD
    Task 2: Test Planning Iterations
    Task 3: Test Subsequent Planning Iterations
    Task 4: Test the Final Planning Iteration
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 17  Testing Internal Controls
  Detective Controls
    Data Transmission
    Control Register
    Control Totals
    Documenting and Testing Output Checks
  Corrective Controls
    Error Detection and Resubmission
    Audit Trails
  Cost/Benefit Analysis
  Assessing Internal Controls
    Task 1: Understand the System Being Tested
    Task 2: Identify Risks
    Task 3: Review Application Controls
    Task 4: Test Application Controls
      Testing Without Computer Processing
      Testing with Computer Processing
      Transaction Flow Testing
      Objectives of Internal Accounting Controls
      Results of Testing
    Task 5: Document Control Strengths and Weaknesses
  Quality Control Checklist
  Summary
Chapter 18  Testing COTS and Contracted Software
  Overview
    COTS Software Advantages, Disadvantages, and Risks
    COTS Versus Contracted Software
    COTS Advantages
    COTS Disadvantages
    Implementation Risks
    Testing COTS Software
    Testing Contracted Software
  Objective
  Concerns
  Workbench
  Input
  Do Procedures
    Task 1: Test Business Fit
      Step 1: Testing Needs Specification
      Step 2: Testing CSFs
    Task 2: Test Operational Fit
      Step 1: Test Compatibility
      Step 2: Integrate the Software into Existing Work Flows
      Step 3: Demonstrate the Software in Action
    Task 3: Test People Fit
    Task 4: Acceptance-Test the Software Process
      Step 1: Create Functional Test Conditions
      Step 2: Create Structural Test Conditions
    Modifying the Testing Process for Contracted Software
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 19  Testing in a Multiplatform Environment
  Overview
  Objective
  Concerns
  Background on Testing in a Multiplatform Environment
  Workbench
  Input
  Do Procedures
    Task 1: Define Platform Configuration Concerns
    Task 2: List Needed Platform Configurations
    Task 3: Assess Test Room Configurations
    Task 4: List Structural Components Affected by the Platform(s)
    Task 5: List Interfaces the Platform Affects
    Task 6: Execute the Tests
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 20  Testing Software System Security
  Overview
  Objective
  Concerns
  Workbench
  Input
    Where Vulnerabilities Occur
      Functional Vulnerabilities
      Vulnerable Areas
    Accidental Versus Intentional Losses
  Do Procedures
    Task 1: Establish a Security Baseline
      Why Baselines Are Necessary
      Creating Baselines
      Using Baselines
    Task 2: Build a Penetration-Point Matrix
      Controlling People by Controlling Activities
      Selecting Security Activities
      Controlling Business Transactions
      Characteristics of Security Penetration
      Building a Penetration-Point Matrix
    Task 3: Analyze the Results of Security Testing
      Evaluating the Adequacy of Security
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 21  Testing a Data Warehouse
  Overview
  Concerns
  Workbench
  Input
  Do Procedures
    Task 1: Measure the Magnitude of Data Warehouse Concerns
    Task 2: Identify Data Warehouse Activity Processes to Test
      Organizational Process
      Data Documentation Process
      System Development Process
      Access Control Process
      Data Integrity Process
      Operations Process
      Backup/Recovery Process
      Performing Task 2
    Task 3: Test the Adequacy of Data Warehouse Activity Processes
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 22  Testing Web-Based Systems
  Overview
  Concerns
  Workbench
  Input
  Do Procedures
    Task 1: Select Web-Based Risks to Include in the Test Plan
      Security Concerns
      Performance Concerns
      Correctness Concerns
      Compatibility Concerns
      Reliability Concerns
      Data Integrity Concerns
      Usability Concerns
      Recoverability Concerns
    Task 2: Select Web-Based Tests
      Unit or Component
      Integration
      System
      User Acceptance
      Performance
      Load/Stress
      Regression
      Usability
      Compatibility
    Task 3: Select Web-Based Test Tools
    Task 4: Test Web-Based Systems
  Check Procedures
  Output
  Guidelines
  Summary
Chapter 23  Using Agile Methods to Improve Software Testing
  The Importance of Agility
  Building an Agile Testing Process
    Agility Inhibitors
    Is Improvement Necessary?
  Compressing Time
    Challenges
    Solutions
    Measuring Readiness
    The Seven-Step Process
  Summary
Chapter 24  Building Agility into the Testing Process
  Step 1: Measure Software Process Variability
    Timelines
    Process Steps
    Workbenches
    Time-Compression Workbenches
    Reducing Variability
    Developing Timelines
    Improvement Shopping List
    Quality Control Checklist
    Conclusion
  Step 2: Maximize Best Practices
    Tester Agility
    Software Testing Relationships
    Tradeoffs
    Capability Chart
    Measuring Effectiveness and Efficiency
    Improvement Shopping List
    Quality Control Checklist
    Conclusion
  Step 3: Build on Strength, Minimize Weakness
    Effective Testing Processes
    Poor Testing Processes
    Improvement Shopping List
    Quality Control Checklist
    Conclusion
  Step 4: Identify and Address Improvement Barriers
    The Stakeholder Perspective
      Stakeholder Involvement
      Performing Stakeholder Analysis
    Red-Flag/Hot-Button Barriers
    Staff-Competency Barriers
    Administrative/Organizational Barriers
    Determining the Root Cause of Barriers/Obstacles
    Addressing the Root Cause of Barriers/Obstacles
    Quality Control Checklist
    Conclusion
  Step 5: Identify and Address Cultural and Communication Barriers
    Management Cultures
      Culture 1: Manage People
      Culture 2: Manage by Process
      Culture 3: Manage Competencies
      Culture 4: Manage by Fact
      Culture 5: Manage Business Innovation
    Cultural Barriers
      Identifying the Current Management Culture
      Identifying the Barriers Posed by the Culture
      Determining What Can Be Done in the Current Culture
      Determining the Desired Culture for Time Compression
      Determining How to Address Culture Barriers
    Open and Effective Communication
      Lines of Communication
      Information/Communication Barriers
      Effective Communication
    Quality Control Checklist
    Conclusion
  Step 6: Identify Implementable Improvements
    What Is an Implementable?
    Identifying Implementables via Time Compression
    Prioritizing Implementables
    Documenting Approaches
    Quality Control Checklist
    Conclusion
  Step 7: Develop and Execute an Implementation Plan
    Planning
    Implementing Ideas
    Requisite Resources
    Quality Control Checklist
    Conclusion
  Summary
Index