A Practical Approach to Verification and Validation

Dr. Eugene W. P. Bingue, U.S. Navy ([email protected])
Dr. David A. Cook, Stephen F. Austin State University ([email protected])
V&V - It's all about Quality!
The Three "Domains" of V&V

[Figure: three linked domains, each paired with a V&V question]
• User domain – requirements validation: used right?
• Problem domain – program validation: fit for intended use?
• Tool domain – verification: built well?
Verification and Validation – Definitions
• Verification – the process of determining that a model implementation accurately represents the developer's conceptual description and specifications. "Did I build the system right?"
• Validation – the process of determining the manner and degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. "Did I build the right system?"
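The two definitions above can be contrasted in code. This is a minimal, hypothetical sketch: the free-fall model, its tolerance, and the "measured" drop-test data are all invented for illustration. Verification checks the implementation against the developer's specification (the equation itself); validation compares model output against real-world observations within a tolerance fit for the intended use.

```python
import math

# Hypothetical model under V&V: free-fall distance, d = 0.5 * g * t^2.
G = 9.81  # m/s^2

def fall_distance(t: float) -> float:
    """Distance fallen (m) after t seconds, per the spec equation."""
    return 0.5 * G * t * t

# Verification: "Did I build the system right?" - check the code
# against the developer's specification.
def verify() -> bool:
    t = 2.0
    expected = 0.5 * G * t ** 2   # value the spec demands
    return math.isclose(fall_distance(t), expected)

# Validation: "Did I build the right system?" - compare model output
# against real-world observations, within an agreed tolerance.
def validate(observations, tolerance=0.05) -> bool:
    return all(
        abs(fall_distance(t) - d) / d <= tolerance
        for t, d in observations
    )

# Illustrative (made-up) drop-test measurements: (time s, distance m).
measured = [(1.0, 4.9), (2.0, 19.7), (3.0, 44.0)]
print(verify())            # True
print(validate(measured))  # True
```

Note that the same model could pass verification (the code matches the spec) while failing validation (the spec itself does not match reality), which is exactly why both activities are needed.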
V&V vs. Testing
• Testing is a discrete phase
• VV&A should occur during each phase

"The first mistake that people make is thinking that the testing team is responsible for assuring quality."
Brian Marick, as quoted in Pressman, Software Engineering: A Practitioner's Approach
Non-Simulation
"It works well, and as I expected!"

Simulation
"It gives the right solutions!"
Accreditation (for simulations)
• Accreditation is the official certification that a model or simulation is acceptable for use for a specific application
• Three steps:
  – Identify gaps in the program (what it WON'T do)
  – Assess the risks
  – Recommend acceptable uses, and list limitations
Why V&V?
"When quality is vital, independent checks are necessary, not because people are untrustworthy but because they are human."
Watts Humphrey, Managing the Software Process
Mature Process for System Development – The V&V View

[Figure: V-model diagram. The left leg holds the user's and developer's development products; the right leg holds the matching test activities.]
• User's views → requirements analysis → requirements + user trial plan ↔ acceptance testing → delivered system
• System spec + accept test plan ↔ system & integration testing → integrated system
• System design + integration test plan
• Module design + unit test plan → module coding → coded units ↔ unit testing → modules
Source: Ould and Unwin, Testing in Software Development, 1988
Basic (and Practical) VV&A: A Taxonomy for V&V

[Figure: the V-model annotated with V&V activities at each stage, under the slogan "Right product, built right"]
Right product (validation side):
• Inspect requirements; inspect conceptual model
• Formal document review of user's views
• Validate equations/algorithms
Built right (verification side):
• Inspect design
• Inspect code; verify equations/algorithms
• Inspect test plans/test results; inspect CM practices
• Functionality testing (alongside acceptance, integration, and unit testing)
• VV&C input/default data
Potential Verification & Validation Techniques (Source: DMSO Best Practices)

Informal: audit; cause-effect graphing; desk checking; inspections; reviews; Turing test; walkthroughs; face validation

Static: control analysis (calling structure, concurrent process, control flow, state transition); data analysis (data dependency, data flow); fault/failure analysis; interface analysis (model interface, user interface); semantic analysis; structural analysis; symbolic evaluation; syntax analysis; traceability assessment

Dynamic: acceptance testing; alpha testing; assertion checking; beta testing; bottom-up testing; comparison testing; compliance testing (authorization, performance, security, standards); debugging; execution testing (monitoring, profiling, tracing); fault/failure insertion testing; field testing; functional (black-box) testing; graphical comparisons; interface testing (data, model, user); object-flow testing; partition testing; predictive validation; product testing; regression testing; sensitivity analysis; special input testing (boundary value, equivalence partitioning, extreme input, invalid input, real-time input, self-driven input, stress, trace-driven input); statistical techniques; structural (white-box) testing (branch, condition, data flow, loop, path, statement); submodel/module testing; symbolic debugging; top-down testing; visualization/animation

Formal: induction; inference; logical deduction; inductive assertions; lambda calculus; predicate calculus; predicate transformation; proof of correctness
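Two of the "special input" techniques listed above, equivalence partitioning and boundary value testing, can be sketched concretely. The program under test here is hypothetical: a simple validity check for a percentage input.

```python
# Hypothetical program under test: accepts values in [0, 100].
def valid_percentage(x: float) -> bool:
    return 0 <= x <= 100

# Equivalence partitioning: one representative value per input class.
partitions = {"below": -5.0, "inside": 50.0, "above": 140.0}

# Boundary value testing: probe each edge of the valid range.
boundaries = [-0.001, 0.0, 100.0, 100.001]

results = {name: valid_percentage(v) for name, v in partitions.items()}
print(results)
# {'below': False, 'inside': True, 'above': False}
print([valid_percentage(b) for b in boundaries])
# [False, True, True, False]
```

The point of both techniques is economy: rather than testing every possible input, you test one member of each class plus the values at the edges, where off-by-one defects cluster.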
V&V Techniques
• Informal V&V techniques are among the most commonly used. They are called informal because their tools and approaches rely heavily on human reasoning and subjectivity, without stringent mathematical formalism.
• Static V&V techniques assess the accuracy of the static model design and source code. Static techniques do not require machine execution of the model, though mental execution can be used. These techniques are very popular and widely used, and many automated tools are available to assist in the V&V process. Static techniques can reveal a variety of information about the structure of the model, the modeling techniques used, data and control flow within the model, and syntactical accuracy (Whitner and Balci, 1989).
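One static technique from the taxonomy, calling-structure analysis, can be sketched with Python's standard `ast` module. The model source below is a made-up stand-in; the key property is that it is parsed, never executed.

```python
import ast

# Hypothetical model source to analyze statically (never executed).
MODEL_SOURCE = """
def update_state(s):
    return integrate(s)

def integrate(s):
    return s + step(s)

def step(s):
    return 0.1 * s
"""

def calling_structure(source: str) -> dict:
    """Map each function name to the functions it calls, found by
    walking the abstract syntax tree rather than running the code."""
    tree = ast.parse(source)
    graph = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            calls = {
                c.func.id
                for c in ast.walk(node)
                if isinstance(c, ast.Call) and isinstance(c.func, ast.Name)
            }
            graph[node.name] = sorted(calls)
    return graph

print(calling_structure(MODEL_SOURCE))
# {'update_state': ['integrate'], 'integrate': ['step'], 'step': []}
```

A reviewer (or an automated tool) can then compare this extracted call graph against the design documents, which is exactly the kind of structural information the paragraph above describes.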
V&V Techniques (continued)
• Dynamic V&V techniques require model execution; they evaluate the model based on its execution behavior. Most dynamic V&V techniques require model instrumentation: the insertion of additional code (probes or stubs) into the executable model to collect information about model behavior during execution.
  – Dynamic V&V techniques are usually applied in three steps:
    • the executable model is instrumented
    • the instrumented model is executed
    • the model output is analyzed, and dynamic model behavior is evaluated
• Formal V&V techniques (or formal methods) are based on formal mathematical proofs of correctness and are the most thorough means of model V&V. The successful application of formal methods requires the model development process to be well defined and structured. Formal methods should be applied early in the model development process to achieve maximum benefit. Because formal techniques require significant effort, they are best applied to complex problems that cannot be handled by simpler methods.
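The three-step dynamic V&V pattern can be sketched in miniature. Everything here is hypothetical: the probe is a decorator, the "model" is a toy decay function, and the invariant checked in step 3 is invented for illustration.

```python
import functools

trace = []  # collected observations of model behavior

# Step 1: instrument the executable model with a probe.
def probe(fn):
    """Record every call to fn along with its result."""
    @functools.wraps(fn)
    def wrapper(*args):
        result = fn(*args)
        trace.append((fn.__name__, args, result))
        return result
    return wrapper

@probe
def decay(population: float) -> float:
    """Toy model step: 10% decay per tick."""
    return population * 0.9

# Step 2: execute the instrumented model.
p = 100.0
for _ in range(3):
    p = decay(p)

# Step 3: analyze the trace and evaluate dynamic behavior -
# here, the expected invariant that population never increases.
results = [r for (_, _, r) in trace]
assert all(b <= a for a, b in zip(results, results[1:]))
print(results)  # [90.0, 81.0, 72.9]
```

In a real model the probe would typically log to a file for offline analysis; the essential idea, collecting behavior during execution rather than reasoning about the source, is the same.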
Verification and Validation Technique Taxonomy

[Table: the techniques below cross-referenced against specific V&V activities]

Informal techniques: audit; desk check; face validation; inspection; review; Turing test; walk-through

Static techniques: cause-effect graphing; control analysis (calling structure, concurrent process, control flow, state transition); data analysis (data dependency, data flow); fault/failure analysis; interface analysis (model interface, user interface); semantic analysis; structural analysis; symbolic evaluation; syntax analysis; traceability assessment

Dynamic techniques: acceptance test; alpha test; assertion check; beta test; bottom-up test; comparison test; compliance tests (authorization, performance, security, standards); debugging; execution tests (monitor, profile, trace); fault/failure insertion test; field test; functional (black-box) test; graphical comparison; interface tests (data, model, user); object-flow test; partition test; predictive validation; product test; regression test; sensitivity analysis; special input tests (boundary value, equivalence partitioning, extreme input, invalid input, real-time input, self-driven input, stress, trace-driven input); statistical techniques; structural (white-box) tests (branch, condition, data flow, loop, path, statement); submodel/module test; symbolic debugging; top-down test; visualization/animation

Formal techniques: induction; inference; logical deduction; inductive assertion; lambda calculus; predicate calculus; predicate transformation; proof of correctness
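The inductive-assertion idea behind proofs of correctness can be sketched at run time. This is only an illustration of the reasoning pattern: a real formal proof would discharge the annotated invariant symbolically with a verification tool, not by executing asserts. The function and its invariant are hypothetical.

```python
def sum_to(n: int) -> int:
    """Compute 0 + 1 + ... + n, annotated with the loop invariant
    total == i * (i + 1) // 2."""
    total, i = 0, 0
    assert total == i * (i + 1) // 2      # invariant holds on entry
    while i < n:
        i += 1
        total += i
        assert total == i * (i + 1) // 2  # invariant preserved each step
    # Invariant plus the exit condition (i == n) imply the postcondition:
    assert total == n * (n + 1) // 2
    return total

print(sum_to(10))  # 55
```

The proof obligation is exactly the three assertions: the invariant holds initially, every loop iteration preserves it, and on exit the invariant plus the loop condition imply the result you wanted.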
What activities do you select? • It depends upon:
  – Available time
  – Available funds
  – Confidence in the development process and the developers
  – Accreditation needs (very important for simulations)
  – Type of activity
  – User needs and desires
  – "Criticality" of the application
• Formally, you should document the activities you will perform in a V&V Plan
Lessons Learned: Tricks and Traps in V&V
Lesson 1
• Identify intended uses of the product early
  – Create use cases, scenarios, or an SRS
  – Verify and validate the requirements. Then do it again.
  – Keep the requirements separate and current.
  – Insist on a design (or future maintenance will be problematic).
• Plan for V&V early
• Insist on user involvement in V&V of requirements
Lesson 2
• Software Engineering 101: ten 1,000-line programs are easier to V&V than one 10,000-line program
• Separate different classes of uses and users. Plan and design accordingly.
• You MUST have a design.
Lesson 3
• Determine acceptability criteria as early as possible
  – Determine how you will know when the product is "good enough"
  – Know what the user really needs: "perfect" vs. the "80% solution"
  – This is another way of saying "the requirements must be very clear" and "the user must agree with the developers as to what the requirements are"
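One practical way to pin down "good enough" is to write the acceptability criteria as an executable check before development starts, so they cannot drift during arguments at delivery time. The thresholds and field names below are hypothetical.

```python
# Hypothetical acceptability criteria, agreed with the user up front.
ACCEPTANCE = {
    "max_relative_error": 0.20,   # the agreed "80% solution"
    "max_response_ms": 500,
}

def acceptable(relative_error: float, response_ms: float) -> bool:
    """True when the product meets every user-agreed criterion."""
    return (relative_error <= ACCEPTANCE["max_relative_error"]
            and response_ms <= ACCEPTANCE["max_response_ms"])

print(acceptable(0.15, 320))  # True - meets both criteria
print(acceptable(0.15, 900))  # False - accurate enough, but too slow
```

Because the criteria live in one data structure, a change to "good enough" becomes a visible, reviewable edit rather than a verbal renegotiation.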
Lesson 4
• Keep track of "complex" requirements
  – Accuracy
  – Fidelity
  – Speed
  – Response time
  – Interfaces
  – Interoperability
  – Real-time requirements
• You will need domain-specific expertise for these areas
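A "complex" requirement such as response time can be checked with a small measurement harness rather than by inspection. This is a hypothetical sketch: the 50 ms budget, the workload, and the number of runs are all invented, and a real-time system would need a much more rigorous worst-case analysis than wall-clock sampling.

```python
import time

RESPONSE_BUDGET_S = 0.050  # hypothetical 50 ms response-time requirement

def model_step():
    """Stand-in workload for the operation under the requirement."""
    return sum(i * i for i in range(10_000))

def within_budget(fn, budget_s=RESPONSE_BUDGET_S, runs=5) -> bool:
    """Take the worst-case latency over several runs, since a single
    lucky measurement proves nothing about a response-time requirement."""
    worst = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        worst = max(worst, time.perf_counter() - start)
    return worst <= budget_s

print(within_budget(model_step))
```

Keeping the budget as a named constant ties the test back to the requirement it verifies, which helps when requirements are traced during V&V.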
Lesson 5
• Start the V&V early (which is a nice way of saying "FUND the V&V early")
• Manage, organize, and update the V&V artifacts
• Do not confuse V&V with testing
Lesson 6 – Use a Taxonomy
• Conceptual model (SRS, briefing, conversation)
• Requirements (formal and informal)
• Equations and algorithms
• Design (validity, coupling, cohesion)
• Code (documentation and coding standards)
• Equations / algorithms / dimensional analysis
• Test plans, test results
• Check input data / default values / constants
• Functionality check (final user approval)
• Configuration management
• Documentation
• Risks
Lesson 7
• Know the limits of YOUR expertise
• Know when to use:
  – Subject Matter Experts (SMEs), who usually don't understand program development but can help you understand "the real world"
  – Statisticians and mathematicians
  – Domain experts, who can translate SME input into program requirements
  – Specialized domain-specific developers
Lesson 8
• Configuration management is critical. Small changes to the program can invalidate V&V results.
• It is critical to re-evaluate program results, and perform incremental V&V, after any changes that might affect the validity of the program.
• Limit access to the code and requirements.
• Update requirements and design as needed.
• Save the test cases you use for V&V – you will need to reuse them frequently!
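Saving the V&V test cases as versioned data makes the rerun after every change (regression testing) almost free. A minimal sketch, with a deliberately trivial program under test and made-up case data; storing the cases as JSON lets configuration management track them alongside the code they validate.

```python
import json

# Saved V&V cases, kept under configuration management as data.
SAVED_CASES = json.loads("""
[
  {"input": 0.0,   "expected": 32.0},
  {"input": 100.0, "expected": 212.0},
  {"input": -40.0, "expected": -40.0}
]
""")

def c_to_f(celsius: float) -> float:
    """Program under V&V (trivial here for illustration)."""
    return celsius * 9.0 / 5.0 + 32.0

def run_regression(cases) -> list:
    """Return the saved cases that fail after the latest change."""
    return [c for c in cases
            if abs(c_to_f(c["input"]) - c["expected"]) > 1e-9]

failures = run_regression(SAVED_CASES)
print(failures)  # [] - every saved case still passes
```

If a "small change" to `c_to_f` later broke any of these cases, the regression run would surface exactly which saved V&V result the change invalidated.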
Final Lessons
• The cost of V&V is a small part of overall development
• V&V saves you time and money
  – Without V&V, you get stuck in the "code – fix – code – fix" loop late in the development process
  – The risk of overall program failure increases dramatically without V&V
• V&V pays for itself and saves you $$s in decreased future maintenance costs