In the waterfall model, we go through the stages in sequence, so testing comes after design and coding
Testing can be viewed as Verification and Validation of the designed code and architecture
There are, in general, two approaches: White Box testing and Black Box testing
Testers can see the code being tested in White Box testing and can make modifications as testing goes; "white" here means transparent
In Black Box testing, the code under test is "put" into a black box, so testers exercise the code from the outside and document (trace) each bug and its error path
White Box testing can be regarded as "pure code testing", where testers know exactly what to test. The use of debuggers is allowed
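Because White Box testers can see the code, they can check which statements a test suite actually exercises. A minimal sketch of the idea in Python, using the standard `sys.settrace` hook to record which lines of a function run (the function `absval` is just a made-up example under test):

```python
import sys

def absval(x):
    # Two branches: a thorough white-box test must exercise both
    if x < 0:
        return -x
    return x

executed = set()  # line numbers inside absval that actually ran

def tracer(frame, event, arg):
    if event == "line" and frame.f_code.co_name == "absval":
        executed.add(frame.f_lineno)
    return tracer

sys.settrace(tracer)
absval(-5)  # takes the negative branch
absval(3)   # takes the positive branch
sys.settrace(None)

print(len(executed))  # number of distinct source lines of absval executed
```

Real projects use a coverage tool rather than a hand-rolled tracer, but the mechanism is the same: "exercise every single line of code" becomes a measurable property.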
A very strict type of White Box testing is "certification testing", as some companies will grant a certificate after certification testing is done. The process generally works as follows:
A set of standards is set for obtaining certification status, for example:
Support for 1000 users
Graceful exit when running out of memory
State consistency after power failure
Exercise every single line of code
Running for 10 hours
Recover from network failure
An agent (a separate company or group of people) is used to perform a set of tests against the above standards
A set of tools will be provided for testing (e.g. tools for controlling memory allocation, simulating multiple user logons, etc.)
The organization will be presented with the test results and will decide if the product being tested reaches the certification standard
A certificate is granted and the result is published for the public (web site, journal, magazine, product newsletter, etc.)
Black Box testing
Testers are not given the code to look at
Given: Input, Configuration ... Test for correct output and behavior
Emphasis is on different configurations and setups
Used to test general and complex customer setups and configurations
Ideal for product level test (end-user level test)
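As a concrete sketch: a black-box test only feeds inputs and configurations to the unit and checks outputs against the documented contract. Here Python's built-in `sorted()` stands in for the code in the box; the tester never looks inside it:

```python
# Each case: (input, configuration, expected output).
# Only the documented contract of sorted() is tested, never its internals.
cases = [
    ([3, 1, 2],   {},                [1, 2, 3]),
    ([3, 1, 2],   {"reverse": True}, [3, 2, 1]),
    (["bb", "a"], {"key": len},      ["a", "bb"]),
    ([],          {},                []),          # boundary configuration
]

results = [sorted(data, **config) == expected
           for data, config, expected in cases]
print(all(results))
```

Note how the configurations (`reverse`, `key`) are varied deliberately, matching the stress on "different configurations and setups" above.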
Unit Test - Testing by developer
Functional Test - Testing of functions (by testers or developers)
Component Test - Testing of components (a component can have one or more functions, by testers)
Product Level Test - Testing of the whole product (by testers)
System Level Test - Testing of the whole system (if you are shipping hardware as well as software, by testers)
Regression Test - Used when shipping only a refresh (no major change, e.g. from Win 95 to Win 95 with Fix Pack 1 installed)
Stress Test - Testing in an environment with very heavy usage (by testers)
NLS Test - Testing based on multiple languages (for product with common code base for multiple languages, by testers)
Fix Test - (Also called PTF test, testing for the Fixes developer put out to customers, by developers, testers)
Track the Test Plans, which should be written by developers and testers
Track the number of defects from testing
Document the defects info (cause, environment, etc)
Assign priority to the defects and fix them in order
Track the progress of testing
Planned test cases
Attempted test cases
Successful test cases
Number of defects
Plot an S-curve for progress info
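A minimal sketch of how these counts feed the S-curve, using hypothetical weekly numbers; plotting the cumulative attempted (or successful) counts against time gives the characteristic S shape:

```python
# Cumulative counts per week (hypothetical data for illustration)
planned    = [10, 25, 45, 60, 70, 75, 78, 80]
attempted  = [ 5, 18, 40, 58, 68, 74, 77, 80]
successful = [ 4, 15, 35, 52, 64, 71, 75, 78]

# Open defects implied by attempts that did not succeed
defects = [a - s for a, s in zip(attempted, successful)]

# Progress: fraction of the total plan attempted so far (the S-curve y-axis)
progress = [a / planned[-1] for a in attempted]

print(defects[-1], round(progress[-1], 2))
```

The slow start, steep middle, and flattening tail of the `progress` series are what make the plotted curve an "S".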
Schedule testing just like Requirement, Design and Coding
(Remember ... Traceability! ... What happened, why, when, how, etc.)
Depending on what you work on, there are lots of tools to pick from
Testing automation with rtest, STRBAT, WinTest, etc
GUIs can also be tested with scripts written to exercise all the "window" functions and check for return codes (thus, you have to PLAN AHEAD during design for automated testing)
Custom tools are often needed for non-application-level products, such as compilers, drivers, database services, etc.
Testing is used to test the designed code; can we expand the idea to other stages in the Waterfall Model?
Yes. In fact, it is very important to make sure that the Requirement Specification is correct, since all of our design and coding are based on the RS
Instead of "Testing of Requirement Specifications", we call this Verification and Validation of Requirement Specifications
(The approach in here is derived from theory)
The V & V criteria, as Boehm described, are as follows:
Completeness -- if all of its parts are present and each part is fully developed
Consistency -- if no conflicts exist in the specification
Feasibility -- it is possible to create the software and the benefit exceeds the cost
Testability -- there exists a feasible technique for determining whether the developed software satisfies the specification
We usually focus on completeness and consistency (C&C)
Internal C&C refers to the fact that the specification is complete with respect to its internal structures and no conflicts exist within the specification.
External C&C means that the specification completely covers the contents of external specifications or documents and no conflicts exist with external documents.
Requirement Specifications C&C
A requirement specification is externally complete and consistent if the specification captures all user requirements of the proposed system and follows all the constraints specified by the application domain and the application itself.
Design Specification C&C
A design specification is externally complete and consistent if all the requirements in the requirement specification are captured in the design specification and no conflict exists between the design specification and the requirement specification.
Checking specifications is inherently difficult.
A specification follows the constraints of its application domain (different rules for different domains)
The task of checking and testing is as difficult as the initial creation
Logical separation of bugs
Boehm -- TBDs, nonexistent references, missing specification items, missing functions and missing products
These can be further classified:
TBDs and nonexistent references are incompleteness bugs with respect to the specification language.
Missing specification items are incompleteness bugs with respect to both the specification language and the analysis techniques, and even the applications.
Helps in developing a relatively complete set of rules
Reusability of checking rules
Same specification language -- reuse the constraints of the specification
Same analysis techniques -- reuse the constraints of the analysis techniques
Same application domain -- reuse the constraints of the application domain
Same specific application -- reuse the constraints of the specific application
Allow for automation
1. Formalism Layer
[Consistency] Is there any inconsistency in the specification according to the formalism?
Example: If the formalism says that every data item in the specification must have a unique name, we can query the specification to see if there is any object that has an alias.
[Completeness] Do we miss any item in the specification with respect to the formalism?
Example: If there is a rule in the formalism that says "every data item should have a name and description", we can ask "is there any data item in the specification which has no name or no description?"
2. Analysis Technique Layer
[Consistency] Do we violate any consistency rules with respect to the analysis technique?
Example: If there is a rule in the analysis technique that says "A method can manipulate only the attributes in its object", we should ask "is there any method which manipulates attributes outside its object?"
[Completeness] Do we miss any item in the specification with respect to the analysis techniques?
Example: If the analysis technique says "every object should have at least one method", we can query the specification to see whether every object has at least one method.
3. Domain Layer
[Consistency] Do we violate any consistency rules with respect to the domain constraints?
Example: If there is a domain constraint that says "a doctor cannot be a nurse at the same time in a hospital", we can ask "is there any doctor who is also a nurse in the hospital?"
[Completeness] Do we miss any item in the specification with respect to the domain constraints?
Example: If there is a domain constraint that says "a general hospital should have at least one doctor from each specialty", we can query the specification: "does the hospital have a doctor in each specialty?"
4. Application Layer
[Consistency] Do we violate any consistency rules with respect to the application constraints?
Example: If there is an application constraint that says "all doctors must wear a white coat in the hospital", we can ask "does every doctor in the hospital wear a white coat?"
[Completeness] Do we miss any item in the specification with respect to the application constraints?
Example: If there is an application constraint that says "the hospital management system must contain the list of doctors from American Hospital", we can ask "is the name of every doctor included in the specification?"
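The domain- and application-layer examples above can be phrased directly as queries over the specification. A sketch using made-up hospital data (the names, roles, and specialty list are all hypothetical):

```python
# Toy specification data; the constraints follow the hospital examples above.
staff = [
    {"name": "Lee",  "roles": {"doctor"}, "specialty": "cardiology"},
    {"name": "Chan", "roles": {"nurse"},  "specialty": None},
    {"name": "Park", "roles": {"doctor"}, "specialty": "surgery"},
]
required_specialties = {"cardiology", "surgery", "pediatrics"}

# Domain consistency: is there any doctor who is also a nurse?
dual_role = [p["name"] for p in staff
             if {"doctor", "nurse"} <= p["roles"]]

# Domain completeness: does the hospital have a doctor in each specialty?
covered = {p["specialty"] for p in staff if "doctor" in p["roles"]}
missing_specialties = required_specialties - covered

print(dual_role, sorted(missing_specialties))
```

An empty `dual_role` list means the consistency rule holds; a non-empty `missing_specialties` set pinpoints exactly where the specification is incomplete.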
"All objects must have unique names"
Do two objects have the same names?
"Every mandatory field in the DD must be filled"
Has any mandatory field in the DD not been filled?
"Each object should have at least one method other than inherited methods."
Is there any object containing no method other than inherited ones?
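Each of the rule/query pairs above can be mechanized. A sketch against a toy specification (the object names and fields are invented for illustration):

```python
# A toy specification as data; each checking rule becomes a query over it.
spec = {
    "objects": [
        {"name": "Doctor", "description": "treats patients",
         "methods": ["diagnose"], "inherited_methods": ["id"]},
        {"name": "Nurse", "description": "",        # mandatory field empty
         "methods": [], "inherited_methods": ["id"]},
        {"name": "Doctor", "description": "duplicate entry",  # name clash
         "methods": ["schedule"], "inherited_methods": []},
    ]
}

# Rule: all objects must have unique names
names = [o["name"] for o in spec["objects"]]
duplicate_names = {n for n in names if names.count(n) > 1}

# Rule: every mandatory field (here: name, description) must be filled
missing_fields = [o["name"] for o in spec["objects"]
                  if not o["name"] or not o["description"]]

# Rule: each object needs at least one non-inherited method
no_own_method = [o["name"] for o in spec["objects"] if not o["methods"]]

print(duplicate_names, missing_fields, no_own_method)
```

Each query returns the offending objects rather than a bare yes/no, which is what a reviewer needs to fix the specification.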
In a business application, we have transactions, and transactions have events
Event object -- objects that trigger the start of a transaction
Control object -- object that control the execution of other objects.
Resource object -- objects that provide information to other objects. Some resource objects are inputs to the proposed system; some are intermediate products of the system
Product object -- objects that are the final products (forms or reports).
Assigning each object to one of these types allows formal analysis.
Resource objects and product objects provide service for other objects.
Event and control objects determine the execution sequence of transactions.
Rule 1: resource objects and product objects cannot send messages to control objects and event objects.
Rule 2: control objects have no attributes but resource and product objects must have attributes.
Rule 3: transactions triggered by event objects must produce some resource for the system
Rule 4: All attributes in resource objects and product objects must be used by at least one method and defined by at least one method.
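Because the rules are stated over object types, they can be checked mechanically. A sketch that checks Rules 1 and 2 over a hypothetical object model (the object names and message links are invented for illustration):

```python
# Hypothetical object model: each object has a type and a list of
# objects it sends messages to.
objects = {
    "OrderArrived": {"type": "event",    "attributes": [],        "sends_to": ["Dispatcher"]},
    "Dispatcher":   {"type": "control",  "attributes": [],        "sends_to": ["Stock"]},
    "Stock":        {"type": "resource", "attributes": ["level"], "sends_to": ["Invoice"]},
    "Invoice":      {"type": "product",  "attributes": ["total"], "sends_to": []},
}

violations = []
for name, obj in objects.items():
    # Rule 1: resource/product objects must not message control/event objects
    if obj["type"] in ("resource", "product"):
        for target in obj["sends_to"]:
            if objects[target]["type"] in ("control", "event"):
                violations.append((1, name, target))
    # Rule 2: control objects have no attributes;
    #         resource/product objects must have attributes
    if obj["type"] == "control" and obj["attributes"]:
        violations.append((2, name))
    if obj["type"] in ("resource", "product") and not obj["attributes"]:
        violations.append((2, name))

print(violations)  # empty list: this model satisfies Rules 1 and 2
```

Rules 3 and 4 can be checked the same way once transactions and method use/definition information are recorded in the model.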
Description            Number of Bugs    Percentage
Total Bugs Detected    143               100%