VerificationValidationAndTesting

From Nsnam
Revision as of 19:23, 2 April 2009 by Craigdo


Verification, Validation and Testing

There is often much confusion regarding the meaning of the words Verification, Validation and Testing, and of other associated terminology. It is worthwhile to spend a little time establishing exactly what we mean when we use them.

A computer model is a mathematical or logical representation of something. It can represent a vehicle, a frog or a networking card. Models can also represent processes such as global warming, freeway traffic flow or a specification of a networking protocol. A model can be a completely faithful representation of a logical process specification, but it can never completely reproduce a physical object or process. In most cases, a number of simplifications are made to the model to keep simulation computationally tractable.

Every model has a target system that it is attempting to simulate. The first step in creating a simulation model is to identify this target system and the level of detail and accuracy that the simulation is desired to reproduce. In the case of a logical process, the target system may be identified as TCP as defined by RFC 793. In this case, it will probably be desirable to create a model that completely and faithfully reproduces RFC 793. In the case of a physical process this will not be possible. If, for example, you would like to simulate a wireless networking card, you may come up with a statement such as, "an accurate MAC-level implementation of the 802.11 specification and [...] a not-so-slow PHY-level model of the 802.11a specification."

Once this is done, one can develop an abstract model of the target system. This is typically an exercise in managing the tradeoffs between complexity, resource requirements and accuracy. The process of developing an abstract model has been called model qualification in the literature. In the case of the TCP protocol, this process results in a design for a collection of objects that will fully implement RFC 793 in ns-3. In the case of the wireless card, this process results in a number of tradeoffs to allow the physical layer to be simulated and in the design of a network device and channel for ns-3.

This abstract model is then developed into an ns-3 model that implements the abstract model as a computer program. The process of getting the implementation to agree with the abstract model is called model verification in the literature.

The process so far is open loop. What remains is to determine that a given ns-3 model has some connection to some reality -- that the model is an accurate representation of a real system, whether a logical process or a physical entity. If you are going to use a simulation model to try to predict how some real system will behave, you must have some reason to believe your results -- i.e., can you trust that an inference made from the model translates into a correct prediction for the real system? The process of getting the ns-3 model behavior to agree with the desired target system behavior, as defined by the model qualification process, is called model validation in the literature.

Generally, the process is described as a closed loop:

 target-system <---------------> abstract-model <--------------> ns-3 model
       ^         qualification                    verification      ^
       |                                                            |
       +------------------------------------------------------------+
                               validation

Note that we have not used the term software testing at all in this discussion. The process of qualification, verification and validation is really a research and development activity.



Craigdo 19:03, 2 April 2009 (UTC)