ABSTRACT
Sufficient experience has been gained over the last decade in simulation validation, verification, and testing (VV&T) to establish basic principles about its characteristics. This paper presents 15 principles of simulation VV&T. These principles help researchers, practitioners, and managers better understand what simulation model VV&T entails, and they provide the underpinnings for the VV&T techniques that can be used throughout the life cycle of a simulation study. The paper also surveys current software VV&T techniques and current simulation VV&T techniques. Understanding and applying these principles, and employing proper testing techniques throughout the life cycle of a simulation study, are key factors in increasing the probability of success of the study.
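As a concrete illustration of the kind of simulation VV&T technique the paper surveys, one common statistical check of operational validity compares model output with observed data from the real system. The sketch below is not taken from the paper; it is a minimal, hypothetical example (invented sample values, Welch two-sample t-test from SciPy) of how such a comparison might be scripted.

```python
# Illustrative sketch of a statistical operational-validity check:
# compare simulation output against observed system data.
# All sample values below are hypothetical placeholders.
from scipy import stats

# Hypothetical mean waiting times (minutes) from replications of the model
model_output = [4.2, 3.9, 4.5, 4.1, 4.4, 3.8, 4.3, 4.0]

# Hypothetical corresponding observations from the real system
system_data = [4.0, 4.3, 3.7, 4.6, 4.1, 3.9, 4.2, 4.4]

# Welch two-sample t-test: does the model's mean differ from the system's?
t_stat, p_value = stats.ttest_ind(model_output, system_data, equal_var=False)

alpha = 0.05
if p_value < alpha:
    print(f"Reject H0 (p={p_value:.3f}): model and system means differ; "
          "investigate the model before accepting its results.")
else:
    print(f"Fail to reject H0 (p={p_value:.3f}): no evidence of a mean "
          "difference at the 5% level (this alone does not establish validity).")
```

Note that failing to reject the null hypothesis does not by itself validate the model; such a test is one technique among many to be applied across the life cycle.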