WSC 2006 Abstracts


Modeling Methodology A Track


Monday 10:30 AM – 12:00 PM
Conceptual Modeling

Chair: Stewart Robinson (Warwick Business School, UK)

Conceptual Modeling for Simulation: Issues and Research Requirements
Stewart Robinson (University of Warwick)

Abstract:
It is generally recognized that conceptual modeling is one of the most vital parts of a simulation study. At the same time, it also seems to be one of the least understood. A review of the extant literature on conceptual modeling reveals a range of issues that need to be addressed: the definition of conceptual model(ling), conceptual model requirements, how to develop a conceptual model, conceptual model representation and communication, conceptual model validation, and teaching conceptual modeling. It is clear that this is an area ripe for further research, for the clarification of ideas and the development of new approaches. Some areas in which further research could be carried out are identified.

Process Modelling Support for the Conceptual Modelling Phase of a Simulation Project
Cathal Heavey (University of Limerick) and John Ryan (Dublin Institute of Technology)

Abstract:
While many developments have taken place around supporting the model coding task of simulation, there are few tools available to assist in the conceptual modelling phase. Several authors have reported the advantages of using process modelling tools in the early phases of a simulation project. This paper provides an overview of process modelling tools in relation to their support for simulation, categorizing the tools into formal methods and descriptive methods. A conclusion from this review is that none of the tools available adequately supports the requirements gathering phase of simulation. This is not surprising, as none of the process modelling tools was developed for explicit support of simulation. The paper then presents results of research into developing a new process modelling method for simulation.

What Can Be Done to Automate Conceptual Simulation Modeling?
Ming Zhou (Indiana State University)

Abstract:
Conceptual modeling is a critical step that directly affects the quality and efficiency of simulation projects. However, current technology can hardly support the process, and most practice demonstrates an ad hoc and inefficient approach. Automation can help improve the efficiency and effectiveness of conceptual simulation modeling, but a number of issues must first be addressed, including the formalization of model concepts, the representation of modeling knowledge, and the interaction between user and computer system. This paper presents a discussion of these issues based on the author's research, and proposes suggestions for the design and development of a robust computerized modeling environment that aims to improve the conceptual simulation modeling process.

Monday 1:30 PM – 3:00 PM
DEVS Modeling

Chair: Adelinde Uhrmacher (University of Rostock, Germany)

A Simulation Algorithm for Dynamic Structure DEVS Modeling
Gabriel Wainer and Hui Shang (Carleton University)

Abstract:
Correctness and timeliness are critical for Real-Time Systems (RTS). Modeling and simulation techniques have been widely used for testing particular conditions on these systems. Recently, the DEVS formalism has been successfully used as a framework for RTS validation. Nevertheless, we need to address adaptation to dynamic changes in the environment. Dynamic Structure DEVS focuses on the ability to change system structure dynamically according to the system's real requirements, which is useful for RTS (in which it is sometimes impossible to interfere with the running system, so auto-adaptation is needed). We present a new algorithm derived from the DSDE and dynDEVS formalisms: we use the DSDE formal specifications and parts of the dynDEVS simulation algorithms.
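
As a loose illustration of dynamic-structure simulation (not the DSDE/dynDEVS algorithm of the paper), the Python sketch below runs a minimal event loop in which components can be added and removed between events; all names are hypothetical and the DEVS semantics are heavily simplified.

```python
import heapq

class Component:
    """A trivial 'atomic model': fires at fixed intervals."""
    def __init__(self, name, period):
        self.name, self.period = name, period

    def on_event(self, t, engine):
        print(f"t={t}: {self.name} fired")
        engine.schedule(t + self.period, self)

class Engine:
    """Minimal event loop allowing structure changes between events."""
    def __init__(self):
        self.queue, self.components, self.counter = [], set(), 0

    def add(self, comp, t):           # dynamic structure: add a component
        self.components.add(comp)
        self.schedule(t, comp)

    def remove(self, comp):           # dynamic structure: drop a component
        self.components.discard(comp)

    def schedule(self, t, comp):
        self.counter += 1             # counter breaks timestamp ties
        heapq.heappush(self.queue, (t, self.counter, comp))

    def run(self, t_end):
        while self.queue:
            t, _, comp = heapq.heappop(self.queue)
            if t > t_end:
                break
            if comp in self.components:   # skip events of removed components
                comp.on_event(t, self)

engine = Engine()
engine.add(Component("sensor", period=2.0), t=0.0)
engine.run(t_end=6.0)
```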

Applying DEVS Modeling for Discrete Event Multiple Model Control of a Time Varying Plant
Gabriel Wainer and Alexander Campbell (Carleton University)

Abstract:
In recent years, we have developed a Modeling and Simulation-Driven Engineering methodology for engineering embedded real-time systems. This approach relies on the use of the DEVS formalism for developing components of real-time embedded systems using incremental development. Here, we show how to apply these techniques to an application in hybrid control. The model defines a discrete-event controller for a time-varying plant based on multiple model control. Our discrete-event approach permitted us to define such an application, seamlessly integrating discrete-event and continuous components. The approach allows secure, reliable testing, analysis of different levels of abstraction in the system, and model reuse. The common problem of "controller wind-up" or "parameter estimation bursting" can be avoided with this proposed form of discrete-event adaptive control.

Introducing Variable Ports and Multi-Couplings for Cell Biological Modeling in DEVS
Adelinde M. Uhrmacher, Jan Himmelspach, Mathias Röhl, and Roland Ewald (University of Rostock)

Abstract:
Motivated by the requirements of molecular biological applications, we suggest an extension of the DEVS formalism. Starting with dynDEVS, a reflective variant of DEVS which supports dynamic behavior, composition, and interaction patterns, we develop rho-DEVS. Dynamic ports and multi-couplings are introduced, whose combination allows models to reflect significant state changes to the outside world while enabling or disabling certain interactions at the same time. An abstract simulator describes the operational semantics of the developed formalism, and the Tryptophan operon model illustrates the developed ideas and concepts.

Monday 3:30 PM – 5:00 PM
Simulation Methodologies

Chair: Ralf Mayer (The MITRE Corporation, USA)

A Case Study of the Development and Use of a MANA-Based Federation for the Study of U.S. Border Operations
Emmet Beeker and Ernest Page (The MITRE Corporation)

Abstract:
A federation approach is used to expand the geographic extent of MANA (Map Aware Non-uniform Automata), a cellular-automaton-based agent simulation, in order to support a study of investment strategies for border protection along a portion of the southern U.S. border. The federation is implemented using the Department of Defense (DoD) High Level Architecture (HLA). Federation performance is optimized using HLA Data Distribution Management (DDM) services and through a bypass of the normal HLA mechanisms for ownership transfer. Analysis of the running federation indicates that overhead due to federation processing is minimal: less than 6% of the total federation runtime (94% of the runtime is due to processing in the MANA simulations).

A Non-Fragmenting Partitioning Algorithm for Hierarchical Models
Roland Ewald, Jan Himmelspach, and Adelinde M. Uhrmacher (University of Rostock)

Abstract:
The simulation system James II is aimed at supporting a range of modeling formalisms and simulation engines. The partitioning of models is essential for distributed simulation, and a suitable partition depends on model, hardware, and simulation algorithm characteristics. Therefore, a partitioning layer has been created in James II which allows partitioning algorithms to be plugged in on demand. Three different partitioning algorithms have been implemented. In addition to the well-known Kernighan-Lin algorithm and a geometric approach, a partitioning algorithm for hierarchically structured models has been developed, and its performance is evaluated.
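
The paper's hierarchical partitioning algorithm itself is not reproduced here; as background, the Python sketch below (with a made-up model graph) shows the edge-cut cost that partitioners such as Kernighan-Lin aim to minimize.

```python
def edge_cut(edges, part):
    """Total weight of edges crossing the partition boundary.
    edges: {(u, v): weight}; part: {node: partition_id}."""
    return sum(w for (u, v), w in edges.items() if part[u] != part[v])

# A toy model graph: nodes are (sub)models, edge weights are coupling strength.
edges = {("A", "B"): 5, ("B", "C"): 1, ("C", "D"): 4, ("A", "D"): 1}
good = {"A": 0, "B": 0, "C": 1, "D": 1}   # cuts B-C and A-D: cost 2
bad = {"A": 0, "B": 1, "C": 0, "D": 1}    # cuts every edge: cost 11
print(edge_cut(edges, good), edge_cut(edges, bad))   # -> 2 11
```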

Enhancement of Memory Pools Toward a Multi-Threaded Implementation of the Joint Integrated Mission Model (JIMM)
David Wayne Mutschler (Naval Air Systems Command)

Abstract:
The Joint Integrated Mission Model (JIMM) is a legacy real-time discrete-event simulator. Its initial single-threaded implementation employed a memory pool to speed up run-time performance and to easily checkpoint simulation state. Unfortunately, when JIMM started migrating to a multi-threaded implementation, this legacy memory pool was quickly identified as a bottleneck. The problem is addressed by dividing the memory into large chunks managed by a global controller, with thread-specific memory managers handling lower-level memory allocation. This paper focuses on the legacy memory pool in JIMM and the enhancements necessary for an efficient multi-threaded implementation.
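
The Python sketch below illustrates the general chunking idea described above (a global controller handing out large chunks, with thread-local allocation on the fast path); the names are assumed and it is not JIMM's actual implementation.

```python
import threading

CHUNK = 64  # blocks handed out per request to the global controller

class GlobalPool:
    """Global controller: hands out large chunks under a single lock."""
    def __init__(self, total_blocks):
        self.free = list(range(total_blocks))
        self.lock = threading.Lock()

    def get_chunk(self):
        with self.lock:                    # contended only once per CHUNK
            grabbed, self.free = self.free[:CHUNK], self.free[CHUNK:]
            return grabbed                 # sketch assumes pool never empties

class ThreadPool(threading.local):
    """Thread-specific manager: lock-free fast path from a local free list."""
    def __init__(self, global_pool):
        self.global_pool, self.free = global_pool, []

    def alloc(self):
        if not self.free:                  # slow path: refill from controller
            self.free = self.global_pool.get_chunk()
        return self.free.pop()

    def release(self, block):
        self.free.append(block)            # returned to the local list

gp = GlobalPool(total_blocks=1024)
tp = ThreadPool(gp)
b = tp.alloc()
tp.release(b)
```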

Tuesday 8:30 AM – 10:00 AM
Panel: Simulation Project Life-Cycle

Chair: Robert Sargent (Syracuse University, USA)

The Simulation Project Life-Cycle: Models and Realities
Robert G. Sargent (Syracuse University), Richard E. Nance (ORCA Computer, Inc.), C. Michael Overstreet (Old Dominion University), Stewart Robinson (University of Warwick) and Jayne E. Talbot (Virtual Technology Corporation)

Abstract:
This panel session will discuss various issues regarding simulation life-cycle models. Simulation life-cycle models have received little attention, and it is hoped that this panel session will generate interest in this topic and some new ideas for these types of models.

Tuesday 10:30 AM – 12:00 PM
Metamodeling

Chair: Cathal Heavey (University of Limerick, Ireland)

Grid Enabled Sequential Design and Adaptive Metamodeling
Wouter Hendrickx, Dirk Gorissen, and Tom Dhaene (University of Antwerp)

Abstract:
Metamodeling is emerging as a valuable new tool in simulation: complex computer codes can be approximated by surrogate models (analytic, neural network, SVM, etc.) which can easily be evaluated on the fly. Adaptive modeling and sequential design further improve the performance of metamodeling frameworks. Grid computing is quickly replacing regular cluster computing when it comes to complex calculations, and several efforts use grid computing to facilitate the exploration of simulator outputs. This contribution combines adaptive modeling and sequential design with distributed, grid-based techniques into one metamodeling framework.
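
As a bare-bones illustration of sequential design (not the authors' grid-based framework), the Python sketch below grows a 1-D design around a toy simulator, using a simple space-filling criterion in place of the paper's adaptive error estimates.

```python
import numpy as np

def simulator(x):                      # stand-in for an expensive computer code
    return np.sin(3 * x) + 0.5 * x

X = list(np.linspace(0.0, 2.0, 4))     # initial coarse design
y = [simulator(x) for x in X]
candidates = np.linspace(0.0, 2.0, 201)

for _ in range(6):                     # sequential design iterations
    # Crude criterion: sample where we are farthest from existing points.
    dist = np.min(np.abs(candidates[:, None] - np.array(X)[None, :]), axis=1)
    x_new = float(candidates[np.argmax(dist)])
    X.append(x_new)
    y.append(simulator(x_new))

surrogate = np.poly1d(np.polyfit(X, y, deg=5))   # cheap polynomial surrogate
err = np.max(np.abs(surrogate(candidates) - simulator(candidates)))
print(f"{len(X)} runs, max surrogate error on grid: {err:.3f}")
```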

A New Metric for Measuring Metamodels Quality-of-Fit for Deterministic Simulations
Husam A. Hamad (Yarmouk University)

Abstract:
Metamodels are used to provide simpler prediction means than the complex simulation models they approximate. The accuracy of a metamodel is one fundamental criterion used as the basis for accepting or rejecting it. Average-based metrics such as the root-mean-square error (RMSE) and R-square are often used. Like all other average-based statistics, these measures are sensitive to sample size unless the number of test points is adequate. We introduce in this paper a new metric that can be used to measure metamodel quality of fit, called the metamodel acceptability score (MAS). The proposed metric gives a readily interpretable meaning to metamodel acceptability. Furthermore, initial studies show that MAS is less sensitive to test sample sizes than average-based validation measures.
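
The MAS formula is defined in the paper and not reproduced here. For contrast, the Python sketch below computes the average-based RMSE the abstract mentions alongside a hypothetical acceptability-style score (the fraction of test points predicted within a relative tolerance), purely for illustration.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Average-based fit metric discussed in the abstract."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def acceptability_score(y_true, y_pred, rel_tol=0.05):
    """Share of test points the metamodel predicts within rel_tol.
    Illustrative only; this is not the MAS formula from the paper."""
    ok = np.abs(y_pred - y_true) <= rel_tol * np.abs(y_true)
    return float(np.mean(ok))

y_true = np.array([10.0, 12.0, 9.5, 11.0])
y_pred = np.array([10.2, 11.8, 9.6, 13.0])   # one clearly poor prediction
print(rmse(y_true, y_pred), acceptability_score(y_true, y_pred))
```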

Meta-Level Control Architecture for Massively Multiagent Simulations
Shohei Yamane (Department of Social Informatics, Kyoto University)

Abstract:
In a massively multi-agent simulation, various situations emerge and simulation runs can become very long. This causes problems for system operators: each action scenario becomes too complex to maintain, and a simulation takes a very long time. Therefore, flexible control of the simulation, such as changing simulation speed and switching agents' action scenarios, is required. We propose a meta-scenario description language and a meta-level control architecture. The meta-scenario description language describes how to control simulations and agents based on an extended finite state machine. The meta-level control architecture achieves this control through a meta-scenario interpreter, which controls the interpreters of agents' action scenarios and the simulation environment. In addition, our proposed architecture preserves the scalability of massively multi-agent systems for some applications.
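
The meta-scenario language itself is specified in the paper; the Python sketch below merely illustrates the underlying idea of an extended finite state machine that switches simulation parameters, with hypothetical states, events, and actions.

```python
# Meta-level transitions: (state, observed event) -> (next state, action).
TRANSITIONS = {
    ("normal", "congested"): ("slow", lambda ctx: ctx.update(speed=0.25)),
    ("slow", "cleared"):     ("normal", lambda ctx: ctx.update(speed=1.0)),
}

def step(state, event, ctx):
    """Advance the meta-level machine on one event from the simulation."""
    next_state, action = TRANSITIONS.get((state, event), (state, None))
    if action is not None:
        action(ctx)                   # e.g. change the simulation speed
    return next_state

ctx = {"speed": 1.0}                  # shared simulation context
state = "normal"
for event in ["congested", "tick", "cleared"]:
    state = step(state, event, ctx)
    print(event, "->", state, ctx)
```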

Tuesday 1:30 PM – 3:00 PM
Formal Methods and Validation

Chair: Gabriel Wainer (Carleton University, Canada)

Analyzing Static and Temporal Properties of Simulation Models
Mamadou Kaba Traoré (LIMOS CNRS UMR 6158, Blaise Pascal University)

Abstract:
This paper shows how a simulation model can be specified so that its static and temporal properties can be formally analyzed. The approach adopted is based on the integration of Formal Methods (FMs) and the DEVS paradigm. FMs are known to allow symbolic manipulation and reasoning, while DEVS is a well-established Modeling and Simulation (M&S) framework. Combining them makes it possible to develop rigorous proofs of the properties of simulation models with regard to design and use requirements. This paper focuses on the so-called atomic specification. Static aspects of the model are captured with the Z formalism, while dynamic aspects are expressed in first-order logic. The specification is supported by the Z/EVES tool. A case study is exhibited.

A Neural Network Approach to the Validation of Simulation Models
Jurgen Martens, Karl Pauwels, and Ferdi Put (Catholic University of Leuven)

Abstract:
We tackle the problem of validating simulation models using neural networks. We propose a neural-network-based method that first learns key properties of the behaviour of alternative simulation models, and then classifies real system behaviour as coming from one of the models. We investigate the use of multi-layer perceptron and radial basis function networks, both of which are popular pattern classification techniques. Through a computational experiment, we show that our method successfully distinguishes valid from invalid models for a multiserver queueing system.
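
A minimal sketch of the classification idea, assuming scikit-learn and a toy exponential trace generator in place of the authors' queueing models: summary features of output traces train a multi-layer perceptron to say which candidate model the observed behaviour most resembles.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def waiting_times(service_rate, n=200):
    """Toy stand-in for a queueing model's output trace."""
    return rng.exponential(1.0 / service_rate, size=n)

def features(trace):
    # Summary statistics of system behaviour fed to the classifier.
    return [trace.mean(), trace.std(), np.percentile(trace, 90)]

# Label 0: candidate model A (rate 1.0); label 1: candidate model B (rate 1.5).
X = [features(waiting_times(r)) for r in [1.0] * 50 + [1.5] * 50]
y = [0] * 50 + [1] * 50

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)

# 'Real' observations: which candidate model do they look like?
real = features(waiting_times(1.5))
print("real system classified as model", "B" if clf.predict([real])[0] else "A")
```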

A Prescriptive Technique for V&V of Simulation Models When No Real-Life Data Are Available
Leonardo Chwif and Paulo Sérgio Muniz Silva (Unifieo) and Lúcio Mitio Shimada (PETROBRAS)

Abstract:
Verification and Validation (V&V) is a key process to guarantee that a model adequately represents a given system. Although no one can guarantee a 100% valid model, it is possible to increase model confidence through the use of V&V techniques. Many V&V techniques have a descriptive nature (they tell us what to do but not how to do it); there are also prescriptive techniques, which tell us how to do it, but in simulation practice they are underused. The main goal of this paper, building on the procedure of Kleijnen (1999), is to propose a prescriptive V&V technique that is simple enough for practical application and, because of its procedural nature, could easily be built into any simulation software, thus enabling the automation of the V&V process. The approach was also applied to some test problems, confirming its feasibility.

Tuesday 3:30 PM – 5:00 PM
Agent Based Simulation

Chair: Maria Hybinette (University of Georgia, USA)

Efficient Agent-Based Simulation Framework for Multi-Node Supercomputers
Toshihiro Takahashi and Hideyuki Mizuta (Tokyo Research Laboratory, IBM Research)

Abstract:
In recent years, the importance of large-scale Agent-Based Simulation (ABS) that can handle large complex systems has been increasing. We developed a large-scale ABS framework on Blue Gene, a multi-node supercomputer. An ABS processes the agents' communications, and when the number of transmissions among agents is large, the transmission costs seriously affect the performance of the simulation. It is possible to reduce the amount of transmission among nodes by clustering the agents which communicate heavily with each other. Treating each agent as a graph node and each data transmission between agents as a graph edge, this problem can be formulated as a maximum-flow/minimum-cut problem. In this paper we present an efficient algorithm to find an approximate solution. Our algorithm is reliable, simple, and needs little computation. We demonstrate its beneficial effects with some experiments.
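
The authors' approximation algorithm is not reproduced here; the Python sketch below shows a much cruder greedy heuristic for the same goal, namely placing heavily-communicating agents on the same node, using made-up communication counts.

```python
from collections import defaultdict

def greedy_cluster(comms, n_parts):
    """Greedily place each agent on the node where its already-placed
    neighbours talk to it most; ties go to the least-loaded node.
    comms: {(agent_a, agent_b): message_count}."""
    weight = defaultdict(lambda: defaultdict(int))
    for (a, b), w in comms.items():
        weight[a][b] += w
        weight[b][a] += w
    placement, load = {}, [0] * n_parts
    # Handle the chattiest agents first so they anchor their clusters.
    for agent in sorted(weight, key=lambda a: -sum(weight[a].values())):
        score = [0] * n_parts
        for nb, w in weight[agent].items():
            if nb in placement:
                score[placement[nb]] += w
        best = max(range(n_parts), key=lambda p: (score[p], -load[p]))
        placement[agent] = best
        load[best] += 1
    return placement

comms = {("a", "b"): 9, ("b", "c"): 8, ("a", "c"): 7,
         ("d", "e"): 9, ("e", "f"): 8, ("a", "d"): 1}
print(greedy_cluster(comms, n_parts=2))   # {a,b,c} vs {d,e,f}
```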

SASSY: A Design for a Scalable Agent-Based Simulation System Using a Distributed Discrete Event Infrastructure
Maria Hybinette, Eileen Kraemer, Yin Xiong, Glenn Matthews, and Jaim Ahmed (The University of Georgia)

Abstract:
The PDES literature offers a rich set of techniques for distributed and efficient simulation. However, there is a growing need for simulators that support agent-based applications, and PDES systems are not always well suited for these applications. Example agent-based applications include simulation of biological systems such as ants and bees, multi-robot systems, and battlefield simulations. The robotics research community has developed agent-based simulators that provide useful APIs for agent applications; however, such simulators have performance limitations and do not scale well. Our approach is to provide middleware between an agent-based API and a PDES simulation kernel. The result is a simulation system that offers the programmer an agent-based API on top of a high-performance PDES system. Here we describe the design and initial implementation of SASSY, the Scalable Agents Simulation System, and compare the design with related approaches.
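
A toy Python sketch of the middleware idea, with hypothetical names: an agent-level act-style API whose calls are translated into timestamped events on a kernel (here a plain sequential event list standing in for the real parallel PDES kernel).

```python
import heapq

class Kernel:
    """Stand-in for a PDES kernel: a timestamp-ordered event list."""
    def __init__(self):
        self.events, self.now, self.n = [], 0.0, 0

    def post(self, t, fn):
        self.n += 1                   # tie-breaker for equal timestamps
        heapq.heappush(self.events, (t, self.n, fn))

    def run(self):
        while self.events:
            self.now, _, fn = heapq.heappop(self.events)
            fn()

class AgentProxy:
    """Middleware: exposes a simple agent API, posts events underneath."""
    def __init__(self, kernel, name):
        self.kernel, self.name = kernel, name

    def act(self, duration, what):
        # The agent 'does' something; completion becomes a future event.
        self.kernel.post(self.kernel.now + duration,
                         lambda: print(f"t={self.kernel.now}: {self.name} {what}"))

k = Kernel()
ant = AgentProxy(k, "ant-1")
ant.act(1.5, "moves to food")
ant.act(3.0, "returns to nest")
k.run()
```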

Multi-Agent Learning Model with Bargaining
Haiyan Qiao, Jerzy Rozenblit, Ferenc Szidarovszky, and Lizhi Yang (University of Arizona)

Abstract:
Decision problems with the features of prisoner's dilemma are quite common. A general solution to this kind of social dilemma is that the agents cooperate to play a joint action. The Nash bargaining solution is an attractive approach to such cooperative games. In this paper, a multi-agent learning algorithm based on the Nash bargaining solution is presented. Different experiments are conducted on a testbed of stochastic games. The experimental results demonstrate that the algorithm converges to the policies of the Nash bargaining solution. Compared with the learning algorithms based on a non-cooperative equilibrium, this algorithm is fast and its complexity is linear with respect to the number of agents and number of iterations. In addition, it avoids the disturbing problem of equilibrium selection.
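
For reference, the Python sketch below computes the Nash bargaining solution for a small prisoner's-dilemma-style game by maximizing the Nash product over joint actions; the paper's learning algorithm, which converges to this solution, is not shown, and the payoffs are illustrative.

```python
# Payoffs for a prisoner's-dilemma-like game: payoff[(a1, a2)] = (u1, u2).
payoff = {
    ("C", "C"): (3, 3), ("C", "D"): (0, 5),
    ("D", "C"): (5, 0), ("D", "D"): (1, 1),
}
disagreement = (1, 1)   # fallback utilities (mutual defection here)

def nash_bargaining(payoff, d):
    """Pick the joint action maximizing (u1 - d1) * (u2 - d2),
    subject to both agents gaining over the disagreement point."""
    best, best_val = None, float("-inf")
    for joint, (u1, u2) in payoff.items():
        if u1 >= d[0] and u2 >= d[1]:
            val = (u1 - d[0]) * (u2 - d[1])
            if val > best_val:
                best, best_val = joint, val
    return best

print(nash_bargaining(payoff, disagreement))   # ('C', 'C'): cooperation wins
```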

Wednesday 8:30 AM – 10:00 AM
Modeling Methodologies for Manufacturing and Business

Chair: Mike Pidd (Lancaster University, UK)

Assessment of the NIST Shop Data Model as a Neutral File Format
Greg Harward (ProModel Corporation) and Charles Harrell (Brigham Young University)

Abstract:
This paper evaluates the shop data model (SDM) being developed by the National Institute of Standards and Technology (NIST) in terms of its viability as a neutral file format (NFF) for the discrete-event simulation (DES) of manufacturing systems. ProModel simulation software served as the test case for this evaluation. Observations are also provided regarding the challenges that simulation vendors might encounter when implementing the proposed NIST SDM. This paper shows that the NIST SDM does not pose any limitations which would prevent it from syntactically representing a manufacturing simulation model; however, it is not without certain challenges and difficulties. While only 28% of the ProModel data elements are currently supported by the SDM, future enhancements should allow the information model to serve as a foundation upon which a common information model and NFF for the DES industry could be built.

Modelling and Simulation of Human Decision-Making in Manufacturing Systems
Gert Zülch (University of Karlsruhe - ifab-Institute of Human and Industrial Engineering)

Abstract:
The simulation of manufacturing processes mainly focuses on the structure of machinery resources and the flow of material, but the inclusion of personnel in the simulation model is only slowly gaining in importance. When personnel resources are modelled, merely the operative tasks are represented. However, as a result of modern manufacturing concepts, worker decisions at the workshop level are becoming more and more important. This article deals with various concepts for the modelling of human decisions in manufacturing systems, ranging from human decision makers as passive resources, through the modelling of decisions based on global rules, to the modelling of active decision makers with individual, locally valid decision-making rules. Each of these types of modelling is elucidated using an application example.

A Dynamic Business Model for Component-Based Simulation Software
Stephan Onggo, Didier Soopramanien, and Mike Pidd (Lancaster University Management School)

Abstract:
Firms, investors, venture capitalists, market analysts, and the government, amongst others, are interested in the future evolution and dynamics of a market, as it defines their current or future role and participation. This paper proposes a business model showing how the interactions of various actors in the market influence the "demand" and "supply" for application-based software, more specifically component-based simulation. In the process we also show how the main stakeholders may gain financial benefits by adopting component-based simulation for business decisions in the long run. We identify four main stakeholders: component users, component providers, certification providers, and repository providers. A system dynamics model is built to show the interaction between the two main stakeholders.

Wednesday 10:30 AM – 12:00 PM
Modeling Methodologies for Specific Applications

Chair: Navonil Mustafee (Brunel University, UK)

A Data-Integrated Nurse Activity Simulation Model
Durai Sundaramoorthi, Victoria C. P. Chen, Seoung B. Kim, Jay M. Rosenberger, and Deborah F. Buckley-Behan (The University of Texas at Arlington)

Abstract:
This research develops a data-integrated approach for constructing simulation models based on a real data set provided by Baylor Regional Medical Center (Baylor) in Grapevine, Texas. Tree-based models and kernel density estimation were utilized to extract important knowledge from the data for the simulation. A Classification and Regression Tree (CART) model, a data mining tool for prediction and classification, was used to develop two tree structures: a) a regression tree, from which the amount of time a nurse spends in a location is predicted based on factors such as the primary diagnosis of a patient and the type of nurse; and b) a classification tree, from which transition probabilities for nurse movements are determined. Kernel density estimation is used to estimate the continuous distribution of the amount of time a nurse spends in a location. Merits of using our approach for Baylor's nurse activity simulation are discussed.
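
A minimal sketch of the modeling pipeline, assuming scikit-learn decision trees standing in for CART and SciPy for the kernel density estimate, on synthetic stand-in data (the Baylor data set is not reproduced): a regression tree for time-in-location, a classification tree for transition probabilities, and a KDE for sampling durations.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

rng = np.random.default_rng(1)

# Synthetic stand-in: features are [diagnosis_code, nurse_type].
X = rng.integers(0, 3, size=(300, 2))
minutes = 10 + 5 * X[:, 0] + 3 * X[:, 1] + rng.normal(0, 2, 300)

reg = DecisionTreeRegressor(max_depth=3).fit(X, minutes)   # regression tree
print("predicted minutes:", reg.predict([[2, 1]])[0])

# Classification tree for the next location a nurse moves to (toy labels).
next_loc = rng.integers(0, 4, size=300)
clf = DecisionTreeClassifier(max_depth=3).fit(X, next_loc)
print("transition probabilities:", clf.predict_proba([[2, 1]])[0])

# Kernel density estimate of time-in-location, for sampling in the simulation.
kde = gaussian_kde(minutes)
print("sampled service times:", kde.resample(3, seed=2).ravel())
```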

Using Anecdotal Information to Model the Availability of an Existing Dynamometer System
Valerie G. Caryer Cook (DaimlerChrysler / Lawrence Technological University)

Abstract:
Low-volume, custom-built, or specialty equipment, by nature, has little statistically significant data from which to predict system availability over the equipment life. Such unique constructions are often costly to purchase and install, and equally costly to maintain. This paper presents a practical method for estimating the availability of custom-built equipment, using a custom 4WD NVH dynamometer system as an example. The proposed method models the availability of an existing custom-built system using anecdotal component information based on interviews with field service personnel. The interview data are used to create estimated probability density functions for the major components of the system. The component probability density functions are assembled into a system model based on a derived system reliability function. This technique provides a low-cost, quick model of system availability over time which can be used to assess the risk and cost effectiveness of system maintenance strategies.
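
A small Python sketch of the general approach, with entirely hypothetical interview estimates: anecdotal low/most-likely/high values become triangular distributions, and Monte Carlo sampling through a series-system structure yields an availability distribution.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo replications

# Elicited "low / most likely / high" estimates from service interviews
# (hypothetical numbers): hours between failures and hours to repair.
components = {
    "roller":     {"mtbf": (400, 800, 1500),  "mttr": (2, 6, 24)},
    "drive":      {"mtbf": (900, 2000, 4000), "mttr": (4, 12, 48)},
    "controller": {"mtbf": (1500, 3000, 8000), "mttr": (1, 4, 16)},
}

def availability(mtbf_tri, mttr_tri):
    """Steady-state availability = uptime / (uptime + downtime)."""
    up = rng.triangular(*mtbf_tri, size=N)
    down = rng.triangular(*mttr_tri, size=N)
    return up / (up + down)

# Series system: the dynamometer is up only if every component is up.
sys_avail = np.ones(N)
for c in components.values():
    sys_avail *= availability(c["mtbf"], c["mttr"])

print(f"mean availability: {sys_avail.mean():.4f}, "
      f"5th percentile: {np.percentile(sys_avail, 5):.4f}")
```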

Computational Investigation of Quasirandom Sequences in Generating Test Cases for Specification-Based Tests
Hongmei Chi (Florida A&M University)

Abstract:
This paper presents work on the generation of specification-driven test cases based on quasirandom (low-discrepancy) sequences instead of pseudorandom numbers, an approach that is novel in software testing. The enhanced uniformity of quasirandom sequences leads to faster generation of test cases covering all possibilities. We demonstrate by examples that quasirandom sequences can be a viable alternative to pseudorandom numbers in generating test cases, and we present a method that generates test cases from a decision table specification more effectively via quasirandom numbers. Analysis of a simple problem shows that quasirandom sequences achieve better coverage than pseudorandom numbers and have the potential to converge faster, reducing the computational burden. The use of different quasirandom sequences for generating test cases is also presented.
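
A minimal sketch of the idea, assuming a made-up decision table: successive elements of van der Corput/Halton sequences (one prime base per condition) are mapped onto condition levels to enumerate well-spread test cases.

```python
def halton(index, base):
    """The index-th element of the van der Corput sequence in a given base."""
    x, f = 0.0, 1.0 / base
    while index > 0:
        x += f * (index % base)
        index //= base
        f /= base
    return x

# Decision-table conditions (hypothetical): map a [0,1) value to a level.
conditions = [("account_type", ["basic", "gold"]),
              ("balance", ["low", "mid", "high"]),
              ("overdraft", ["no", "yes"])]
bases = [2, 3, 5]   # one prime base per condition dimension

for i in range(1, 7):   # first six quasirandom test cases
    case = {name: levels[int(halton(i, b) * len(levels))]
            for (name, levels), b in zip(conditions, bases)}
    print(case)
```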
