WSC 2009 Final Abstracts
Applications - General Applications Track
Monday 10:30 AM - 12:00 PM
Simulation in Latin America
Chair: Jairo Montoya-Torres (Universidad de La Sabana)
Using Randomization to Solve the Deterministic Single and Multiple Vehicle Routing Problem with Service Time Constraints
Jairo R. Montoya-Torres, Edgar H. Alfonso Lizarazo, Edgar Gutierrez-Franco, and Ana X. Halabi (Universidad de La Sabana)
This paper considers the deterministic vehicle routing problem with service time requirements for delivery. Service requests become available at different times, all of which are known at the initial time of route planning. This paper presents an approach based on the randomized generation of service sequences (routes). Both the single-vehicle and the multiple-vehicle cases are studied. Our approach is validated using randomly generated data and compared against the optimal solution obtained by mathematical programming for small instances, as well as against known lower bounds for medium and large instances. Results show that our approach is competitive in terms of objective-function value and requires less computational time than the exact solution procedure.
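The abstract gives no implementation details, but the randomized construction it describes can be sketched roughly as follows for the single-vehicle case: repeatedly sample a random visiting order, score it against the request release times, and keep the best sequence found. All names and the instance format below are illustrative assumptions, not the authors' code.

```python
import random

def route_cost(route, release, travel):
    """Completion time of a route where request i cannot be served
    before release[i]; travel[a][b] is the travel time from a to b."""
    t, pos = 0.0, 0  # start at the depot (index 0)
    for i in route:
        t = max(t + travel[pos][i], release[i])
        pos = i
    return t

def randomized_routing(n, release, travel, iters=1000, seed=42):
    """Sample random service sequences and keep the best one found."""
    rng = random.Random(seed)
    best_route, best_cost = None, float("inf")
    requests = list(range(1, n + 1))
    for _ in range(iters):
        rng.shuffle(requests)
        c = route_cost(requests, release, travel)
        if c < best_cost:
            best_route, best_cost = list(requests), c
    return best_route, best_cost
```

For small instances the best sampled sequence can be compared against the optimum of an exact mathematical-programming model, as the paper does for validation.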
Bayesian Models and Stochastic Processes Applied to CSP Sampling Plans for Quality Control in Production in Series and by Lots
Rodrigo A. Barbosa Correa, Carlos D. Paternina Arboleda, and Diana G. Ramírez Ríos (Universidad del Norte)
Many businesses treat their inspection methods as if they were perfect: once a department for statistical process analysis and control is in place, whatever the inspector or the inspection tools decide is taken to be correct, with no allowance for error. Yet Heisenberg's uncertainty principle reminds us that the uncertainty associated with observation contradicts neither the existence of laws governing the behavior of particles in the universe nor the capacity of scientists to discover those laws; precise predictions are simply replaced by calculations of probabilities. This investigation studies CSP acceptance-sampling plans with Bayesian and Markovian revisions, for production processes in series and by lots, that support quality activities and reduce inspection costs.
Design and Development of a Simulator for the Brazilian Data Collecting System Based on Satellites
Germano Kienbaum, Felipe Miranda, Joaquim Pedro Barreto, Marcus Vinicius Cisotto, and Wanderson Gomes Almeida (National Space Research Institute) and Helcio Vieira Junior (Aerospace Technical Center)
This work describes an ongoing project aimed at a complete simulation study of the Brazilian Satellite-Based Data Collecting System (SBCD). The SBCD comprises a space segment and a ground segment and was built for the automatic acquisition of meteorological and hydrological data for use in environmental and business applications. The main objective of the project is to build a simulator for a thorough analysis of the system, in order to assess its performance, determine its best configuration and boundary conditions, and optimize its operational cost and efficiency across different configurations and operating scenarios.
Monday 1:30 PM - 3:00 PM
Product Service Systems
Chair: Gregory Zacharewicz (University of Bordeaux)
Services Modeling and Distributed Simulation DEVS / HLA Supported
Gregory Zacharewicz, Thecle Alix, and Bruno Vallespir (Laboratoire IMS-LAPS UMR CNRS 5218 Groupe Productique (GRAI))
Nowadays, service design and development is more than ever a key challenge for enterprises. Nevertheless, the setup of new service activities is not fully formalized or guided by a clear methodology or common standard. This can lead to flawed service definitions that penalize the enterprise, particularly when the flaws surface late. The idea introduced in this paper is to model and simulate services before deploying them, in order to validate desired properties and anticipate wrong behavior. This proposal raises a problem: not every part of the service (actors, software, machines) can be included in the model; real actors need to act in the loop. In addition, the simulation must interoperate and synchronize with heterogeneous and distributed actors of the service. The goal of the paper is to present a Service Modeling Environment based on a graphical service modeling language and on distributed DEVS simulation.
Enabling Value Co-production in the Provision of Support Service Engineering Solutions Using Digital Manufacturing Methods
Joseph Butterfield (Queen’s University Belfast), Irene C.L. Ng (University of Exeter Business School), Rajkumar Roy (Cranfield University) and William McEwan (Queen’s University Belfast)
Traditional engineering business models in aerospace manufacture and deliver finished equipment to the customer, with service provision typically limited to procedural documentation and spare parts for the end user. Commercial pressures have led end users to restructure their core business activities, so original equipment manufacturers (OEMs) now need to integrate and manage support service activities in partnership with the customer to deliver equipment availability. This improves the OEM's probability of commercial success through shared operational risk while reducing the customer's cost of ownership. This paper applies four of the seven attributes of value co-creation (AVCs) developed by Ng et al. (2009) that are required for an integrated, partnered approach to service provision between the OEM and the customer. It also shows how these attributes are supported by applying digital manufacturing methods to the design and implementation of complex service processes.
Perspectives on High-Tech Product Design for Better Supporting Product-Service Systems
Stelian Brad (Technical University of Cluj-Napoca)
Nowadays, product-service systems are seen as enhanced, optimally integrated solutions of products and services whose purpose is to improve business sustainability, customer satisfaction, environmental impact, and the production-consumption balance. The ability to integrate the various value-added vectors characterizing the product-service-system concept within high-tech product design is still a complex task, requiring systematic and comprehensive innovation. This means high-tech product design has to be seen from a broader perspective, as a process emerging from an integrated model of product innovation, marketing innovation, production-process innovation, and business-model innovation over their life-cycles. On this basis, the present paper reveals key strategies required in new high-tech product design to better support product-service systems. A model that establishes priorities for integrating these strategies into the design framework of high-tech products is also proposed.
Monday 3:30 PM - 5:00 PM
Production and Innovation
Chair: Benny Tjahjono (Cranfield University)
A Web Service Based Artificial Market
Karla Atkins, Christopher L. Barrett, and Achla Marathe (Virginia Tech)
We describe a web-services-based computational tool for studying large commodity markets. The computational model has several distinguishing features: (i) the ability to generate individualistic, demographics-based, time-varying demand profiles, (ii) a highly configurable system that supports different market clearing mechanisms, strategies and matching algorithms for buyers and sellers, (iii) the ability to aggregate individuals into different hierarchies of classes, and (iv) the ability to physically clear flow-based commodities. From the software standpoint, the architecture has several unique features, including an easy-to-use web-based graphical user interface for specifying input parameters and viewing results, and a workflow language for modeling various markets and market mechanisms. The system gives users the unique ability to experiment with a variety of markets, such as markets for communication spectrum, internet bandwidth, and electricity, as well as traditional commodities like corn and cotton.
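As one concrete illustration of a market clearing mechanism of the kind such a system supports, a uniform-price call auction can be sketched as follows. The function and data layout are assumptions for illustration, not part of the described tool.

```python
def clear_market(bids, asks):
    """Uniform-price call auction: match highest bids with lowest asks
    while bid >= ask; the clearing price is the midpoint of the last
    matched pair. bids and asks are lists of (price, quantity)."""
    bids = sorted(bids, reverse=True)   # highest bid first
    asks = sorted(asks)                 # lowest ask first
    traded, price = 0, None
    i = j = 0
    while i < len(bids) and j < len(asks) and bids[i][0] >= asks[j][0]:
        q = min(bids[i][1], asks[j][1])
        traded += q
        price = (bids[i][0] + asks[j][0]) / 2
        bids[i] = (bids[i][0], bids[i][1] - q)
        asks[j] = (asks[j][0], asks[j][1] - q)
        if bids[i][1] == 0:
            i += 1
        if asks[j][1] == 0:
            j += 1
    return price, traded
```

A real market simulator would layer time-varying demand profiles and agent strategies on top of a clearing kernel like this one.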
Operational Simulation Model of the Raw Material Handling in an Integrated Steel Making Plant
Robson Jacinto Coelho, Paula Fernandes de Lana, Adriano César Silva, and Takeo Fugiwara Santos (ArcelorMittal Tubarão), Marcelo Moretti Fioroni, Luiz Augusto G. Franzese, and Daniel De Oliveira Mota (Paragon Tecnologia) and Luiz Bueno Da Silva (Universidade Federal da Paraíba)
This article focuses on the design and implementation of an operational simulation model (OSM) of raw material handling in an integrated steel making plant, considering the operations of receiving, unloading, stocking, handling and supplying the different raw materials used in the production process, from an operational perspective. The aim is to support the decision making of the team controlling the ore inventory. Based on the methodological structure developed by Coelho (2008), this OSM showed that most of the concepts valid for simulations with a strategic focus are not relevant from an operational point of view. The advantage of the OSM is that it is deterministic rather than random or stochastic, while exhibiting behavior considered satisfactory by management and the steelyard team alike.
Assembly Line Design Principles Using Six Sigma and Simulation
Benny Tjahjono, Peter Ball, John Ladbrook, and John Kay (Cranfield University)
Many variables and constraints must be taken into account when designing a manufacturing facility such as an assembly line, a task that often depends on the common practices and experience of the manufacturing engineers. Six Sigma has shown its benefits particularly in process improvement and product development. This paper explores the applicability of Six Sigma and simulation techniques to derive a set of principles that manufacturing engineers can use to design assembly lines. The idea is to use simulation models as a basis for experimentation on parameters that are critical to the productivity of the lines. The sensitivity of these parameters was analyzed, and the results of the experiments were then collated into a set of design principles that can be used as part of the facility design process.
Tuesday 8:30 AM - 10:00 AM
Chair: Mary Kathryn Thompson (KAIST)
Large Scale Knowledge-Based Simulation Models: An Approximation to the North-South Remittances Model
Carlos Ramón García-Alonso, Esther Arenas-Arroyo, and Gabriel María Pérez-Alcalá (ETEA, Business Administration Faculty, University of Córdoba)
This paper studies the evolution over time of the covariates on which remittances depend. Not only economic variables but also demographic, social, and political ones are taken into account in a Monte Carlo simulation model. Expert knowledge was incorporated by modeling fuzzy dependence relationships (DR) between covariates, based on standard macroeconomic models. An improved procedure to make fuzzy rules explicit and evaluate them automatically was designed and tested in a multilevel fuzzy inference engine. Primary covariates (the inputs to a dependence relationship) were defined by standard statistical distributions (uniform). The multilevel fuzzy inference engine evaluated DR outputs following a hierarchical structure once the input values were known. Using this methodology, a North-South remittances model was designed and evaluated. Results showed that the intermediate DR outputs, as well as the remittances themselves, matched expert-based expectations reasonably well.
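A minimal sketch of the kind of fuzzy dependence-relationship evaluation described above, assuming triangular membership functions, min for rule conjunction, and a weighted-average defuzzification. The rule shapes and variables are invented for illustration; the paper's actual rules come from macroeconomic models.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to a peak at b,
    then falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer_remittances(growth, unemployment):
    """Two illustrative Mamdani-style rules combined by a weighted
    average of rule outputs (a simple defuzzification)."""
    # Rule 1: IF growth is LOW AND unemployment is HIGH
    #         THEN remittances are HIGH (output level 1.0)
    w1 = min(tri(growth, -2, 0, 2), tri(unemployment, 5, 15, 25))
    # Rule 2: IF growth is HIGH AND unemployment is LOW
    #         THEN remittances are LOW (output level 0.2)
    w2 = min(tri(growth, 2, 6, 10), tri(unemployment, 0, 2, 6))
    if w1 + w2 == 0:
        return None  # no rule fires for these inputs
    return (w1 * 1.0 + w2 * 0.2) / (w1 + w2)
```

In a multilevel engine such rule blocks would be chained hierarchically: the defuzzified output of one dependence relationship becomes an input to the next.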
Computer Simulations of Innovation Implementation Strategies
Peter Hovmand (Washington University in St. Louis) and David N Ford (Texas A&M University)
Many interventions that are effective in one setting may be ineffective or even harmful in other settings. This poses a problem for organizations and communities planning the implementation of new programs, policies, and practices. This paper introduces the use of system dynamics computer simulation of real options to design implementation strategies in complex social systems. The approach is illustrated with an example of domestic violence community interventions involving the implementation of coordination and victim advocacy to reduce the unintended consequence of victim arrests from a mandatory arrest policy. Results show that there are potential benefits to using a real options approach.
Simulation Thinking: Where Design and Analysis Meet
Mary Kathryn Thompson (KAIST)
Design and analysis have historically been viewed as opposites. However, design and analysis are closely related activities which involve similar types of thinking. This work examines the similarities and differences in cognition and education for design and numerical simulation. The educational pedagogy and educational outcomes of a first year design program and a graduate level finite element analysis course are discussed, and a list of characteristics and cognitive abilities of a good simulator - dubbed "simulation thinking" - is presented.
Tuesday 10:30 AM - 12:00 PM
Chair: Soonmin Ko (Columbia University)
A Simulation Model to Analyze the Impact of Distance and Direction on Golf Scores
Mark Broadie and Soonmin Ko (Columbia University)
We develop a simulation model of the game of golf. The model accounts for realistic features of a golf course, including rough, sand, water, and trees, and includes many facets of golfer skill. The model is calibrated to extensive data for amateur and professional golfers. Using the calibrated simulation model we quantify the effect of increased tee shot distance and improved tee shot accuracy on golfer scores. Contrary to previous claims, we find that for long tee shots, directional accuracy has a greater impact on scores than distance.
A Dynamic Data Driven Application System for Wildfire Spread Simulation
Xuefeng Yan (Nanjing University of Aeronautics and Astronautics) and Feng Gu and Xiaolin Hu (Georgia State University)
Wildfire spread simulation plays an important role in wildfire management. Existing wildfire simulations are largely decoupled from real wildfires, making little use of real-time data. In this paper, a dynamic data driven application system is presented that incorporates real-time data into the simulation model to improve simulation results. The system is based on the DEVS-FIRE model and employs a particle filtering algorithm to estimate the state of the fire spread. We describe the overall structure of the dynamic data driven application system for wildfire spread simulation, present the major issues and computation models of this work, and provide experimental results.
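The paper's particle filter operates on the DEVS-FIRE fire-spread state; as a self-contained illustration of the same estimation idea, a bootstrap particle filter for a one-dimensional random-walk state observed in noise can be sketched as follows. The model and all parameters are assumptions for illustration, not the DEVS-FIRE state model.

```python
import random, math

def particle_filter(observations, n_particles=500,
                    proc_std=1.0, obs_std=1.0, seed=1):
    """Bootstrap particle filter for a 1-D random-walk state observed
    in Gaussian noise. Returns the filtered estimate at each step."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # Predict: propagate each particle through the motion model.
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        # Weight: likelihood of the observation under each particle.
        weights = [math.exp(-0.5 * ((y - p) / obs_std) ** 2)
                   for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate: the weighted mean of the particle cloud.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample: draw a new population proportionally to the weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

In the data-driven setting, the "observations" would be real-time sensor readings of the fire front, and each particle would carry a full fire-spread state.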
A Dynamic Architecture for Increased Passenger Queue Model Fidelity
Michael Johnstone, Vu Le, Saeid Nahavandi, and Doug Creighton (Deakin University)
This study presents a dynamic queue controller that generates realistic queue formation and behaviour within a discrete event environment, together with a new data set defining passenger walking speeds. The new controller provides a detailed visual reference of queue behaviour and reports important metrics such as queue size. The controller, combined with the walking speed data, is validated against CCTV footage of airport passenger screening points, and the simulation outputs are compared to results obtained from queueing theory. The simulation approach provides superior results to the averaged results of queueing theory and more useful insight into the behaviour of the system.
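The comparison against queueing theory mentioned above can be illustrated in miniature: simulate a single-server queue with Poisson arrivals and exponential service, and check the average wait in queue against the analytical M/M/1 result Wq = rho / (mu - lambda). This is a simplified stand-in, not the airport screening model itself.

```python
import random

def mm1_sim(lam, mu, n_customers=500000, seed=7):
    """Simulate an M/M/1 queue and return the average wait in queue."""
    rng = random.Random(seed)
    arrival = depart = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(lam)     # next Poisson arrival
        start = max(arrival, depart)        # wait for the server to free
        total_wait += start - arrival
        depart = start + rng.expovariate(mu)
    return total_wait / n_customers

def mm1_theory(lam, mu):
    """Analytical mean wait in queue Wq = rho / (mu - lam), rho = lam/mu."""
    rho = lam / mu
    return rho / (mu - lam)
```

A discrete-event model with a dynamic queue controller departs from this idealization precisely where averaged queueing-theory results stop being informative, e.g. time-varying arrivals and passenger walking behaviour.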
Tuesday 1:30 PM - 3:00 PM
Chair: Muer Yang (University of Cincinnati)
Are All Voting Queues Created Equal?
Muer Yang, Michael J. Fry, and W. David Kelton (University of Cincinnati)
Providing equitable voting experiences across voting precincts has been noted as an important goal in elections. We seek to provide equity to all voters so that no one group of voters is disadvantaged or disenfranchised. This paper uses the average absolute difference of waiting times across all precincts as a performance metric for equity. A simulation-based greedy improvement algorithm is proposed to generate voting-machine allocations. We examined our allocation solution using a factorial experimental design and conclude that our heuristic outperforms the utilization-equalization method, which was used by at least one county in the 2008 presidential election.
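The equity metric named in the abstract, the average absolute difference of waiting times across precincts, can be computed directly. This sketch assumes each precinct is summarized by a single average wait; in the paper the waits come from queue simulation.

```python
from itertools import combinations

def equity_metric(avg_waits):
    """Mean absolute difference of average waiting times over all
    precinct pairs; 0 means perfectly equitable."""
    pairs = list(combinations(avg_waits, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)
```

A greedy improvement step would then move one machine between precincts whenever the move lowers this metric, re-simulating waits after each reallocation.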
Sequential Metamodelling with Genetic Programming and Particle Swarms
Birkan Can and Cathal Heavey (Enterprise Research Centre, University of Limerick)
This article presents an application of two main evolutionary-algorithm methodologies to simulation-based metamodelling. We present an evolutionary framework for constructing analytical metamodels and apply it to simulations of manufacturing lines with a buffer allocation problem. In this framework, a particle swarm algorithm is integrated with genetic programming to perform symbolic regression of the problem. The sampling data is generated sequentially by the particle swarm algorithm, while genetic programming evolves symbolic functions over the domain. The results are promising in terms of efficiency in the design of experiments and accuracy in global metamodelling.
Buffer Capacity Allocation Using Ant Colony Optimisation Algorithm
Ivan Vitanov (BAE Systems), Valentin Vitanov (Durham University) and David Keith Harrison (Glasgow Caledonian University)
This paper presents an algorithm for the near-optimal allocation of buffer space to an assembly line by means of the ant colony optimisation (ACO) paradigm. Uniquely, the algorithm has been designed to work in conjunction with a simulation model and is adapted to have both combinatorial and stochastic problem-solving capability. The simulation model was developed using the WITNESS simulation package and serves the purpose of an objective function evaluator, encapsulating the dynamics of the line and enabling the production rate for a particular buffer configuration to be determined. Finally, the WITNESS Optimiser module was used as a benchmark in validating the ACO algorithm’s performance. In the simulation experiments conducted, ACO attained slightly better results overall.
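A toy version of ACO for buffer allocation can be sketched as follows, with a stand-in analytical objective in place of the WITNESS simulation model. The pheromone scheme and all parameters are illustrative assumptions, not the authors' algorithm.

```python
import random

def aco_buffers(n_buffers, max_size, throughput, ants=20, iters=50,
                evaporation=0.1, seed=3):
    """Toy ant colony optimisation: each ant picks a size for every
    buffer, guided by pheromone[b][s]; pheromone on the best allocation
    found so far is reinforced in proportion to its throughput."""
    rng = random.Random(seed)
    pher = [[1.0] * (max_size + 1) for _ in range(n_buffers)]
    best_alloc, best_rate = None, float("-inf")
    for _ in range(iters):
        for _ant in range(ants):
            # Construct an allocation by sampling sizes pheromone-wise.
            alloc = [rng.choices(range(max_size + 1), weights=pher[b])[0]
                     for b in range(n_buffers)]
            rate = throughput(alloc)  # in the paper: a WITNESS run
            if rate > best_rate:
                best_alloc, best_rate = alloc, rate
        # Evaporate, then deposit on the best allocation found so far.
        for b in range(n_buffers):
            pher[b] = [(1 - evaporation) * p for p in pher[b]]
            pher[b][best_alloc[b]] += best_rate
    return best_alloc, best_rate
```

In the paper the objective evaluator is the simulation model itself, which is what gives the algorithm its stochastic problem-solving character; the analytical stand-in here only demonstrates the combinatorial search.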
Tuesday 3:30 PM - 5:00 PM
Chair: Lukas Kroc (Cornell University)
SessionSim: Activity-Based Session Generation for Network Simulation
Lukas Kroc (Cornell University) and Stephan Eidenbenz and James Smith (Los Alamos National Laboratory)
We present SessionSim, a tool for generating realistic communication sessions such as phone calls, HTTP traffic and email traffic. Realistic data traffic is a crucial requirement for the realism of any larger communication network simulation study. SessionSim is part of a large-scale communication network simulation environment (MIITS: Multi-scale Integrated Information and Telecommunications System), in which detailed information about the individuals in a synthetic population is available, including activities (e.g., sleep, work, lunch) and locations. The key insight behind the SessionSim modeling philosophy is that communication behavior depends heavily on the type of activity people are engaged in; besides the nature of this dependence, the key model parameters are inter-session times, source-destination pairs, and the actual data content, which determines session size or duration. We present a mix of empirical data, earlier models and intuition for determining session parameters for phone calls, HTTP and email.
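The activity-dependent session generation described above can be sketched as sampling inter-session times whose rate depends on the person's current activity. This is a simplification (an exact piecewise-rate Poisson process would use thinning across activity boundaries), and the data layout is an assumption, not the SessionSim format.

```python
import random

def activity_at(schedule, t):
    """Activity in effect at time t; schedule is a list of
    (start_time, activity) sorted by start time."""
    current = schedule[0][1]
    for start, act in schedule:
        if start <= t:
            current = act
    return current

def generate_sessions(schedule, rate_by_activity, horizon, seed=11):
    """Sample session start times with an exponential inter-session
    time whose rate follows the current activity."""
    rng = random.Random(seed)
    sessions, t = [], 0.0
    while True:
        # Rate taken from the activity in effect now (approximate:
        # the rate is not re-evaluated mid-gap at activity boundaries).
        rate = rate_by_activity[activity_at(schedule, t)]
        t += rng.expovariate(rate)
        if t >= horizon:
            return sessions
        sessions.append((t, activity_at(schedule, t)))
```

A full generator would additionally draw a source-destination pair and a content size or duration for each session, conditioned on the same activity.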
Priority-Based Routing with Strict Deadlines and Server Flexibility Under Uncertainty
Hoda Parvin (University of Michigan), Abhijit Bose (IBM T. J. Watson Research Center) and Mark P Van Oyen (University of Michigan)
In this research we present a simulation-based approach to studying alternative dynamic assignment policies in an information technology (IT) service delivery environment. Our overarching goal is to find the most cost-effective assignment of service requests to cross-trained agents in a large-scale network. We present a novel heuristic algorithm that assigns an analytically derived allocation index to each arriving service request, incorporating factors such as variability in agents' capabilities, uncertainty in request inter-arrival times, and complex service level agreements (SLAs). We investigate the effectiveness of our proposed assignment algorithm using real-world data from an IT service environment on a small problem instance. We discuss how the results of this simulation can help improve the terms of service level contracts as well as agent training programs.
Simulation of Large Wireless Sensor Networks Using Cell-DEVS
Blerim Qela (University of Ottawa), Gabriel Wainer (Carleton University) and Hussein Mouftah (University of Ottawa)
The advancement of electronic sensing devices, microcomputers and wireless communication devices has led to new smart sensor devices that can monitor, actuate, compute and communicate. Typically, sensors are deployed non-deterministically (randomly); when deployed in large numbers, these sensor devices can self-organize into so-called Wireless Sensor Networks (WSN). WSN are ad-hoc networks consisting of spatially distributed sensing and processing devices. In this paper, the objective is to simulate a large WSN by implementing a topology control algorithm using the Cell-DEVS formalism, an extension of DEVS (Discrete Event Systems Specification) that enables efficient execution of cellular automata models. The behavior of individual sensor nodes and of the entire WSN is then observed and evaluated from the simulation results obtained under different test scenarios.