
WSC 2009 Final Abstracts


Applications - Case Study Track


Monday 10:30 AM - 12:00 PM
Automotive Manufacturing

Chair: Ed Williams (PMC)

Simulation & Theory of Constraints: Developing a Throughput Improvement Roadmap for an Engine Core Block Manufacturing Facility
Marcelo Zottolo and Karthik Vasudevan (PMC), Sandeep Soni (PMI) and Edward Williams (PMC)

Abstract:
For over two decades, simulation has been used as a decision analysis tool to experiment with ‘what-if’ scenarios. A classic example of how simulation models can be used effectively for analyzing the cost-benefit of capital investment decisions in a manufacturing plant is presented here. This case also discusses how the theory of constraints is used in conjunction with simulation to develop a throughput improvement roadmap that summarizes the action steps required to achieve target throughput from the plant. A buffer capacity optimization was also conducted. Three engine core block manufacturing lines were modeled and analyzed as part of this project.
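The abstract does not give model details, so the sketch below is only a generic, hypothetical illustration of the kind of buffer-capacity question a theory-of-constraints roadmap raises: a three-station serial line with finite interstation buffers, simulated in plain Python, with throughput re-estimated for several buffer capacities. All station rates and buffer sizes are assumptions, not values from the case study.

```python
import random

def simulate(buffer_cap, horizon=200_000, seed=1):
    """Toy three-station serial line with finite interstation buffers.
    Hypothetical geometric processing times; station 1 is never starved and
    station 3 is never blocked. Returns throughput in parts per time step."""
    random.seed(seed)
    p_finish = [0.25, 0.20, 0.22]   # hypothetical per-step completion probabilities
    holding = [True, False, False]  # station i currently holds a part
    done = [False, False, False]    # that part is finished, waiting to move on
    buffers = [0, 0]                # queues feeding stations 2 and 3
    shipped = 0

    for _ in range(horizon):
        # 1) advance work in progress
        for i in range(3):
            if holding[i] and not done[i] and random.random() < p_finish[i]:
                done[i] = True
        # 2) move finished parts downstream (last station first)
        if done[2]:
            shipped += 1
            holding[2] = done[2] = False
        for i in (1, 0):
            if done[i] and buffers[i] < buffer_cap:
                buffers[i] += 1                    # part leaves the station
                holding[i] = done[i] = False
            # else: the station stays blocked, holding its finished part
        # 3) pull new work into idle stations
        for i in (2, 1):
            if not holding[i] and buffers[i - 1] > 0:
                buffers[i - 1] -= 1
                holding[i] = True
        if not holding[0]:
            holding[0] = True                      # raw material always available

    return shipped / horizon

for cap in (1, 2, 5, 10):
    print(f"buffer capacity {cap:2d}: throughput ~ {simulate(cap):.3f} parts/step")
```

As the buffers grow, throughput approaches the rate of the slowest (constraint) station, which is the effect a theory-of-constraints roadmap sets out to exploit.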

Simulation Analyzes Deadlock Concerns in Automotive Manufacture
Edward J. Williams, Dominic Baffo, and Onur M. Ülgen (PMC)

Abstract:
Discrete-event process simulation is a decades-long friend of the industrial, process, or production engineer analyzing complex manufacturing processes replete with heavy capital investment, complex material-handling requirements, and the need for flexibility in the face of highly volatile marketplace demands. When these challenges are coupled with the economic gauntlet confronting the automotive industry, particularly within the United States, the importance of achieving efficiency and continuous improvement demands the support of simulation analyses even more urgently. In this case study, we examine the contribution of simulation not only to study the key performance metrics of throughput and equipment utilization, but also to predict and ameliorate the risk of expensive, disruptive deadlocks of material-handling equipment.
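The abstract does not state how deadlock risk was modeled; one common, generic way to reason about deadlocks of material-handling equipment is to look for cycles in a wait-for graph (carrier A waits on space held by carrier B, and so on). The hypothetical Python sketch below illustrates that check; it is not the authors' method.

```python
def has_deadlock(wait_for):
    """Detect circular waiting (a deadlock) as a cycle in a wait-for graph.
    `wait_for` maps each carrier/zone to the set of carriers/zones it is
    currently waiting on."""
    visiting, finished = set(), set()

    def dfs(node):
        if node in visiting:
            return True            # back edge: circular wait
        if node in finished:
            return False
        visiting.add(node)
        for nxt in wait_for.get(node, ()):
            if dfs(nxt):
                return True
        visiting.remove(node)
        finished.add(node)
        return False

    return any(dfs(node) for node in wait_for)

# Hypothetical snapshot: AGV1 waits on a zone held by AGV2, AGV2 on AGV3,
# and AGV3 on the zone AGV1 occupies -- a circular wait.
snapshot = {"AGV1": {"AGV2"}, "AGV2": {"AGV3"}, "AGV3": {"AGV1"}}
print(has_deadlock(snapshot))      # True
```

Running such a check against a simulated snapshot of carriers and claimed zones is one way a model can flag deadlock-prone configurations before they occur on the floor.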

Work Measurement Techniques Used to Avoid Simulation Pitfalls
Justin A. Clark, Ravindra Lote, and Edward Williams (PMC)

Abstract:
Simulation is an invaluable tool for analyzing situations that are too complex for queuing theory, closed-form equations, or probabilistic analysis techniques. However, simulation is a double-edged sword: if misused or applied with weak scientific discipline, it can lead to misleading insights. The accuracy of simulation results depends heavily on the quality of input data as well as the methodology used to build, run, and analyze simulation models. This presentation sheds light on how a combination of stop-watch study, pre-determined motion time systems techniques such as MODAPTS®, and work sampling study can be effectively used to:
• Identify interdependencies of important system parameters
• Critically assess simulation assumptions
• Verify simulation input
• Validate simulation output
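As a minimal, hypothetical illustration of the work-sampling element (not taken from the presentation), the following Python sketch turns random busy/idle observations into a utilization estimate with a normal-approximation 95% confidence interval, which can then be compared against the utilization a simulation model reports.

```python
import math

def work_sampling_ci(busy_observations, total_observations, z=1.96):
    """Proportion of time a resource was observed busy, with a
    normal-approximation confidence interval (95% for z = 1.96)."""
    p_hat = busy_observations / total_observations
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / total_observations)
    return p_hat, max(0.0, p_hat - half_width), min(1.0, p_hat + half_width)

# Hypothetical study: an operator was busy in 312 of 400 random observations.
p, low, high = work_sampling_ci(312, 400)
print(f"observed utilization {p:.1%} (95% CI {low:.1%} to {high:.1%})")
# A simulated utilization falling outside this interval is a prompt to revisit
# the model's input data or assumptions.
```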

Monday 1:30 PM - 3:00 PM
Warehousing Operations

Chair: Darrell Starks (Rockwell Automation)

Waco High Rise Warehouse Operations
Martin Watkins (Army Air Force Exchange Service) and Darrell Starks (Rockwell Automation)

Abstract:
The Army Air Force Exchange Service (AAFES) used Arena to model the Waco High Rise (HR) Operations. The HR contains twelve aisles with left and right sections. There is a left, center, and right aisle perpendicular to the HR aisles, which fork trucks use to bring in and remove product. Each aisle contains a turret which “puts away” incoming product and “replenishes” outbound product. The putting-away and replenishing processes are called putaways and replenishments respectively. Model designs:
• “AS IS”: Replenishments are completed, then putaways
• One for One: Replenishments and putaways occur simultaneously
• Different product types
• Varying number of fork trucks
The information contained in this paper is presented for scientific, technical and educational purposes only. It is not an endorsement, express or implied, by the writer, the Army & Air Force Exchange Service, or the Department of Defense, of a specific company’s products or services.

A Top Down and Bottom Up Strategy For NGES BWI Facility Simulation Study
Haiping Xu (Northrop Grumman Corporation Electronic System)

Abstract:
The BWI facility is the major manufacturing site for Northrop Grumman Corporation Electronic System (NGES). It has been experiencing production capacity expansion, equipment relocation, and streamlining of its production lines. Demand for simulation studies of manufacturing processes and whole-factory activity is extremely high across the entire spectrum: detailed simulation models are requested by cell managers and process engineers, while sector VPs are interested in an integrated high-level model covering all manufacturing activities for the whole facility. In order to meet the demands from both sides with limited simulation resources, the BWI simulation team created a Top Down and Bottom Up strategy. The team starts by building a black-box, high-level facility model to answer upper management's requests, together with detailed cell and line models to fulfill the requests of cell managers and process engineers. The high-level black boxes are then progressively replaced by the detailed models.
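A generic way to realize such a top-down/bottom-up swap, not necessarily the BWI team's implementation, is to give black-box and detailed cell models the same interface so the facility-level model never changes when detail becomes available. The Python sketch below, with entirely hypothetical cycle times, illustrates the pattern.

```python
import random
from dataclasses import dataclass

@dataclass
class BlackBoxCell:
    """Top-down placeholder: a single aggregate cycle time per part."""
    mean_cycle: float                      # hypothetical minutes per part
    def process_time(self) -> float:
        return random.expovariate(1.0 / self.mean_cycle)

@dataclass
class DetailedCell:
    """Bottom-up replacement: explicit stations, same interface as the
    black box, so the facility model is untouched when it is swapped in."""
    station_means: tuple                   # hypothetical minutes per station
    def process_time(self) -> float:
        return sum(random.expovariate(1.0 / m) for m in self.station_means)

def facility_rate(cells, samples=20_000):
    """Crude facility estimate: the slowest cell paces a serial line."""
    means = [sum(c.process_time() for _ in range(samples)) / samples for c in cells]
    return 1.0 / max(means)

random.seed(11)
facility = [BlackBoxCell(12.0), BlackBoxCell(9.5), BlackBoxCell(11.0)]
print(f"all black boxes:   {facility_rate(facility):.4f} parts/min")
facility[1] = DetailedCell((3.0, 2.5, 4.2))    # detailed model now available
print(f"one cell detailed: {facility_rate(facility):.4f} parts/min")
```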

Arena Simulation of Textile Dyeing and Finishing Operations
Darrell Starks (Rockwell Automation) and Ben Martin and David Hawkins (Hanesbrands Inc)

Abstract:
Rolls of fabric started in the greige inventory. Rolls were selected for the dyeing/bleaching operation based on a dye schedule. Rolls were formed into lots composed of port loads at a batching operation. Lots moved to the dye jet queue in front of each dye jet. Once all ports were loaded, the dyeing operation began. After dyeing completed, the wet material moved to the pad/extracting process where the liquid was removed. The material then moved to the drying area and then into the compacting area. Rolls then moved to an inspection area where the rolls were quality checked and reviewed for lot completion before moving into the finished fabric inventory area. The Arena model was designed to evaluate the following:
• Impact of changing queue capacities
• Impact of changing lot sizes
• Impact of changing the dye jet capacities
• Impact of different dyeing schedules

Monday 3:30 PM - 5:00 PM
Service and Manufacturing Operations

Chair: Cecilia Temponi (Texas State University - San Marcos)

Simulation of Single Piece Flow in a Service Operations Environment
Ximena McKenna (Flextronics) and Jesus Jimenez (Texas State University)

Abstract:
Accurate resource allocation and layout optimization are great challenges in manufacturing systems that do not follow a traditional sequential process flow, especially in those systems in which the end product is not a “unit” but a “service”. This case study shows the implementation of single piece flow in a service operations environment within the electronics manufacturing industry. The system under study is characterized by high variability due to highly labor-intensive operations and unpredictable diagnosis and repair cycle times. A simulation model, built in AutoMod, was used to redesign the layout, improve material flow, and reduce non-value-added service activities. Simulation results of the case study will be reported at the conference.

Practical Applications of Variable Rate Processing in Simulation Models at Mimeo.com
Paul Babin (Paul Babin Consulting) and Allen G. Greenwood (Mississippi State University)

Abstract:
We have been using discrete event simulation to develop capacity planning and operations support models at Mimeo.com over the last four years. Individual models have been described in WSC case study presentations each year. This presentation focuses on the practical application of Variable Rate Processing across a number of those simulation models. In Variable Rate Processing, the queueing model is enhanced by varying the service rate (or sometimes the arrival rate) based on the number of entities in queue at that time. This gives a more realistic representation of queue-length variability and has proven useful in capacity models, flow demonstration models, and order routing models.
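As a hypothetical illustration of the idea described above (the rate table and parameters are invented, not Mimeo.com's), the Python sketch below runs a single-server queue in which each service time is drawn using a rate looked up from the number of entities present at the moment service starts.

```python
import random

def service_rate(n_in_system):
    """Hypothetical variable-rate lookup: service speeds up as the backlog grows."""
    if n_in_system <= 2:
        return 1.0      # jobs per hour
    if n_in_system <= 10:
        return 1.5
    return 2.0

def simulate(arrival_rate=1.2, horizon=50_000.0, seed=7):
    """Single server; n counts every job present, including the one in service.
    Each service time is drawn when the job starts, at the rate then in effect."""
    random.seed(seed)
    now, n, completed, area_under_n = 0.0, 0, 0, 0.0
    next_arrival = random.expovariate(arrival_rate)
    next_departure = float("inf")          # server idle
    while now < horizon:
        t_next = min(next_arrival, next_departure)
        area_under_n += n * (t_next - now)
        now = t_next
        if next_arrival <= next_departure:              # arrival event
            n += 1
            next_arrival = now + random.expovariate(arrival_rate)
            if next_departure == float("inf"):          # server was idle
                next_departure = now + random.expovariate(service_rate(n))
        else:                                           # departure event
            n -= 1
            completed += 1
            next_departure = (now + random.expovariate(service_rate(n))
                              if n > 0 else float("inf"))
    return completed / now, area_under_n / now

rate, avg_n = simulate()
print(f"throughput {rate:.2f} jobs/hr, time-average number in system {avg_n:.2f}")
```

Compared with a fixed-rate server, the state-dependent rate keeps long queues from growing without bound, which is the behavior the presentation attributes to the real operations.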

Applications of Value Stream Mapping for Discrete Event Simulation
Jon Fournier (CT Center for Advanced Technology) and Swee Leong (NIST)

Abstract:
This presentation will show simulation projects conducted by CCAT with local Small and Medium Manufacturers. The projects showcase the use of a custom tool for automatically converting eVSM(tm) Value Stream Maps into DELMIA QUEST models. Also to be shown is CCAT's use of the Core Manufacturing Simulation Data (CMSD) simulation data interchange format with Value Stream Mapping.

Tuesday 8:30 AM - 10:00 AM
Flow and Traffic Models

Chair: Ximena McKenna (Flextronics)

The Prototype Agent-based Model For Policy Evaluation in Hawaii’s Longline Fishery
PingSun Leung and Run Yu (University of Hawaii at Manoa), Minling Pan (National Marine Fisheries Services, NOAA) and Steve Railsback (Lang, Railsback & Associates)

Abstract:
This case study describes a prototype fishery management model of Hawaii's longline fishery developed using the agent-based modeling approach. The model simulates the daily fishing activities of 120 Hawaii longline vessels of diverse characteristics. Following the strategy of pattern-oriented modeling (POM), we use the spatio-temporal distribution pattern of fishing efforts to calibrate the model. We then use the calibrated model to evaluate three alternative fishery regulatory policies in Hawaii's longline fishery: 1) no regulation; 2) an annual cap of 17 turtle interactions; and 3) closing the north central area year round, with respect to their impacts on fishing productivity and by-catch of protected sea turtles. The prototype model, constructed using 1999 data, appears to be able to capture the responses of the fishery to these alternative regulations reasonably well, suggesting its potential as a management tool for policy evaluation in Hawaii's longline fishery.
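The policy mechanics are summarized only briefly in the abstract; as a deliberately oversimplified, hypothetical illustration of the second policy (the annual cap of 17 turtle interactions), the Python sketch below closes a toy fishery for the rest of the year once the fleet-wide interaction count reaches the cap. The per-trip interaction probability and one-trip-per-vessel-per-day assumption are invented for illustration and are not from the prototype model.

```python
import random

def simulate_year(n_vessels=120, p_interaction=0.0004, cap=17, seed=3):
    """One toy year under an annual cap: the fleet fishes day by day until
    the cumulative count of turtle interactions reaches the cap, at which
    point the fishery closes for the remainder of the year."""
    random.seed(seed)
    interactions, vessel_days = 0, 0
    for day in range(365):
        if interactions >= cap:
            return vessel_days, interactions, day     # closed on this day
        for _ in range(n_vessels):
            vessel_days += 1
            if random.random() < p_interaction:
                interactions += 1
    return vessel_days, interactions, None            # cap never reached

effort, hits, closure_day = simulate_year()
status = f"closed on day {closure_day}" if closure_day is not None else "open all year"
print(f"{effort} vessel-days of effort, {hits} interactions, fishery {status}")
```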

Simulating Large-scale Traffic Flow with Millions of Driver Agents
Sei Kato and Hideyuki Mizuta (IBM Research - Tokyo), Gaku Yamamoto (IBM Software Development Laboratory) and Hideki Tai (IBM Research - Tokyo)

Abstract:
Simulating large-scale traffic flow at the microscopic level is a fundamental technology for managing traffic flow in a large city. In order to simulate traffic situations with larger numbers of vehicles, we have developed a large-scale microscopic traffic simulator that can handle millions of vehicles by applying thread-pooling and distributed-computing technologies. The results of traffic-volume observations made in the experimental program conducted by the City of Kyoto were compared with the results of our simulation. The results indicate that our simulator exhibits good reproducibility, which means that the simulator can estimate vehicle CO2 emissions with high precision. The performance evaluation results show the possibility of large-scale, super-real-time traffic flow simulation.
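The simulator itself is IBM's; the Python sketch below is only a generic illustration of the thread-pooling pattern the abstract mentions: the agent population is partitioned into chunks and each chunk is updated by a worker from a thread pool. The driver behavior shown is a placeholder, not a real car-following model.

```python
import random
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class DriverAgent:
    position: float   # metres along a road link (hypothetical 1-D state)
    speed: float      # metres per second

def update_chunk(agents, dt=1.0):
    """Advance one partition of agents by one time step. A real car-following
    model would react to the leader's gap; here each driver just jitters its
    speed, purely to exercise the pooling pattern."""
    for a in agents:
        a.speed = max(0.0, a.speed + random.uniform(-0.5, 0.5))
        a.position += a.speed * dt
    return len(agents)

def step_all(agents, pool, n_chunks=16):
    """Split the agent population into chunks and update them on the pool."""
    size = (len(agents) + n_chunks - 1) // n_chunks
    chunks = [agents[i:i + size] for i in range(0, len(agents), size)]
    return sum(pool.map(update_chunk, chunks))

agents = [DriverAgent(position=0.0, speed=random.uniform(5, 15)) for _ in range(100_000)]
with ThreadPoolExecutor(max_workers=8) as pool:
    for _ in range(10):                      # ten simulated seconds
        updated = step_all(agents, pool)
print(f"{updated} agents updated per step")
# Note: in CPython the GIL limits the speedup for pure-Python agent code; the
# pattern pays off when per-agent work releases the GIL or when chunks are
# farmed out to separate processes or machines, in the spirit of the
# distributed-computing approach the abstract describes.
```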

Prediction is Key Enabler for Advanced Scheduling
Michael Anderson and Daniel Muller (Applied Materials)

Abstract:
In complex manufacturing processes, the use of traditional heuristic rule-based dispatching systems is difficult and often unmanageable. Although heuristic rule-based systems are simple to deploy and fast in execution, their ability to evaluate alternate scheduling strategies is limited, and optimizing techniques often fail when applied in industry. The challenges are compounded by the highly dynamic environment that exists in today's manufacturing facilities, where the controlling entities of the scheduling problem are continually changing states. In research toward a short-interval scheduling solution for a 300mm facility, a solution based on constraint programming has been developed. This solution is unique because it applies prediction methods, through the use of simulation, to the short-interval scheduling problem and generates near-optimal solutions in real time. This presentation focuses on the role and benefits of prediction in the short-term scheduling solution.
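The actual solution combines constraint programming with simulation-based prediction; the hypothetical Python sketch below illustrates only the prediction side: Monte Carlo simulation of each tool's committed work predicts when it will finish, and a simple dispatch rule then uses those predictions. It is a stand-in for illustration, not the Applied Materials scheduler, and all tool names and process times are invented.

```python
import random

def predict_free_time(now, committed_lots, mean_process_time, n_reps=200):
    """Monte Carlo prediction of when a tool finishes its committed work,
    using a hypothetical exponential process-time model."""
    estimates = []
    for _ in range(n_reps):
        t = now
        for _lot in range(committed_lots):
            t += random.expovariate(1.0 / mean_process_time)
        estimates.append(t)
    return sum(estimates) / n_reps

def dispatch(now, tools, new_lot_process_time):
    """Pick the tool with the earliest predicted completion of the new lot."""
    best_tool, best_finish = None, float("inf")
    for name, (committed, mean_pt) in tools.items():
        finish = predict_free_time(now, committed, mean_pt) + new_lot_process_time
        if finish < best_finish:
            best_tool, best_finish = name, finish
    return best_tool, best_finish

# Hypothetical 300mm tool group: (lots already committed, mean process time in minutes)
tools = {"LITHO_01": (3, 45.0), "LITHO_02": (1, 60.0), "LITHO_03": (2, 50.0)}
tool, eta = dispatch(now=0.0, tools=tools, new_lot_process_time=45.0)
print(f"dispatch next lot to {tool}; predicted completion ~ {eta:.0f} min")
```

In the presented solution, predictions of this kind feed a constraint-programming model rather than a greedy rule, but the sketch shows why forward-looking estimates change the dispatch decision relative to a purely state-based heuristic.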