WSC 2004 Final Abstracts
Monday 10:30 AM - 12:00 PM
Animation, Simulation, and Navigation
Chair: Thenkurussi Kesavadas (State University of New York)
General-Purpose 3D Animation with VITASCOPE
Vineet Rajendra Kamat (University of Michigan) and Julio Cesar Martinez (Virginia Polytechnic Institute and State University)
This paper presents VITASCOPE, a general-purpose, user-extensible 3D visualization system for animating processes that are modeled using discrete-event simulation tools. VITASCOPE is an ASCII-stream-driven 3D animation tool that runs readily on a wide variety of hardware platforms, ranging from commonly used desktops and laptops to high-end, room-sized, immersive virtual environments such as the CAVE. VITASCOPE parses and interprets an ASCII stream, and imports and manipulates existing 3D CAD models of the simulated system entities to visually depict modeled processes. The CAD models used may be created in any CAD modeling package capable of exporting models in the VRML format. VITASCOPE’s architecture is open and loosely coupled, allowing it to serve as a post-processed or concurrent 3D visualization engine for simulation models created in a wide variety of languages and tools. VITASCOPE is built on the industry-standard OpenGL graphics library and runs on Windows- and IRIX-based workstations.
A Scenario Generation Tool for DDF Simulation Testbeds
Govindarajan Srimathveeravalli, Navneeth Subramanian, and Thenkurussi Kesavadas (State University of New York)
An interactive tool has been developed for visualizing and creating scaled battlefield-based scenarios for use in a simulation testbed to develop and test distributed data fusion and ad-hoc networking algorithms. This paper discusses the design requirements and implementation issues for developing such a tool. Two main design goals were to enable the design of complex scenarios in an intuitive and easy fashion, and to provide a complete set of decision support utilities. The tool, called SceneGen, supports 3D visualization for creating scenarios and addresses challenges including geospatial (GIS and terrain) data management, entity information management, and waypoint/path specification. The scenario generator includes a number of sensor and target models and provides database support to manage different fusion and network algorithms. The tool was implemented and tested successfully with several sample scenarios.
A Real-Time Panoramic Vision System for Autonomous Navigation
Sumantra Dasgupta and Amarnath Banerjee (Texas A&M University)
The paper discusses a panoramic vision system for autonomous navigation purposes. It describes a method for integrating data in real time from multiple camera sources. The views from adjacent cameras are visualized together as a panorama of the scene using a modified correlation-based stitching algorithm. A separate operator is presented with a particular slice of the panorama matching the user’s viewing direction. Additionally, a simulated environment is created in which the operator can choose to augment the video by simultaneously viewing an artificial 3D view of the scene. Potential applications of this system include enhancing the quality and range of visual cues, and navigation under hostile circumstances where a direct view of the environment is not possible or desirable.
Monday 1:30 PM - 3:00 PM
Manufacturing Simulation and Product Development
Chair: Ali Akgunduz (Concordia University)
Integrating Operations Simulation Results with an Immersive Virtual Reality Environment
Gordon D. Rehn and Marco Lemessi (Deere & Company) and Judy M. Vance and Denis V. Dorozhkin (Iowa State University)
This paper demonstrates the use of immersive virtual reality environments in the investigation of operations simulation results. The authors outline the benefits offered by virtual reality over traditional two-dimensional computer interfaces such as the monitor, keyboard, and mouse. The paper describes the state of the art in operations simulation, along with the implementation details and functionality of the program developed as a result of this research. The experience of using the developed application for the analysis of a manufacturing operation is presented.
Using Dynamic Multiresolution Modelling to Analyze Large Material Flow Systems
Wilhelm Dangelmaier and Bengt Mueck (University of Paderborn)
The interactive, simulation-aided analysis of material flow systems is often done with the help of virtual reality. If a user wants to influence the simulation run, the simulation and the visualization have to be computed simultaneously. Large, detailed simulation models that cannot be computed fast enough therefore cannot be analyzed interactively. This paper presents a method that computes in detail only those parts of the simulation model that the user is currently viewing. When the user’s attention shifts, the area being simulated in detail is updated. Since the major part of the simulation is not detailed, the required computational effort is reduced. Based on models that permit multiresolution simulation, methods are introduced that regulate the level of detail by evaluating the user’s attention. Changing the level of detail leads to the exchange of models with different levels of detail.
Two-Step 3-Dimensional Sketching Tool for New Product Development
Ali Akgunduz and Hang Yu (Concordia University)
This paper discusses a two-step, virtual-reality-based conceptual design tool that enables industrial designers to create sketches of their ideas in 3-dimensional space in real time. In the developed sketching tool, the rough shapes of products are generated by tracing the trajectory of the data gloves worn by the designer. The model provides a practical solution to reduce the generation of unnecessary control points by representing each control point as a spherical volume. Once the rough sketching is completed, NURBS surfaces are constructed from a limited number of reference points that are selected from the initial sketch using a virtual pen. The two-step sketching technique enables designers to express their artistic ideas freely in an intuitive environment, and also enables them to generate parametric representations of the surfaces for use in CAD/CAM systems for further analysis.
Monday 3:30 PM - 5:00 PM
Education and Ergonomics
Chair: Sankar Jayaram (Washington State University)
Enhancing Simulation Education with a Virtual Presentation Tool
David He and Pat Banerjee (University of Illinois-Chicago)
This paper describes a research plan to address the undergraduate learning challenges encountered when teaching an introductory simulation course in Industrial Engineering programs. It outlines the implementation tasks of the research methodology, an evaluation plan, and one direction for future work.
Virtual Reality: Its Usefulness for Ergonomic Analysis
Lawrence E Whitman, Michael Jorgensen, Kuresh Hathiyari, and Don Malzahn (Wichita State University)
This paper presents the results of an effort to compare the outcomes of an experiment performed in both a virtual and a real environment. The research question addressed is whether virtual reality is a suitable tool for performing ergonomic analysis. The subjects performed a palletizing task in the virtual environment and then performed the same task in the real environment. The results showed that VR is comparable to the same experimental task in the real environment if the measurement involves only range of movement, and not velocities or accelerations. This paper presents these results using a lumbar motion monitor and proposes areas for future improvement and research.
Participatory Ergonomics Using VR Integrated with Analysis Tools
Imtiyaz Shaikh, Uma Jayaram, and Sankar Jayaram (Washington State University) and Craig Palmer (Paccar Technical Center)
This paper presents our work on the integrated use of simulation tools in real time for participatory occupational ergonomic studies. The focus of this paper is a synergistic system that consists of an interactive immersive simulation tool, developed in-house, integrated with a commercial human modeling simulation system, Jack™. The impetus of the real-time integration is to allow the complementary use of two powerful simulation tools: the user performs the task naturally in an immersive environment, while body posture information is continuously and automatically passed to the human modeling system for a continuous (rather than discrete) analysis of the participatory ergonomic issues under consideration. This facilitates the integration of ergonomic considerations early in the design and planning phases of workplace layouts, even when the physical facility does not yet exist. The proposed integration is demonstrated using a manufacturing example.