Development of a virtual laboratory for the study of complex human behavior


dc.contributor.author Pelz, Jeff en_US
dc.contributor.author Hayhoe, Mary en_US
dc.contributor.author Ballard, Dana en_US
dc.contributor.author Shrivastava, Anurag en_US
dc.contributor.author Bayliss, Jessica en_US
dc.contributor.author von der Heyde, Markus en_US
dc.date.accessioned 2006-12-18T18:06:37Z en_US
dc.date.available 2006-12-18T18:06:37Z en_US
dc.date.issued 1999-05 en_US
dc.identifier.citation Stereoscopic Displays and Virtual Reality Systems VI, Proc. SPIE 3639 (1999), 416-426 en_US
dc.identifier.isbn 0-8194-3110-9 en_US
dc.identifier.uri http://hdl.handle.net/1850/3160 en_US
dc.description "Development of a virtual laboratory for the study of complex human behavior," Copyright 1999 Society of Photo-Optical Instrumentation Engineers. This paper was published in the Proceedings of Stereoscopic Displays and Virtual Reality Systems VI, SPIE vol. 3639, and is made available as an electronic reprint with permission of SPIE. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited. en_US
dc.description.abstract The study of human perception has evolved from examining simple tasks executed in reduced laboratory conditions to the examination of complex, real-world behaviors. Virtual environments represent the next evolutionary step by allowing full stimulus control and repeatability for human subjects, and by providing a testbed for evaluating models of human behavior. Visual resolution varies dramatically across the visual field, dropping orders of magnitude from central to peripheral vision. Humans move their gaze about a scene several times every second, projecting task-critical areas of the scene onto the central retina. These eye movements are made even when the immediate task does not require high spatial resolution. Such “attentionally-driven” eye movements are important because they provide an externally observable marker of the way subjects deploy their attention while performing complex, real-world tasks. Tracking subjects’ eye movements while they perform complex tasks in virtual environments provides a window into perception. In addition to the ability to track subjects’ eyes in virtual environments, concurrent EEG recording provides a further indicator of cognitive state. We have developed a virtual reality laboratory in which head-mounted displays (HMDs) are instrumented with infrared video-based eyetrackers to monitor subjects’ eye movements while they perform a range of complex tasks such as driving and manual tasks requiring careful eye-hand coordination. A go-kart mounted on a 6DOF motion platform provides kinesthetic feedback to subjects as they drive through a virtual town; a dual-haptic interface consisting of two SensAble Phantom extended range devices allows free motion and realistic force feedback within a 1 m³ volume. en_US
dc.description.sponsorship This work was supported in part by NIH Resource Grant P41 RR09283, EY05729, and an RIT College of Science Project Initiation Grant. en_US
dc.format.extent 373427 bytes en_US
dc.format.mimetype application/pdf en_US
dc.language.iso en_US en_US
dc.publisher Society of Photo-Optical Instrumentation Engineers (SPIE) en_US
dc.relation.ispartofseries vol. 3639 en_US
dc.subject Driving simulator en_US
dc.subject EEG en_US
dc.subject Eye tracking en_US
dc.subject Haptic interface en_US
dc.subject Motion platform en_US
dc.subject Virtual reality en_US
dc.title Development of a virtual laboratory for the study of complex human behavior en_US
dc.type Proceedings en_US
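
The abstract above describes monitoring eye movements with an eyetracker mounted inside a tracked HMD. A central step in any such setup is composing the eye-in-head gaze direction reported by the eyetracker with the head pose reported by the HMD tracker to obtain a gaze ray in world coordinates. The sketch below illustrates that composition with a simple rotation-matrix model; the function names, axis conventions, and angle conventions are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

def rotation_from_euler(yaw, pitch, roll):
    """Rotation matrix (world <- head) from yaw/pitch/roll in radians.
    Convention assumed here: z up, yaw about z, pitch about y, roll about x."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def eye_direction_in_head(h_angle, v_angle):
    """Unit gaze vector in head coordinates from the eyetracker's horizontal
    and vertical gaze angles (radians). Head frame assumed: x forward,
    y left, z up; (0, 0) is straight ahead."""
    return np.array([
        np.cos(v_angle) * np.cos(h_angle),
        np.cos(v_angle) * np.sin(h_angle),
        np.sin(v_angle),
    ])

def gaze_in_world(head_position, head_euler, h_angle, v_angle):
    """Compose head pose and eye-in-head direction into a world-space
    gaze ray, returned as (origin, unit direction)."""
    R = rotation_from_euler(*head_euler)
    direction = R @ eye_direction_in_head(h_angle, v_angle)
    return np.asarray(head_position, dtype=float), direction

if __name__ == "__main__":
    # Head at (0, 0, 1.2) m, turned 30 deg left; eye looking 5 deg right, 10 deg down.
    origin, direction = gaze_in_world(
        head_position=(0.0, 0.0, 1.2),
        head_euler=(np.radians(30), 0.0, 0.0),
        h_angle=np.radians(-5),
        v_angle=np.radians(-10),
    )
    print("gaze origin:", origin, "gaze direction:", direction)
```

In a real HMD pipeline the resulting gaze ray would then be intersected with the rendered scene geometry to identify the fixated object; that step, and the eyetracker's own calibration, are beyond this sketch.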

Files in this item

JPelzConfProc05-1999.pdf (373.4 KB, PDF)
