
Chris Harding, Associate Professor

Department of Geological and Atmospheric Sciences
Virtual Reality Applications Center (VRAC), Human-Computer Interaction (HCI) Program

Research Interests

My interests are in combining geoscience research and teaching with the emerging field of Virtual Reality. Besides being a faculty member in the Department of Geological and Atmospheric Sciences, I am also part of the Virtual Reality Applications Center (VRAC) and ISU's Human-Computer Interaction (HCI) program, which is housed in Howe Hall.

I am interested in building interactive geoscientific Virtual Environments that take advantage not only of the well-established visual component (e.g., large displays with stereo vision) but also employ touch and sound. Geoscientific tasks pose a special set of problems and therefore require specific approaches.

I have several years of practical experience with geoscientific virtual environments: I worked in the University of Houston's Virtual Environment Research Laboratory, where my Ph.D. thesis dealt with interactive fault modeling on bathymetric data. I was involved in VR research on large stereo displays (virtual theaters) and scientific sonification (Exxon), and on haptic interaction with volume data (Shell).

At ISU, I work on applying touch and sound to data from the GIS domain. In areas such as natural resource extraction, road planning, or landscape architecture, users often struggle to deal with several "overlapping" layers of data. My research advocates the use of touch and sound as additional sensory modalities to perceive additional, possibly invisible, data layers. For example, one might visually perceive the 3D shape of a terrain mesh, explore this terrain via touch with a 3D force-feedback device (such as the Phantom), and hear data that is draped over this terrain as pitches in a melody (shown with color in the image). More data layers could be "visualized" via additional touch/sound parameters such as friction or timbre. The Phantom is used as a 3D force-feedback mouse to interact directly with the terrain and, for example, to digitize (drape) a line feature (yellow line) on the surface. The exploration, design, and evaluation of such multi-modal VR interfaces may one day allow a form of multi-sensory, synergistic fusion of geospatial data and may provide more natural ways to comprehend and interact with complex, real-world data.
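To give a rough sense of the kind of parameter-mapping sonification described above, the short Python sketch below maps values from a data layer onto pitches. The value range, pitch range, and function names are illustrative assumptions for this sketch only, not parameters of the actual system.

# A minimal sketch of data-to-pitch mapping (parameter-mapping sonification).
# All ranges and sample values below are illustrative assumptions.

def value_to_midi_pitch(value, v_min, v_max, pitch_min=48, pitch_max=84):
    """Linearly map a data value onto a MIDI pitch number."""
    if v_max == v_min:
        return pitch_min
    t = (value - v_min) / (v_max - v_min)      # normalize to 0..1
    return round(pitch_min + t * (pitch_max - pitch_min))

def midi_to_frequency(pitch):
    """Convert a MIDI pitch number to a frequency in Hz (A4 = MIDI 69 = 440 Hz)."""
    return 440.0 * 2 ** ((pitch - 69) / 12)

# Example: sonify a hypothetical data layer sampled along a digitized line.
samples = [12.0, 15.5, 20.1, 34.7, 41.0]       # e.g., values draped over the terrain
v_min, v_max = min(samples), max(samples)
for v in samples:
    p = value_to_midi_pitch(v, v_min, v_max)
    print(f"value {v:5.1f} -> MIDI pitch {p} ({midi_to_frequency(p):.1f} Hz)")

In the same spirit, further data layers could be mapped to other sound or touch parameters (e.g., timbre or surface friction) using additional mapping functions of this kind.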

Besides helping domain scientists through touch and sound, such a multi-sensory system may help visually impaired students develop travel skills and, more generally, improve their understanding of spatial data.

Chris
