Jan 27 (Thu) @ 3:30pm: "Engineering Haptic Technologies for Interaction in Virtual and Augmented Reality," Bharat Dandu, ECE PhD Defense

Date and Time
Thursday, January 27 @ 3:30pm
Location
Zoom Meeting - Meeting ID: 841 2021 3421

https://ucsb.zoom.us/j/84120213421

Abstract

A longstanding goal in engineering research has been to realize haptic displays for virtual reality that allow their users to touch and manipulate virtual objects that feel realistic. Despite decades of research, this goal remains far from being achieved. Motivated by the challenges involved, this PhD contributes new research findings on perceptual and physical mechanisms of human touch perception, and applies this knowledge to the engineering of efficient haptic technologies for virtual and augmented reality.

The first part of the PhD is motivated by the active nature of haptic perception. During haptic exploration of objects, proprioceptive information about movements of the hands and fingers is integrated with tactile sensory information elicited via touch contact to facilitate perceptual judgements about object properties or events. Surprisingly little is known about human abilities to spatially localize the fingers via proprioception. The first part of the dissertation presents findings from perceptual investigations using a novel virtual reality paradigm. The results reveal that errors in spatial localization of the fingers are surprisingly large, averaging several centimeters per finger, but are strongly refined by even partial visual cues about the positions of adjacent fingers. The results also provide critical information needed to specify performance requirements for virtual reality technologies, including hand tracking systems.

The next part of this PhD addresses another key aspect of engineering haptic interactions: that of furnishing touch sensations to the skin. A central challenge in haptic engineering is that the skin is an extended sensory medium with high spatial resolution. While many prior haptic devices have been designed to use numerous actuators, such devices are too bulky, costly, and complex for practical application. This PhD proposes a new design strategy that exploits the physics of waves in the skin. First, findings from a study on the mechanics of vibration transmission in the skin reveal how the viscoelasticity of skin causes vibrations to attenuate in a highly frequency-dependent manner with distance. This motivated the design of a new spectral method for controlling the spatial transmission of haptic feedback. This method enables even a single actuator to elicit spatially varying haptic sensations, as revealed in perceptual experiments. These new effects also proved effective for enhancing engagement in virtual reality interactions.

These findings motivated a next stage of PhD research aimed at engineering compact haptic actuators with sufficient frequency bandwidth and control to reproduce haptic effects like those described above. Achieving this goal is challenging, due to fundamental limitations arising from actuator mechanics and thermal processes. Overcoming these challenges required the development of new fluidic electromagnetic actuators that encapsulate ferrofluid, a functional material, within a multimaterial device. These actuators can outperform all existing compact haptic actuators in several respects. They are capable of controlled reproduction of mechanical signals over a frequency bandwidth from 0 Hz to more than 1000 Hz, spanning the full range of human tactile sensitivity.

The final part of the PhD presents a new approach to providing haptic feedback to extended areas of the skin via practical devices, with applications in augmented and virtual reality. It proposes a new technique for delivering spatially programmable haptic feedback via projected light, mediated by a novel optoelectronic layer worn on the skin. The resulting devices represent the first haptic virtual reality technology enabling users to “see” optical patterns via the skin. The efficiency of these devices could one day enable their practical application in extended reality systems.

Bio

Bharat Dandu received the MS degree in Electrical and Computer Engineering from the University of California, Santa Barbara (2016) and the B.Tech degree in Electrical Engineering from the Indian Institute of Technology Madras (2015). Since 2016, he has been working with Prof. Yon Visell in the RE Touch Lab at UCSB, where he is currently a PhD candidate. His research interests include haptics, perception, sensing, and applications in AR/VR.

Hosted by: Prof. Yon Visell

Submitted by: Bharat Dandu <bharatdandu@ucsb.edu>