Haptic Interface


PixeLite: A Wearable Haptic Array

Previous research has shown that both high spatial (1 mm) and high temporal (>250 Hz) resolution are essential for rendering realistic virtual textures on the fingerpad. To achieve these resolutions, we are developing haptic interfaces that can produce high-frequency distributed forces within the fingerpad.
Using electroadhesion, we created PixeLite, a soft, wearable device that can produce controllable, high-bandwidth distributed lateral forces within the fingerpad. Each puck produces lateral forces that result from the change in friction when its electroadhesion is switched on.
PixeLite is 0.15 mm thick, weighs 1.00 g, and consists of a 4×4 array of electroadhesive brakes (“pucks”) that are each 1.5 mm in diameter and spaced 2.5 mm apart. The array is worn on the fingertip and slid across an electrically grounded countersurface. It can produce perceivable excitation up to 500 Hz.
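As a rough illustration of how such an array might be driven, the sketch below builds an on/off excitation waveform for each of the sixteen pucks. The sample rate, the square-wave drive, and the function names are illustrative assumptions, not the actual PixeLite controller.

```python
import numpy as np

FS = 10_000          # controller sample rate (Hz), assumed
N_ROWS, N_COLS = 4, 4

def puck_waveforms(freqs_hz, duration_s, fs=FS):
    """Return an array of shape (4, 4, T): one on/off excitation
    waveform per puck, each a square wave at its commanded frequency."""
    t = np.arange(int(duration_s * fs)) / fs
    out = np.zeros((N_ROWS, N_COLS, t.size))
    for r in range(N_ROWS):
        for c in range(N_COLS):
            f = freqs_hz[r][c]
            # square wave: electroadhesion on for half of each period
            out[r, c] = (np.sin(2 * np.pi * f * t) > 0).astype(float)
    return out

# Example: all pucks driven at 250 Hz for 10 ms
w = puck_waveforms([[250.0] * 4 for _ in range(4)], 0.01)
```

Because each puck is a brake rather than a motor, the command is a binary gate on friction; spatial texture comes from which pucks are braking as the array slides over the grounded countersurface.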

Kinesthetic Fingertip Guidance and Constraint

Touchscreens have become the main interface to many applications and devices, from vehicle dashboards and airplane cockpits to elevators and home appliances. While touchscreens add reconfigurable functionality, they often come at a cost. For example, automobile touchscreen interfaces lead to distracted driving, while touchscreens in aircraft cockpits require sustained inputs that result in pilot fatigue. More importantly, the use of touchscreens in elevators and home appliances creates a barrier for visually impaired individuals.
To address these issues, we are developing a compact haptic device for guiding a fingertip along a predefined path on a planar touch surface through the use of electromagnetic braking to induce pivoting.


This device can be used to augment touchscreens and other surfaces to provide constraints that allow for increased performance in fingertip tracking tasks during vibration-heavy interactions, or to allow touchscreen navigation through touch alone.

Active lateral force feedback on surfaces

Active lateral force feedback is essential for haptic applications in which forces on the fingertip are perpendicular to or in the same direction as finger movement. Examples include virtual shape rendering and button-click rendering. We have developed a novel device, the UltraShiver, that can provide large active lateral forces (400 mN) while operating in the ultrasonic regime.
The UltraShiver consists of a sheet of anodized aluminum excited in a compression-extension mode by piezoelectric actuators. It generates lateral forces by combining in-plane ultrasonic oscillation with out-of-plane Johnsen-Rahbek electroadhesion, both operating at about 30 kHz. The lateral force arises because friction is greater when electroadhesion is turned on than when it is turned off. The direction and magnitude of the lateral force can be adjusted by varying the phase between the in-plane oscillation and the electroadhesion.
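The phase-to-force relationship can be illustrated with a toy simulation: friction opposes the in-plane velocity, and gating electroadhesion in or out of phase with that velocity yields a nonzero time-averaged lateral force. The friction coefficients, normal force, and square-wave gating below are assumptions for illustration, not measured UltraShiver parameters.

```python
import numpy as np

def mean_lateral_force(phase_rad, mu_on=0.8, mu_off=0.2, normal_n=0.5,
                       n_samples=10_000):
    """Time-averaged lateral force over one oscillation cycle when
    electroadhesion is gated at a given phase relative to the
    in-plane velocity (all parameters are illustrative)."""
    t = np.linspace(0.0, 1.0, n_samples, endpoint=False)  # one cycle
    v = np.sin(2 * np.pi * t)                    # in-plane velocity (a.u.)
    ea_on = np.sin(2 * np.pi * t + phase_rad) > 0  # electroadhesion gate
    mu = np.where(ea_on, mu_on, mu_off)          # high vs. low friction
    force = -mu * normal_n * np.sign(v)          # friction opposes motion
    return force.mean()
```

Evaluating this at phases 0 and pi gives equal and opposite net forces, while quadrature (pi/2) averages to roughly zero, which is the sense in which phase controls both the direction and the magnitude of the rendered force.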
As a controllable lateral force source, the UltraShiver can render haptic features on a surface. Once integrated with finger-position sensing, it could render 2.5D shapes, such as a Gaussian-like potential well. With the addition of a pressing-force sensor, it can generate a button-click sensation on a flat surface without macroscopic lateral or normal motion of the surface, and localize this haptic effect to an individual finger, providing a promising method for rendering a touch-typing keyboard.



Perceptual space of virtual textures

Over the past decade, surface haptics has progressed quickly to the point where we can command highly salient friction-induced vibrations with rich frequency content. However, early attempts to create a wide library of distinct texture patterns, or to scale different characteristics of textures in an intentional way, have been somewhat haphazard. What makes a friction-modulated texture on a haptic display feel more or less rough, sticky, or fuzzy? How about “more urgent” or “warm and loving”, emotive content that would be useful for haptic communication?
We have control only over “engineered parameters”: mathematically descriptive parameters of a texture pattern that we can set or modify, such as the amplitude or frequency content of variations in friction force. We would like to determine which of these relate most strongly to “perceptual parameters”: characteristics that people actually perceive and can name. Furthermore, we’d like each parameter to be perceptually independent of the others, so that modifying its strength doesn’t have domino effects on all the rest.
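As a toy example of engineered parameters, the sketch below defines a friction-modulated texture by just two numbers, an amplitude and a spatial frequency. The parameter set and function name are hypothetical, chosen only to make the idea of a texture as a point in parameter space concrete.

```python
import numpy as np

def friction_profile(x_mm, amplitude, spatial_freq_per_mm, baseline=0.5):
    """Commanded friction coefficient vs. finger position for a
    sinusoidal friction-modulated texture; the two engineered
    parameters are the modulation amplitude and spatial frequency."""
    return baseline + amplitude * np.sin(
        2 * np.pi * spatial_freq_per_mm * x_mm)

x = np.linspace(0, 10, 1000)   # a 10 mm swipe across the surface
mu = friction_profile(x, amplitude=0.2, spatial_freq_per_mm=2.0)
```

The open research question is which such knobs map cleanly onto named percepts like roughness or stickiness, and whether they can be varied without perceptually disturbing one another.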

Eventually, these engineered parameters could define a haptic texture perceptual space, where any texture that we create lies somewhere in that space and is defined by its parameter settings. This is analogous to the color gamut, where any color is defined by just three parameters (e.g. red/green/blue values), and a particular display can cover a particular range of this gamut given what range of those parameters it can achieve.

Concise Representation of Coarse Textural Features

The storage, transfer, and reproduction of virtual textures is one aspect of realizing the “Tactile Internet”: a telecommunications infrastructure and protocol with latency low enough (e.g., <1 ms) to produce “real-time” effects. Such a system would, for the first time, allow touch feedback indistinguishable from live interaction. Meeting this goal requires virtual texture representations with a minimal data footprint that introduce no perceptual defects.
To this end, our research involves defining the perceptual information encoded by the human tactile sensory system. Previous research has determined that statistical (spectral) information is sufficient to capture the perceptual details of fine-grain textures, but this is not the case for coarse-grain textures, where spatial information is perceptually relevant during exploration. Analogous research into a concise representation of coarse textural features is currently sparse.
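The fine-texture result above can be illustrated numerically: if only spectral magnitudes matter, a profile resynthesized from the same magnitudes with randomized phases carries the same spectral statistics while having a completely different spatial layout. The random-texture model below is an assumption for illustration, not actual texture data.

```python
import numpy as np

rng = np.random.default_rng(0)
profile = rng.standard_normal(1024)        # toy fine-grain height profile

spectrum = np.abs(np.fft.rfft(profile))    # keep spectral magnitudes only

# Resynthesize with randomized phase: same magnitude spectrum,
# completely different spatial layout. DC and Nyquist phases are kept
# at zero so the inverse transform stays real-valued.
phase = rng.uniform(0.0, 2.0 * np.pi, spectrum.size)
phase[0] = phase[-1] = 0.0
resynth = np.fft.irfft(spectrum * np.exp(1j * phase), n=profile.size)
```

For fine textures the original and resynthesized profiles would be perceptually interchangeable, so storing `spectrum` alone roughly halves the data; for coarse textures this phase discarding is exactly what fails, since spatial layout then matters.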

The goal of this research is to understand human tactile perceptual limits, enabling the creation of a virtual texture codec. This algorithm will enable the efficient transfer of textures containing both fine and coarse features by eliminating all aspects that are not perceptually relevant.

Capture, playback, and enhancement of tactile texture

Our interaction with the physical world through touch is incredibly complex. Unlike vision and audition, the tactile sense is an active one: the exact manner of touch, such as pressing force or scanning velocity, drastically changes the excitation of the sensory periphery. Remarkably, our perception stays intact in the face of these varying conditions. The goal of this research is to mitigate the effects of the complex physics of touch and produce a controllable, salient effect on a completely flat surface that closely mimics one we encounter in everyday life, such as stroking a textile fabric. Imagine the transformative power of producing tactile renderings on a smartphone to go along with images and audio!
Recently, we developed a method that allows us to vary a surface feature most closely described as “stickiness”. This is done by controlling the rate at which the finger pad evolves from a stuck to a slipping state during transient finger kinematics. Stickiness, one of the three most salient surface features, can now be adjusted with a single knob and added to any displayed texture. Once a finger is in steady-state motion against the haptic surface, a recently developed closed-loop application of electrostatic forces can generate a controllable, wide-bandwidth friction profile. In other words, the forces one feels when interacting with a given surface can now be reliably replicated on a digitally programmable one. The key now is to determine whether this replication is perceptually sufficient, and to provide a physical explanation for the answer.
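The closed-loop idea can be sketched abstractly as a feedback loop that drives a measured friction force toward a desired profile. The linear plant model and PI gains below are placeholder assumptions for illustration, not the actual electrostatic controller.

```python
def track_friction(desired, plant_gain=2.0, kp=0.1, ki=0.1):
    """Discrete PI loop: adjust the electroadhesion command so the
    (simulated) friction force tracks a desired profile. The plant is
    a toy linear gain; returns the measured-friction trajectory."""
    cmd, integ, measured = 0.0, 0.0, []
    for target in desired:
        actual = plant_gain * cmd      # toy plant: force ~ command
        err = target - actual
        integ += err                   # integral term removes offset
        cmd = kp * err + ki * integ
        measured.append(actual)
    return measured

# Track a constant friction setpoint over 50 control steps
out = track_friction([1.0] * 50)
```

With a wide-bandwidth version of such a loop, the `desired` sequence would be a recorded friction profile from a real surface, which is what makes capture-and-playback of texture possible on a flat display.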