Seeing with the Hands

A sensory substitution that supports manual interactions

Our team at the University of Chicago engineered a wearable device that enables users to see with their hands when hovering over objects. This is a sensory substitution device that assists with perceiving objects by translating one modality (e.g., vision) into another (e.g., tactile), developed for the needs of Blind and Low Vision users. The user feels tactile images rendered on their hand by an electrotactile display, driven by a miniature camera mounted on the palmar side of the hand. This new perspective of seeing lets users leverage the hands' affordances: the palm faces objects to be grasped, enabling hand preshaping, and the hands are flexible, moving rapidly, reaching from multiple angles, exploring tight spaces, circling around occluded objects, and more. This opens new possibilities for supporting manual tasks in everyday life without vision.

Video showcase


A new tactile perspective designed for grasping objects

A person using their eyes to perceive their surroundings, with a blue light indicating the direction of their gaze. The same person also uses their hands to see: they reach into a cabinet wearing the device, which provides tactile feedback. An inset shows a tactile image on the back of the hand, represented by a grid of red and green dots. Typical sensory substitution interfaces are designed to substitute for the eyes, assisting with perceiving one's surroundings (e.g., for navigation). We explore a new tactile perspective with the camera on the palmar side, the side of the hand that faces objects during grasping and manipulation.
A blind person walks down outdoor stairs using a white cane. They use the hand device to look for a handrail to hold onto. Our study participants envision using our device for various manual tasks, such as finding a handrail while walking down stairs, identifying and grabbing ingredients from a shelf while cooking, and retrieving an object that fell on the floor.


How it works

A photo of a hand wearing the device, with a black strap on the wrist and golden strips on the back of the hand. A callout pointing to the palmar side of the wrist strap is labeled camera, with a zoomed-in picture of the camera. A callout pointing to the strips is labeled electrotactile display. The image under the hand, captured by the camera, is segmented and translated into tactile patterns in real time using computer vision techniques. The tactile patterns are felt on the back of the hand through a custom electrotactile display (5×6) that is flexible and thin (0.1 mm).
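To give a sense of the translation step, here is a minimal sketch in Python. It is an illustrative stand-in, not the authors' pipeline: the real system segments the object using computer vision techniques, while this sketch uses a simple brightness threshold as a placeholder, then average-pools the mask down to the display's 5×6 resolution. The camera index and threshold choice are assumptions for illustration.

```python
import cv2
import numpy as np

TACTILE_ROWS, TACTILE_COLS = 5, 6  # resolution of the electrotactile display

def frame_to_tactile(frame_bgr: np.ndarray) -> np.ndarray:
    """Downsample a camera frame to a 5x6 binary tactile pattern.

    Placeholder "segmentation": an Otsu brightness threshold stands in
    for the computer-vision segmentation used by the actual system.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Average pooling: each tactile pixel summarizes one cell of the image.
    pooled = cv2.resize(mask, (TACTILE_COLS, TACTILE_ROWS),
                        interpolation=cv2.INTER_AREA)
    return (pooled > 127).astype(np.uint8)  # 1 = stimulate this tactile pixel

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # hypothetical index for the wrist camera
    ok, frame = cap.read()
    if ok:
        print(frame_to_tactile(frame))
    cap.release()
```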
A diagram of a cross section of the skin, titled "a tactile pixel on the electrotactile display". Two electrodes are shown, labeled plus and minus respectively. An arrow points from the plus electrode to the minus electrode, labeled with a lightning icon and the words "pulse current". The arrow intersects a drawing of a tactile receptor, with more receptors drawn nearby. Each "tactile pixel" on the electrotactile display consists of a pair of electrodes. Whenever the pixel is supposed to be felt, our system passes tiny current pulses (3–5 mA) between the electrodes, which stimulate the tactile receptors under the skin, producing a sensation of light touch.
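As a rough illustration of this driving scheme, here is a minimal sketch of a pulse-train loop. The `driver` object and its `set_current_ma`, `select`, and `pulse` methods are hypothetical stand-ins for the custom stimulation hardware, and the pulse width and repetition rate below are assumed values, not specifications from this page.

```python
import time

PULSE_CURRENT_MA = 4.0   # within the 3-5 mA range described above
PULSE_WIDTH_US = 200     # assumed pulse width (not specified here)
PULSE_RATE_HZ = 100      # assumed repetition rate (not specified here)

def stimulate_pixel(driver, row: int, col: int, duration_s: float) -> None:
    """Drive one electrode pair with a train of brief current pulses.

    `driver` is a hypothetical interface to the current source:
    set_current_ma() sets the pulse amplitude, select() routes the
    current to one electrode pair, and pulse() emits a single pulse.
    """
    driver.set_current_ma(PULSE_CURRENT_MA)
    driver.select(row, col)            # route current to this tactile pixel
    period_s = 1.0 / PULSE_RATE_HZ
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        driver.pulse(PULSE_WIDTH_US)   # + electrode -> skin -> - electrode
        time.sleep(period_s)
```

A pattern from the previous sketch could then be rendered by calling stimulate_pixel for each active cell, time-multiplexed across the 5×6 display.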


Team

This is research from the Human-Computer Integration Lab at the University of Chicago by:

Shan-Yuan Teng, Gene S-H Kim, Xuanyou Liu, and Pedro Lopes

Publication

This work will be published at ACM CHI 2025. A preprint of the paper is available here (PDF).

Shan-Yuan Teng*, Gene S-H Kim*, Xuanyou Liu*, and Pedro Lopes (*equal contribution). 2025. Seeing with the Hands: A Sensory Substitution That Supports Manual Interactions. In CHI Conference on Human Factors in Computing Systems (CHI ’25), April 26–May 01, 2025, Yokohama, Japan. ACM, New York, NY, USA, 14 pages. https://doi.org/10.1145/3706598.3713419

Press kit

High-resolution photos and an unannotated video are available here.

Source code

Source code is available here.

Funding source

Google Research logo

This research is supported by a Google Award for Inclusion Research.

Human-Computer Integration Lab
University of Chicago | Computer Science