Seeing with the Hands
A sensory substitution device that supports manual interactions
Our team at the University of Chicago engineered a wearable device that enables users to see with their hands when hovering over objects. It is a sensory substitution device that assists with perceiving objects by translating one modality (e.g., vision) into another (e.g., tactile), developed to meet the needs of Blind and Low Vision users. The user feels tactile images, captured by a miniature camera mounted on the palmar side of the hand, rendered on the hand via an electrotactile display. This new perspective of seeing lets users leverage the hands' affordances (the hand faces the object it is about to grasp, enabling hand preshaping) and flexibility (hands move rapidly, reach from multiple angles, explore tight spaces, and circle around occluded objects). This opens up new possibilities for supporting everyday manual tasks without vision.
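To make the camera-to-tactile pipeline concrete, here is a minimal sketch of one way a palm camera frame could be mapped to per-electrode stimulation intensities. The 8x8 electrode grid, the edge-based encoding, and the OpenCV-based capture are illustrative assumptions for this sketch, not the device's actual implementation.

```python
import numpy as np
import cv2  # OpenCV for camera capture and image processing

# Hypothetical electrode grid size; the real display's resolution may differ.
GRID_ROWS, GRID_COLS = 8, 8

def frame_to_tactile(frame: np.ndarray) -> np.ndarray:
    """Convert a camera frame into per-electrode stimulation intensities in [0, 1]."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Emphasize object contours, since edges are easier to feel than flat regions.
    edges = cv2.Canny(gray, 50, 150)
    # Downsample to the electrode grid: each cell averages the edge energy it covers.
    small = cv2.resize(edges, (GRID_COLS, GRID_ROWS), interpolation=cv2.INTER_AREA)
    return small.astype(np.float32) / 255.0

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # stand-in for the palm-mounted miniature camera
    ok, frame = cap.read()
    if ok:
        intensities = frame_to_tactile(frame)
        print(intensities.round(2))  # 8x8 grid of stimulation levels for the display
    cap.release()
```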
Video showcase
A new tactile perspective designed for grasping objects


How it works


Team
This is research from the Human-Computer Integration Lab at the University of Chicago by:
Publication
This work will be published at ACM CHI 2025. A preprint of the paper is available here (PDF).
Press kit
High-resolution photos and an unannotated video are available here.
Source code
Source code is available here.
Funding source
This research is supported by a Google Award for Inclusion Research.
University of Chicago | Computer Science