
Imagine stepping into a flight simulator and, instead of only seeing or hearing cues that your plane is banking left, you actually feel it: your body tilts as if you’re turning, guided not by physical motion but by gentle electrical signals sent to your inner ear. This is the promise of Galvanic Vestibular Stimulation (GVS), a technology already explored for enhancing motion cues in pilot training and rehabilitation. New research shows that GVS can be taken a step further, to help people relearn balance after injury or neurological conditions by tapping directly into the brain’s sense of orientation and movement.

Now, for the first time, researchers at the University of Chicago Department of Computer Science's Human Computer Integration (HCI) Lab have applied GVS to hand movement in virtual reality. Their paper, titled Vestibular Stimulation Enhances Hand Redirection, was recently presented at the 2025 ACM Symposium on User Interface Software and Technology (UIST), a globally recognized conference for advances in interface research. The first author, Kensuke Katori, is a participant in the UChicago CS Student Summer Research Fellowship Program and joined the HCI Lab under the mentorship of PhD student Yudai Tanaka and Associate Professor Pedro Lopes.


“Yudai and I explored GVS for a number of years, sometimes mildly successful and some not,” said Lopes. “Behind all our ideas was that we explored GVS for what it does well (i.e., balance)—however, Ken (Katori) took GVS into a new realm by showing it can solve other important problems in VR, such as the popular case of hand redirection!”

Traditionally, hand redirection in virtual reality has relied primarily on altering a user's visual field. In this paper, Katori and his fellow researchers present a novel technique that instead leverages GVS to subtly redirect hand movements: they demonstrate that even imperceptible electrical stimulation of the vestibular system, which is responsible for balance and spatial orientation, can alter users' perception of motion and thereby shift hand trajectories. Because the effect requires no physical force or visual manipulation, it gives VR designers a new tool that taps the body's natural sense of balance, going beyond what existing techniques allow. This has the potential to reimagine how people interact with immersive digital worlds in training, rehabilitation, gaming, and more.
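To make the baseline concrete: in the conventional visual approach the article describes, the system renders the virtual hand slightly displaced from the user's real hand, and the displacement grows over the course of a reach, so the user unknowingly steers their real hand to compensate. The sketch below is a minimal, hypothetical illustration of that visual-offset idea, not the paper's GVS method; the function name, parameters, and the linear ramp are all illustrative assumptions.

```python
def redirected_hand(real_pos, offset_dir, max_offset, progress):
    """Return the virtual hand position for a simple visual redirection.

    real_pos:   (x, y, z) tracked position of the real hand, in meters
    offset_dir: unit vector giving the direction of the visual displacement
    max_offset: largest displacement applied, in meters
    progress:   0.0 at the start of the reach, 1.0 at the target
    """
    # Ramp the offset linearly with reach progress, clamped to [0, 1],
    # so the displacement stays small enough to go unnoticed early on.
    ramp = min(max(progress, 0.0), 1.0)
    offset = ramp * max_offset
    return tuple(p + offset * d for p, d in zip(real_pos, offset_dir))

# Halfway through a reach with a 5 cm maximum offset, the virtual hand is
# drawn 2.5 cm to the user's right; the user drifts left to compensate.
print(redirected_hand((0.0, 1.0, 0.5), (1.0, 0.0, 0.0), 0.05, 0.5))
```

The paper's contribution is to achieve a comparable redirecting influence through the vestibular sense rather than through this kind of visual offset, which is why no visual manipulation is needed at all.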

“It was thrilling to see our initial insight about body balance play out in the real world,” Katori said. “When a friend tried our first prototype, they clearly had a harder time detecting the redirection, which confirmed we were onto something important. Trying it myself, the effect of the stimulation was more pronounced than I ever expected; it was a powerful reminder of how surprisingly malleable our senses can be.”

Katori was able to showcase his work to a larger audience at this year’s UIST conference.

“For our talk at UIST 2025, I focused on conveying both the elegance of this method and its potential for future development. At my advisor Pedro’s suggestion, we incorporated a live demo of the vestibular stimulation, which really captivated the audience. The best part was after the talk; people came up to us to brainstorm new use cases and share in the excitement. It created a real sense of shared discovery.”

Potential applications for this technique include VR training systems, where novices can learn correct techniques and gestures through imperceptible feedback, and rehabilitation contexts, where therapists can encourage patients to perform specific motor exercises more effectively. The underlying technology’s ability to influence physical motion in virtual spaces also hints at new immersive entertainment experiences and deeper research possibilities in embodied interaction.
