Date & Time:
May 17, 2022 10:30 am – 11:30 am
Location:
Crerar 298, 5730 S. Ellis Ave., Chicago, IL

Fostering inclusion in mixed-visual-ability classrooms using social robots and multimodal feedback.

Visually impaired children are increasingly included in mainstream schools as part of inclusive education practice. However, even though they sit side by side with their sighted peers, they face barriers to classroom participation, a lack of collaborative learning, reduced social engagement, and a high risk of isolation. Robots have the potential to act as intelligent, accessible social agents that actively promote inclusive classroom experiences by leveraging their physicality, bespoke social behaviors, sensors, and multimodal feedback. However, the design of social robots for mixed-visual-ability classrooms remains largely unexplored. In this presentation, we will share our research exploring robot multimodal feedback, such as movement, speech, and textures, to foster inclusion in group activities among children with mixed visual abilities.

Expressing internal and emotional states is at the core of Human-Robot Interaction (HRI), and many social robots are designed to convey their states not only with speech but also through nonverbal signals like facial expressions and body movements. To enable emotion expression in a manner applicable to different robot configurations, we developed an expressive channel for social robots in the form of texture changes on a soft skin. Our approach is inspired by biological systems that alter skin textures to express emotional states, such as human goosebumps and the raised back fur of cats. Adding such expressive textures to social robots can enrich a robot's expressive spectrum: it can interact both visually and haptically, and even communicate silently, for example in low-visibility scenarios. The soft robotic skin generates pneumatically actuated dynamic textures, deforming in response to changes in pressure inside fluidic chambers. In the presentation, we will share the mechanical design of these robotic textures as well as user studies mapping texture changes to different emotional states.

Host: Sarah Sebo

This event will be both in-person and remote. For Zoom information, sign up for the HCI Club mailing list.

Speakers

Isabel Neto

Ph.D. Student, Computer Science and Engineering, University of Lisbon

I am a Ph.D. student in Computer Science and Engineering at the University of Lisbon and a researcher in GAIPS (Research Group on AI for People and the Society) at INESC-ID and in the Interactive Technologies Institute / LARSyS, with a focus on social agents for inclusion in schools.

My research focuses on novel experiences among mixed-visual-ability children, exploring group engagement in human-robot interactions, improving accessibility, and valuing individual abilities to create inclusive experiences.

I have 20+ years of experience in senior leadership, design, business, people, and change management in the IT and telecommunications industry, spanning technical and management roles in digital, customer operations, and IT. I have collaborated with many talented people, bringing long experience in a customer-centric approach and leading design-thinking methods and agile teams to build and develop personalized, ethical experiences across digital and assisted channels.

I aim to use artificial intelligence, design thinking, and service design to shape great experiences, products, and services.

Yuhan Hu

Ph.D. Student, Cornell University

Yuhan Hu is a third-year Ph.D. student in the Human-Robot Collaboration & Companionship (HRC2) Lab at Cornell, working on ShadowSense and human-robot collaborative swarms. She previously worked on texture-changing social robotics.

Related News & Events

Video

Inside The Lab: How Can Robots Improve Our Lives?

Oct 27, 2025
UChicago CS News

Shape n’ Swarm: Hands-On, Shape-Aware Generative Authoring for Swarm User Interfaces Wins Best Demo at UIST 2025

Oct 22, 2025
UChicago CS News

Redirecting Hands in Virtual Reality With Galvanic Vestibular Stimulation: UChicago Lab to Present First-of-Its-Kind Work at UIST 2025

Oct 13, 2025
UChicago CS News

UChicago CS Researchers Expand the Boundaries of Interface Technology at UIST 2025

Sep 26, 2025
UChicago CS News

Could Robots Help Kids Conquer Reading Anxiety? New Study from the Department of Computer Science at UChicago Suggests So

Sep 10, 2025
UChicago CS News

Hands-On Vision: How a Wrist Camera Can Expand the World for All Users

May 23, 2025
In the News

More Control, Less Connection: How User Control Affects Robot Social Agency

May 16, 2025
UChicago CS News

Innovation at the Forefront: UChicago CS Researchers Make Significant Contributions to CHI 2025

Apr 23, 2025
UChicago CS News

Unveiling Attention Receipts: Tangible Reflections on Digital Consumption

May 15, 2024
UChicago CS News

University of Chicago Computer Science Researchers To Present Ten Papers at CHI 2024

May 06, 2024
UChicago CS News

Five UChicago CS students named to Siebel Scholars Class of 2024

Oct 02, 2023
UChicago CS News

UChicago Computer Scientists Design Small Backpack That Mimics Big Sensations

Sep 11, 2023