GAMMA Group's Research on Emotional Modeling and Social Robotics Featured in Forbes


The ProxEmo robot, part of the University of Maryland GAMMA Lab's research into emotion detection using gait analysis. Photo credit: UMIACS GAMMA Lab.

Artificial intelligence (AI) is gaining capability in the areas of speech, pattern, and image recognition. Professor Dinesh Manocha (ECE/UMIACS/ISR/Robotics/CS) and Assistant Research Professor Aniket Bera (UMIACS/CS), along with their graduate students in ECE and CS, are combining methods from robotics, computer vision, the social sciences, and machine learning to develop real-time computational models that categorize human emotional behaviors, and to verify those models' performance.

The research group has developed "socially intelligent robots" that can perceive some characteristics of human emotion, and the work has been highlighted in Forbes magazine. Most current robotic applications focus primarily on completing tasks quickly and efficiently; adding human emotional and social awareness helps people feel more at ease with robots in close quarters, such as homes, and in public applications like surveillance, delivery, and warehousing. Moreover, the current global pandemic is prompting hospitals and healthcare facilities to introduce growing numbers of autonomous robotic systems into their operations.

Socially intelligent robots can perceive some emotional and behavioral cues; however, facial expressions can be unreliable when facial data is only partially available or facial cues are difficult to obtain. To address this, the researchers, who are part of the Geometric Algorithms for Modeling, Motion and Animation (GAMMA) group, are developing AI systems that detect emotion based on gait. The lab combines facial expressions with body motion to improve predictions of humans' emotional states, and built the ProxEmo robot to demonstrate gait-based emotion detection and evaluate its performance in the laboratory. Further uses of socially intelligent systems include therapy, rehabilitation, anomaly detection and surveillance, character generation for animation and movies, and more. They can also help assess mental health signals, enforce social distancing in public, or support evacuation plans that help the public escape dangerous situations.
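To make the idea of gait-based emotion detection concrete, here is a minimal sketch. It is not the GAMMA group's actual method (ProxEmo uses deep networks over 3D skeletal pose sequences); the features, values, and nearest-centroid rule below are illustrative assumptions only.

```python
# Hedged sketch (NOT the GAMMA/ProxEmo model): classify a perceived
# emotion from simple hand-crafted gait features using a toy
# nearest-centroid rule.
import math

# Hypothetical per-walk gait features: (stride_length_m, speed_m_s,
# head_tilt_deg). The centroid values are illustrative, not measured data.
EMOTION_CENTROIDS = {
    "happy":   (0.80, 1.40, 5.0),    # longer, faster strides, head up
    "sad":     (0.55, 0.90, -15.0),  # slower gait, head down
    "angry":   (0.85, 1.60, -2.0),   # fast, forceful strides
    "neutral": (0.70, 1.20, 0.0),
}

def classify_gait(features):
    """Return the emotion whose centroid is nearest to the given features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EMOTION_CENTROIDS, key=lambda e: dist(features, EMOTION_CENTROIDS[e]))

# A slow, head-down walk is classified as "sad" under these toy centroids.
print(classify_gait((0.56, 0.92, -14.0)))  # → sad
```

A real system would replace the hand-picked centroids with a classifier learned from labeled pose sequences, and would fuse these gait cues with facial-expression features when a face is visible.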

There are some challenges with social robots, but according to Bera, the use of additional emotional cues from gait and body language might make social robots more useful and personal. 

View the article from Forbes here.


March 30, 2020

