
Perception of human emotion in movement focus of $2 million NSF grant

The College of Information Sciences and Technology (IST) will lead the three-year, multi-institution project

The Penn State-led research team will train AI technologies to identify patterns in the bodily movements that accompany emotions. Credit: Keira Heu-Jwyn Chang. All Rights Reserved.

UNIVERSITY PARK, Pa. — A Penn State-led research team received a $2 million National Science Foundation (NSF) grant to study how machines perceive and process human body language. The research could stimulate advancements in health care applications, including the design of caregiving robots.  

The grant is part of the NSF’s Computer and Information Science and Engineering Community Research Infrastructure program. Penn State researchers will collaborate with the University of Illinois at Chicago (UIC) and the Robotics, Automation and Dance (RAD) Lab in Philadelphia.

“Extensive research has been conducted on how the human face conveys emotion, but more research is needed to understand how the body communicates feelings,” said James Wang, distinguished professor of information sciences and technology at Penn State, who is leading the project. “Our work training artificial intelligence technologies to identify patterns in the bodily movements that accompany emotions may provide essential insights for shaping the future of human-machine interaction.”

Wang will work alongside Penn State collaborators Reginald Adams Jr., professor of psychology; Michelle Newman, professor of psychology; and Jia Li, professor of statistics.

The multidisciplinary team will begin the project by exploring the wealth of information about human movement that already exists in films and internet videos. Analyzing tens of thousands of clips culled from these sources will enable the researchers to compile a large dataset that supports a representative evaluation of how actors portray emotions in various contexts. For example, a bowed head with a sinking sense along the spine may correlate with an expression of sadness. The Penn State team pioneered the data collection process in 2019.

“The subtlety and complexity of human expressions, coupled with the challenges of privacy issues, make it next to impossible to conduct this research ‘in the wild,’ so to speak,” Wang said. “We acknowledge that films use actors and that social media presenters are often self-filtered, but we expect realistic patterns to emerge from the sheer volume of online content we are studying.”

The research team will show the collected video clips to a diverse group of participants and ask them to identify emotions based on the bodily movements they observe. People perceive interactions differently, according to Wang, and culture, age, gender and other variables may result in different interpretations of the same emotional display.

Experts in movement analysis will join the study participants to pinpoint the motor characteristics the participants observed on the screen. The identified characteristics will, in turn, drive algorithms that attempt to classify the emotion expressed by the human mover, work that could spur technical innovations in the computational modeling of bodily expression of emotion.
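As a rough illustration of that classification step, the minimal sketch below trains a classifier that maps hand-crafted movement descriptors to perceived-emotion labels. The feature names, label set and data are hypothetical placeholders, not the team's actual pipeline.

```python
# Minimal sketch: classify perceived emotion from movement descriptors.
# All feature names, labels and data below are hypothetical illustrations.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Each row stands in for one video clip, described by simple kinematic/postural
# features (e.g., head pitch, spinal flexion, average speed) drawn at random here.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                     # [head_pitch, spine_flexion, speed]
y = rng.choice(["sadness", "joy", "anger"], 500)  # perceived-emotion labels from annotators

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```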

Among the experts is co-investigator Amy LaViers, director of the RAD Lab, an interdisciplinary nonprofit that connects robotics and artificial intelligence with dance and movement studies. According to LaViers, the dance and movement studies community has unique expertise essential to understanding the phenomenon of human expression through bodily movement.

“We’ll design an annotation interface where human annotators can observe and describe an array of movement patterns using a system of movement analysis — Laban Movement Analysis (LMA) — as outlined in a forthcoming book,” LaViers said. LMA is a method and language for describing, visualizing, interpreting and documenting human movement.
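As a sketch of how a single record from such an annotation interface might be structured, the example below uses Effort and Shape, two real LMA categories, but the schema and field names are assumptions for illustration rather than the project's actual design.

```python
# Hypothetical structure for one movement-annotation record; not the project's schema.
from dataclasses import dataclass, field

@dataclass
class LMAAnnotation:
    clip_id: str                  # identifier of the video clip being annotated
    annotator_id: str             # which expert movement analyst produced this record
    effort: dict = field(default_factory=dict)  # e.g., {"weight": "strong", "time": "sudden"}
    shape: dict = field(default_factory=dict)   # e.g., {"shape_quality": "sinking"}
    perceived_emotion: str = ""   # emotion the annotator attributes to the movement

example = LMAAnnotation(
    clip_id="clip_00042",
    annotator_id="analyst_03",
    effort={"weight": "strong", "time": "sudden"},
    shape={"shape_quality": "sinking"},
    perceived_emotion="sadness",
)
print(example)
```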

“To ensure the data infrastructure’s compatibility with human-robot interaction research, we’ll also conduct a robotics feasibility study embedded in a theatrical performance with human actors, dancers and participants moving alongside mobile robots.”

Rachelle Tsachor and Tal Shafir, collaborators at UIC, will use the annotation interface developed by RAD to study emotion. This team’s step-by-step methodologies for empirically connecting LMA features to emotion expression — and the challenges they met along the way — inspired the original design of the grant.

“Our work since 2010 has developed methodologies that have yielded breakthroughs in identifying LMA-described features of bodily expression of emotion,” Tsachor said. “This prior work opens exciting new possibilities for data-intensive research on nonverbal, bodily cues and how they relate to emotion.”

For this project, Tsachor and Shafir will develop and co-teach modules to be used by the expert movement analysts who are observing the video clips. The analysts will document the structured descriptions of human movements using LMA.

“Data-driven modeling of human bodily expression of emotion could lead to the creation of crucial health care applications,” Wang said, “which could include caregiving robots, diagnostic tools for mental health, socially aware machinery, safety monitoring systems and consumer electronics, among other innovations.”

Last Updated February 1, 2024