Profiling facial expressions of emotion: insights from their emotional, semantic, and contextual similarities

Manno, Laura (2023) Profiling facial expressions of emotion: insights from their emotional, semantic, and contextual similarities. Doctoral thesis, University of East Anglia.


Abstract

Facial expressions of emotion serve as a fundamental form of nonverbal communication, offering a rich tapestry of information about individuals' emotions, intentions, and thoughts. While classical theories of emotion perception have laid a solid foundation for understanding facial expressions, they may oversimplify what we perceive from facial expressions of emotion by reducing it to a few basic emotions or two emotion dimensions, potentially missing the richness and complexity of real-life emotional experience. In this dissertation, I explore a more sensitive approach to the study of facial emotion processing by conducting six behavioural and cross-cultural studies employing multi-dimensional profiling tasks and Representational Similarity Analysis (RSA). This allowed me to (1) investigate whether people extract complex, high-dimensional emotional content from facial expressions of emotion; (2) determine whether high-dimensional representations of facial emotions outperform classical categorical emotion models in predicting perceptual similarities between facial emotions; (3) uncover what stimulus- and observer-based factors underlie perceptual similarity between facial emotions; (4) examine the impact of participants' cultural background, emotion intensity, facial motion, and context on the profiling and perception of facial emotions; and finally, (5) explore how human facial emotion processing may differ from machine learning approaches to emotion perception.

Throughout the six studies reported here, participants engaged in a series of profiling tasks in which they reported their perception of facial emotions along multiple emotional, semantic, and contextual dimensions, generating unique profiles for each facial emotion under different conditions. Response profiles were compared across cultural backgrounds (Chinese vs. British participants), facial motion (Static vs. Dynamic), emotion intensity (High vs. Low), and emotional contexts (Physical vs. Social scenarios; Congruent vs. Incongruent). Participants also performed a direct similarity rating task to produce a measure of perceptual similarity between facial emotions. Finally, I obtained further measures of similarity based on different sources of information (i.e., Physical, Categorical, Profiling, and Intensity similarity) and performed RSA and multiple regression analyses to identify the underlying factors that contribute to the perceptual similarity of facial emotions.
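For readers unfamiliar with the method, the following is a minimal, illustrative sketch of how RSA combined with multiple regression can relate a perceptual-similarity matrix to candidate predictor matrices. The data, variable names, and choice of distance metric here are hypothetical placeholders, not the thesis's actual analysis pipeline.

# Illustrative RSA sketch with hypothetical data (not the thesis's actual pipeline).
# Each information source (e.g., profiling, physical) and the perceptual similarity
# judgments are expressed as representational dissimilarity matrices (RDMs); their
# lower-triangle vectors are compared via rank correlation and multiple regression.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_expressions = 6          # e.g., six facial expressions of emotion
n_dims = 20                # e.g., number of rating dimensions in a profiling task

# Hypothetical feature matrices: one row per facial expression.
profiling_features = rng.random((n_expressions, n_dims))
physical_features = rng.random((n_expressions, 50))
# Hypothetical perceptual dissimilarities from a direct similarity rating task.
perceptual_rdm_vec = rng.random(n_expressions * (n_expressions - 1) // 2)

def rdm_vector(features):
    """Lower-triangle vector of a correlation-distance RDM built from a feature matrix."""
    return pdist(features, metric="correlation")

predictors = np.column_stack([
    rdm_vector(profiling_features),
    rdm_vector(physical_features),
])

# Zero-order RSA: rank correlation between each candidate RDM and the perceptual RDM.
for name, pred in zip(["profiling", "physical"], predictors.T):
    rho, p = spearmanr(pred, perceptual_rdm_vec)
    print(f"{name} RDM vs. perceptual RDM: rho = {rho:.2f}, p = {p:.3f}")

# Multiple regression: estimate each predictor RDM's unique contribution.
X = np.column_stack([np.ones(len(perceptual_rdm_vec)), predictors])
betas, *_ = np.linalg.lstsq(X, perceptual_rdm_vec, rcond=None)
print("regression weights (intercept, profiling, physical):", np.round(betas, 2))

In this style of analysis, a larger regression weight for one predictor RDM (here, the profiling RDM) relative to another would indicate that that source of information accounts for more unique variance in perceptual similarity.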

The results showed that (1) facial emotion perception is complex and multi-dimensional, integrating rich emotional content, fine-grained semantics, and relevant contextual information; (2) multi-dimensional emotion profiles outperformed traditional categorical emotion models in predicting perceptual similarities between facial expressions; (3) perceptual similarity is influenced both by physical, stimulus-based cues and by high-level, perceiver-based emotion perception; (4) participants' cultural background, emotion intensity, facial motion, and emotional context significantly impact facial emotion perception; and finally, (5) machine learning models, while achieving human-level emotion categorization, may not capture the richness and complexity of human emotional experience as reflected in emotion profiles.

These findings underscore the importance of recognizing the complex nature of human emotional experience and the effectiveness of an emotion profiling paradigm in revealing the rich and diverse information perceived from natural facial expressions of emotion. Theoretically, the present results challenge the prevailing view that emotion perception is universal, discrete, and best described by a single semantic label. Instead, they provide further support for the emerging view that the perception of emotion is multi-dimensional, blended, and varies in a graded manner. Methodologically, the profiling paradigm used in this project not only reproduces many classical findings in emotion research (e.g., differences across cultures, facial motion, and emotional context), but also uncovers novel and fine-grained differences in the emotional, semantic, and contextual information conveyed by facial expressions of emotion. Practically, theories and models of facial emotion perception play a pivotal role in various aspects of daily life, from machine learning based face processing to therapeutic interventions. Incorporating a more holistic, multi-dimensional perspective on emotion perception may help these practical settings design better and more sensitive tools, techniques, and interventions that capture the complex nature of human emotional experience conveyed by facial expressions of emotion.

Item Type: Thesis (Doctoral)
Faculty \ School: Faculty of Social Sciences > School of Psychology
Depositing User: Nicola Veasy
Date Deposited: 15 Jul 2024 08:13
Last Modified: 15 Jul 2024 08:13
URI: https://ueaeprints.uea.ac.uk/id/eprint/95912
