Implementation of a novel music-face pairing task to increase facial affect recognition in the broad autism phenotype

Date
2019-03-26
Authors
Smith, Scott
Abstract
Individuals with Autism Spectrum Disorder (ASD) often have difficulty recognizing emotions in facial expressions (Wallace, Coleman, & Bailey, 2008). Facial expression processing impairments are also seen in individuals with traits that fall within the broad autism phenotype (BAP; Kadak, Demirel, Yavuz, & Demir, 2014), a continuum of characteristics involving impaired reciprocal social skills, restrictive and repetitive behaviours, and atypical communication (Pickles et al., 2000). Deficits in facial affect recognition are known to contribute to reductions in social understanding and interaction in individuals with ASD (Wallace, Sebastian, Pellicano, Parr, & Bailey, 2010) and in those with lower BAP scores (Losh & Piven, 2007). In two separate studies, undergraduate psychology students completed a novel Music-Face Pairing Task (MFPT), in which specialized hierarchical patterns of chords and notes called modes (each designed to elicit a specific emotion) were presented alongside congruent or incongruent facial expressions. BAP characteristics (i.e., pragmatic language difficulties, aloofness, and rigidity) were measured using the Broad Autism Phenotype Questionnaire (BAPQ; Hurley, Losh, Parlier, Reznick, & Piven, 2007). Previous research demonstrates that individuals with ASD perceive music accurately and respond well to music-based interventions designed to improve emotional understanding (Katagiri, 2009; LaGasse, 2014; Sharda et al., 2018). As such, the MFPT was expected to help individuals, particularly those with high BAPQ Total and subscale scores, perceive facial expressions faster and more accurately than before training. In both studies, all participants were more accurate and faster at recognizing negative emotions (i.e., fear, anger, sadness, and disgust) after completing the MFPT. In Study 1, participants with pragmatic language difficulties improved less on the expression recognition task after training than participants without such difficulties. In Study 2, participants with higher aloof scores showed greater improvement in expression recognition after training than those with lower aloof scores. Overall, these studies demonstrate the effectiveness of repeated music-face pairing trials for enhancing facial affect processing in those with characteristics of the BAP. These results suggest that individuals with specific characteristics of the BAP may respond differently to music-based training designed to improve facial affect recognition. Future research on facial affect recognition interventions should consider differences among subcomponents of the BAP in addition to overall autistic characteristics.
Keywords
Autism Spectrum Disorder, Music, Broad Autism Phenotype, Facial Affect Recognition, Emotion Recognition