Project Reference: DMT137

Student’s Name: Patrick-Luc Murphy

Project Title: A system which interprets facial expression to affect a generative musical piece

Course Title: Digital Media Technology

Supervisor’s Name: Matthew Cheshire

To develop an interactive system which gives users control to influence a musical piece through their facial expressions and the emotions they reflect.

This project involved the development of a system that recognises emotions from facial expressions and uses them to shift the musical quality of a generative piece. It aims to make the interaction between music and emotion accessible, giving users control over an ambient-electronic piece through facial recognition.

The primary research asks participants to produce facial expressions in response to semantic prompts representing different musical intervals. The findings draw associations between facial expressive movements and their emotional correlates, which are used to drive musical shifts in a system developed with two pre-existing applications (FaceOSC and MaxMSP). A testing phase lets users interact with the system and report on the effectiveness of its emotional portrayal and musical connectivity.
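For illustration, the sketch below shows one way a pipeline like this could receive FaceOSC's gesture messages over OSC and rescale one of them into a musical control parameter. The project itself performs this routing inside MaxMSP (typically via a udpreceive object), so the Python receiver, the "brightness" parameter name, and the calibration range here are illustrative assumptions, not the project's implementation.

```python
# Illustrative sketch only: listens for FaceOSC gesture messages and
# maps mouth width onto a hypothetical "brightness" control that a
# generative patch could consume. The project routes these messages
# into MaxMSP instead; this mapping is an assumed example.
from pythonosc import dispatcher, osc_server

FACEOSC_PORT = 8338  # FaceOSC's default OSC output port


def on_mouth_width(address, width):
    # FaceOSC reports mouth width as a float; rescale to 0..1.
    # The 10..16 input range is an assumed calibration, not a spec.
    brightness = min(max((width - 10.0) / 6.0, 0.0), 1.0)
    print(f"{address}: {width:.2f} -> brightness {brightness:.2f}")


disp = dispatcher.Dispatcher()
disp.map("/gesture/mouth/width", on_mouth_width)

server = osc_server.ThreadingOSCUDPServer(("127.0.0.1", FACEOSC_PORT), disp)
print(f"Listening for FaceOSC on port {FACEOSC_PORT}...")
server.serve_forever()
```

In practice, each expressive feature FaceOSC exposes (eyebrows, mouth, jaw, and so on) would be mapped to a different musical dimension, so that a recognised emotional state shifts several parameters of the generative piece at once.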

The musical landscape is found to alter successfully in response to changes in facial expressivity, with high rates of correct emotional recognition. However, the emotional depth of the final piece is found to be lacking in musicality, which reduces users' sense of control over the piece. Participants familiar with the genre of the piece they interact with report feeling less control in altering it and recognise its emotional content less readily. Overall, the project is found to be significantly successful in meeting its aims.

Playing music can support cognitive development, and listening to it provides physiological rewards. Despite this, the field of musical instruments for people with disabilities is still an emerging area, while music therapy continues to be an effective approach to encouraging everyday communication and interactivity.

The benefits of these areas have been recognised by leading health organisations through apps such as Cove, developed by Humane Engineering, which primarily uses music rather than words to express a feeling or mood and is ‘designed to enable everyone to express and capture a mood or emotion’.

This project hopes to break down the walls that leave users disheartened or overwhelmed when confronted with the creative barriers that making and shaping a piece of music can present. Furthermore, it hopes to expand what is possible by making these interactions easier and more accessible, allowing users to explore their own emotions and find greater mindfulness through the nature of their expressivity, or, since they are given control, to make the interaction less mindful if they so choose.

The project will open up a user's interactive possibilities by offering control over a generative musical piece that is accessible to anyone able to portray their emotions through facial expression, with or without even the most basic musical training. This is made achievable through the aid of existing technologies and the development of a novel system to integrate them.

The findings from this project will help identify the successes and failures that arise when attempting to develop a system that generates music to fully reflect one's true emotions. They will also offer a chance to reflect on the relationships possible with interactive musical landscapes in future work.