Authorship

PAPER TITLE

Music Recommendation using Emotion Recognition

PUBLICATION

Institute of Electrical and Electronics Engineers (IEEE)

DATE

Conference: 16-17 October 2022

IEEE Xplore: 13 December 2022

MOTIVATION

During the COVID-19 pandemic, I became increasingly aware of how emotionally drained people felt — yet our devices remained constant companions. This led me to explore how UX design could serve and support users emotionally. Drawing from research in music therapy, I developed a system that uses real-time emotion recognition to curate playlists designed to elevate or stabilize mood.

Rather than passively recommending music, the interface responds to the user’s emotional state, creating a more empathetic, intentional experience. By aligning design with emotional context, the project aimed to improve everyday user experiences in a subtle but deeply personal way, helping people feel seen, supported, and understood through design.

Why IEEE?

I chose to publish through IEEE because of its rigorous peer-review process and global reputation for advancing high-impact research in technology, UX, and human-computer interaction.

  • Understand the problem, identify symptoms, research fundamentals

  • Playlists were informed by music therapy research, ensuring each recommendation aligned with therapeutic principles to support emotional well-being.

  • Design the first prototype and ensure basic functionality

  • Test the solution, note issues, iterate

  • Itemise and document the problem, the literature review, the gap we addressed, the design, the results, and the conclusion

  • Publish in IEEE

Procedure

Method

Using a trained emotion recognition model, we could detect emotions from a live video feed or a photo.

Once the emotion was identified, the user was directed to a curated playlist, built using principles from music therapy.

These playlists were carefully made to reflect and support the user’s mood — not just to entertain, but to acknowledge their feelings.

Throughout, we focused on keeping the interface simple and accessible, so the process felt intuitive from start to finish.
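The method’s decision flow can be sketched as a short routine: a trained CNN (not shown here) outputs a probability for each emotion class, the most likely class is selected, and the user is routed to the playlist curated for that emotion. The class order, probability values, and playlist names below are illustrative placeholders, not the system’s actual catalogue.

```python
# Emotion classes the model distinguishes (order is illustrative).
EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

# Hypothetical playlist mapping informed by music-therapy principles.
PLAYLISTS = {
    "happy": "Upbeat Momentum",
    "sad": "Gentle Reassurance",
    "angry": "Slow Release",
    "surprised": "Grounding Sounds",
    "neutral": "Everyday Calm",
}

def recommend(probabilities):
    """Pick the most probable emotion and return it with its playlist."""
    best = max(range(len(EMOTIONS)), key=lambda i: probabilities[i])
    emotion = EMOTIONS[best]
    return emotion, PLAYLISTS[emotion]

# Example: probabilities as a CNN softmax layer might produce them.
emotion, playlist = recommend([0.70, 0.05, 0.05, 0.10, 0.10])
print(emotion, "->", playlist)  # happy -> Upbeat Momentum
```

In the real system the probabilities come from the emotion recognition model applied to a face detected in the video feed or photo; the lookup step stays the same.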

Emotion Recognition: Happy

Emotion Recognition: Neutral

Outcome

The model successfully detected emotions like happiness, sadness, anger, and surprise with strong accuracy.

We compared it against a method called Support Vector Machine (SVM), a traditional algorithm used for classification tasks.

SVM tends to work well with smaller, simpler datasets but struggles with more complex patterns like facial expressions, especially in real-world conditions.

In contrast, we used a Convolutional Neural Network (CNN).

CNN is a type of deep learning model that’s especially good at understanding images by recognizing patterns and features at different levels.
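As a toy illustration of why CNNs suit images: a small filter slides over the image and responds strongly wherever a local pattern appears, here a vertical edge. Real models learn many such filters stacked in layers; the image, kernel, and plain-Python loop below are purely didactic.

```python
def convolve2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as CNNs compute it)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + a][j + b] * kernel[a][b]
                for a in range(kh) for b in range(kw)
            )
    return out

# 4x4 image with a sharp vertical edge between columns 1 and 2.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# Vertical-edge kernel: negative weights left, positive weights right.
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
response = convolve2d(image, kernel)  # strong positive values at the edge
```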

CNN provided higher accuracy and consistency, making it more reliable for detecting subtle emotional cues from facial expressions.
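The three metrics used in the comparison are all derived from a confusion matrix of true/false positives and negatives. A minimal sketch of how each is computed, with illustrative counts rather than the paper’s actual results:

```python
def metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity, and specificity from confusion counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)   # overall correctness
    sensitivity = tp / (tp + fn)                 # true positives found (recall)
    specificity = tn / (tn + fp)                 # true negatives found
    return accuracy, sensitivity, specificity

# Hypothetical counts for one emotion class treated as "positive".
acc, sens, spec = metrics(tp=45, tn=40, fp=5, fn=10)
```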

Comparison: CNN & SVM — figures comparing the two models on Accuracy, Specificity, and Sensitivity.

SCOPE

This project was a first step toward designing user experiences that are more emotionally aware. While currently limited to a few core emotions and basic playlist delivery, the system has room to grow — potentially integrating with smart devices, more complex emotional models, or even real-time interactions.

At its core, the project explored how UX design & technology can respond to emotion respectfully and meaningfully.