Emotion-Based Music Recommendation System

Problem Statement:

Emotional well-being is essential for mental health, and music can profoundly influence and regulate emotions. However, finding music that matches or enhances one's mood can be a cumbersome experience. This project addressed the challenge of creating a seamless, user-centered solution that uses emotion detection through facial recognition to recommend personalized music playlists. The goal was to design a system prioritizing emotional needs while ensuring usability and an intuitive experience.

Project Description:

This project combined emotion detection technology with UX design principles to create a tool that aligns music recommendations with the user's emotional state in real time. By integrating facial recognition and music therapy insights, the system not only supports mood regulation but also offers a frictionless user experience. The outcome was a user-friendly tool that empowers emotional well-being through personalized music recommendations. The work was published by IEEE, highlighting its contribution to the field of emotion detection and music therapy.

Role and Responsibilities:

As the Lead UX Researcher and Developer, I was responsible for:

  • Conducting research on user needs, emotion detection, and music therapy to align the system with real-world challenges.

  • Designing a seamless user flow that integrates facial recognition and playlist recommendation.

  • Developing and testing the emotion detection algorithm, focusing on accuracy and user feedback.

  • Curating music playlists tailored to the user’s emotional states, ensuring relevance and impact.

  • Implementing user testing sessions to refine the tool for usability and accessibility.

Process and Methodologies:

1. User Research and Ideation:

  • Conducted user interviews to understand pain points in finding mood-specific music.

  • Mapped user journeys, focusing on emotional touchpoints, from initial interaction to playlist recommendation.

  • Defined the primary use case: seamlessly aligning music suggestions with the user’s current emotional needs.

2. Data Collection and Preprocessing:

  • Inclusive Data Collection: Gathered facial-expression data spanning diverse demographics and emotional states so the system would work for a wide range of users.

  • Real-World Optimization: Cleaned and refined the data so emotion detection remained accurate in everyday, uncontrolled scenarios.
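One practical check behind the inclusive-collection goal is verifying that no emotion label dominates or vanishes from the dataset. The labels, batch, and under-representation threshold below are illustrative assumptions, not the project's actual dataset composition:

```python
from collections import Counter


def label_balance(labels):
    """Return each emotion label's share of the dataset as a fraction."""
    counts = Counter(labels)
    total = len(labels)
    return {label: n / total for label, n in counts.items()}


# Illustrative labels for a small collected batch (hypothetical data).
batch = ["happy", "sad", "happy", "calm", "angry", "happy", "sad", "calm"]
shares = label_balance(batch)

# Flag labels that are under-represented; the 15% threshold is an assumption.
under = [label for label, share in shares.items() if share < 0.15]
print(shares)
print(under)  # ['angry']
```

A check like this would flag which expressions need further collection before training, supporting the inclusivity goal described above.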

3. Emotion Detection Algorithm and UX Integration:

  • Developed the facial recognition algorithm using machine learning to detect emotional states.

  • Integrated emotion detection into a streamlined user interface that provides real-time feedback and music recommendations.

  • Focused on designing an accessible user interface, ensuring smooth transitions from emotion detection to playlist playback.
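The detection-to-recommendation flow above can be sketched as a simple pipeline: a classifier emits per-emotion confidence scores for a face image, and the dominant emotion drives what the interface shows next. The emotion labels and score values below are hypothetical stand-ins for the published model's output, not the actual implementation:

```python
from typing import Dict

# Hypothetical label set; the published system's labels are not listed here.
EMOTIONS = ("happy", "sad", "calm", "angry", "surprised")


def dominant_emotion(scores: Dict[str, float]) -> str:
    """Return the emotion with the highest classifier confidence.

    `scores` stands in for the output of a facial-expression classifier:
    one confidence value per emotion label.
    """
    if not scores:
        raise ValueError("classifier returned no scores")
    return max(scores, key=scores.get)


# Example: scores a classifier might emit for one video frame.
frame_scores = {"happy": 0.72, "sad": 0.05, "calm": 0.15,
                "angry": 0.03, "surprised": 0.05}
print(dominant_emotion(frame_scores))  # happy
```

Keeping this selection step separate from the UI made it easier, in a design like this, to run detection in the background while the interface only surfaces the final result.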

4. Playlist Curation and Personalization:

  • Conducted research on music therapy principles to match playlists to users' emotional states.

  • Created empathy-driven content categories, such as uplifting, calming, or reflective playlists, based on emotional needs.

  • Incorporated user input to allow customization, ensuring relevance and engagement.
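The curation step can be illustrated as a minimal mapping from detected emotion to a playlist category, with a user override to support the customization mentioned above. The category names and the mapping itself are hypothetical examples; the real system curated playlists from music-therapy research rather than a fixed table:

```python
from typing import Optional

# Hypothetical emotion-to-category mapping for illustration only.
PLAYLIST_CATEGORIES = {
    "happy": "uplifting",
    "sad": "calming",
    "calm": "reflective",
    "angry": "calming",
    "surprised": "uplifting",
}


def recommend_category(emotion: str, user_choice: Optional[str] = None) -> str:
    """Pick a playlist category for the detected emotion.

    A user-supplied choice overrides the default mapping, mirroring the
    customization controls described in this case study; unknown emotions
    fall back to a neutral "reflective" category.
    """
    if user_choice:
        return user_choice
    return PLAYLIST_CATEGORIES.get(emotion, "reflective")


print(recommend_category("sad"))               # calming
print(recommend_category("sad", "uplifting"))  # uplifting
```

Letting the user's explicit choice win over the detected emotion is one way to keep users feeling in control, as the results section below notes.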

5. System Implementation and Testing:

  • Conducted usability testing to evaluate the system’s intuitiveness, accessibility, and emotional impact.

  • Iterated on the design based on user feedback, focusing on reducing cognitive load and improving playlist personalization.

Challenges and Solutions:

  • Challenge: Seamless Emotion Detection Integration

    Solution: Simplified the interaction by automating emotion detection in the background while offering clear visual and auditory cues.

  • Challenge: Personalized Music Experiences

    Solution: Leveraged UX research insights and iterative design to build playlists that resonated with diverse emotional states.

Key UX Learnings and Skills:

  • Human-Centered Design: Learned to prioritize user needs in an emotion-driven tool by integrating empathy and accessibility.

  • Cognitive Load Management: Gained expertise in designing interfaces that balance advanced technology with simplicity and ease of use.

  • Behavioral Insights: Acquired a deeper understanding of the intersection between emotional triggers and music therapy through user testing and research.

  • Iterative Design: Refined the system through multiple rounds of testing and feedback to create a polished, user-focused product.

Results and Impact:

    1. Enhanced Emotional Well-being

  • Users reported that the tool successfully matched or enhanced their mood, helping regulate emotions through personalized playlists.

  • The system provided an intuitive and empathetic experience, making it easy for users to engage with their emotional needs.

    2. Ethical and Inclusive Design

  • The system supported inclusivity by accommodating diverse facial expressions and emotional responses in its dataset.

  • Introduced user controls to customize playlists, ensuring users felt empowered and understood.

    3. Contribution to Research and Industry

  • Published findings in IEEE, highlighting the intersection of UX design, emotion detection, and music therapy.

  • Demonstrated how advanced technologies like facial recognition can be applied ethically to support user well-being.
