
Dynamic Playlist Recommendations based on Multi-Modal Mood Detection and Contextual Learning

Bibliographic Details
Main Authors: Sundaravadivel, P., Raj, J. Relin Francis, Kumar, S. Senthil, Krishnan, R. Santhana, Muthu, A. Essaki, Karthikeyan, M. Saravana
Format: Conference Proceeding
Language: English
Description
Summary: This paper presents a novel system for dynamic playlist recommendations by integrating multi-modal mood detection with contextual learning. The system utilizes three primary data sources: text-based sentiment analysis using BERT, facial expression recognition through ResNet-50, and physiological data analysis with LSTM networks. By processing user input from social media, capturing facial expressions, and analyzing physiological signals such as heart rate and skin conductance, the system effectively determines the user's current mood. Playlist recommendations are then generated through a hybrid approach combining collaborative and content-based filtering, augmented by contextual bandits to adapt dynamically to changing user preferences and time-of-day patterns. The proposed system provides a personalized music experience by continuously learning from user interactions and mood states. Comparative evaluations against existing recommendation systems demonstrate the system's efficacy in enhancing user satisfaction and engagement through accurate and context-aware music suggestions.
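
Note: the record contains only the abstract, not the implementation. As a rough illustration of the contextual-bandit step described above, the sketch below shows a minimal LinUCB-style selector choosing among candidate playlists from a context vector built out of a fused mood estimate and a time-of-day encoding. The paper does not state which bandit variant or reward signal the authors use; all names and dimensions here (PlaylistBandit, n_playlists, the three-class mood vector, the accept/skip reward) are assumptions for illustration only.

# Minimal LinUCB-style contextual bandit for playlist selection (hypothetical sketch).
import numpy as np

class PlaylistBandit:
    def __init__(self, n_playlists, context_dim, alpha=1.0):
        # One ridge-regression model (A, b) per candidate playlist (arm).
        self.alpha = alpha
        self.A = [np.eye(context_dim) for _ in range(n_playlists)]
        self.b = [np.zeros(context_dim) for _ in range(n_playlists)]

    def select(self, context):
        # Upper-confidence-bound score per playlist for the given context.
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b
            ucb = theta @ context + self.alpha * np.sqrt(context @ A_inv @ context)
            scores.append(ucb)
        return int(np.argmax(scores))

    def update(self, playlist_idx, context, reward):
        # Reward could be an accept/skip or listen-duration signal (assumed).
        self.A[playlist_idx] += np.outer(context, context)
        self.b[playlist_idx] += reward * context

# Example context: fused mood probabilities (e.g. from BERT text sentiment,
# ResNet-50 facial expression, and LSTM physiological models) plus time of day.
mood = np.array([0.1, 0.7, 0.2])  # assumed fusion output, e.g. [sad, happy, calm]
time_of_day = np.array([np.sin(2 * np.pi * 18 / 24), np.cos(2 * np.pi * 18 / 24)])  # 18:00, cyclic encoding
context = np.concatenate([mood, time_of_day])

bandit = PlaylistBandit(n_playlists=5, context_dim=context.size)
choice = bandit.select(context)
bandit.update(choice, context, reward=1.0)  # user accepted the recommendation

In this kind of setup the bandit's per-playlist models are what lets the recommender keep adapting as the user's mood estimate and time-of-day context change, which is the "dynamic" behavior the abstract describes.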
ISSN: 2768-0673
DOI: 10.1109/I-SMAC61858.2024.10714756