Dual-Function Integrated Emotion-Based Music Classification System Using Features From Physiological Signals



Abstract:

In this paper, we propose an emotion-based music classification system using features from physiological signals. The proposed system integrates two functions: the first uses physiological sensors to recognize the emotions of users listening to music, and the second classifies music according to the feelings it evokes in listeners, without using physiological sensors. To predict a user’s emotions directly from data acquired through wearable physiological sensors, we developed and implemented a hierarchical inner-attention-based deep neural network. To spare users the discomfort of wearing physiological sensors every time they receive content recommendations, a regression neural network learns the relation between emotion-specific features extracted from previously recorded physiological signals and musical features extracted from the music itself. Based on these models, the proposed system automatically classifies input music according to users’ emotional reactions without measuring physiological signals. The experimental results not only demonstrate the accuracy of the proposed automatic music classification framework but also offer a new perspective in which human, experience-based characteristics related to emotion are applied to artificial-intelligence-based content classification.
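As a reading aid, the following minimal PyTorch sketch shows one plausible shape for the hierarchical inner-attention model described above: a lower attention layer summarizes each physiological channel over its time windows, and an upper attention layer fuses the per-channel summaries before an emotion classifier. The class names, layer sizes, channel count, and number of emotion classes are all illustrative assumptions, not the configuration published in the paper.

```python
# Hedged sketch of a hierarchical inner-attention emotion classifier.
# All dimensions below (4 channels, 20 windows, 32 features, 4 classes)
# are assumptions for illustration only.
import torch
import torch.nn as nn

class InnerAttention(nn.Module):
    """Attention pooling: a sequence of embeddings -> one summary vector."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1))

    def forward(self, x):                        # x: (batch, seq, dim)
        w = torch.softmax(self.score(x), dim=1)  # attention weights over seq
        return (w * x).sum(dim=1)                # weighted sum -> (batch, dim)

class HierarchicalEmotionNet(nn.Module):
    """Lower attention pools each channel over time; upper attention fuses
    the channel summaries; a linear head outputs emotion-class logits."""
    def __init__(self, win_feats=32, dim=64, n_classes=4):
        super().__init__()
        self.embed = nn.Linear(win_feats, dim)  # per-window feature embedding
        self.time_attn = InnerAttention(dim)    # within-channel, over windows
        self.chan_attn = InnerAttention(dim)    # across channels
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x):                       # x: (batch, channels, windows, feats)
        b, c, t, _ = x.shape
        h = torch.tanh(self.embed(x))             # (b, c, t, dim)
        h = self.time_attn(h.view(b * c, t, -1))  # pool over time per channel
        h = self.chan_attn(h.view(b, c, -1))      # pool over channels
        return self.head(h)                       # (b, n_classes)

# Example: 8 recordings, 4 sensor channels, 20 windows of 32 features each.
logits = HierarchicalEmotionNet()(torch.randn(8, 4, 20, 32))
print(logits.shape)  # torch.Size([8, 4])
```

The sensor-free path can be sketched just as schematically: a small regression network maps musical features to the emotion-specific feature vector that the sensor pipeline would have produced, trained with a mean-squared-error loss so that, at inference time, new music can be classified without any wearable sensors. The feature dimensions here are again assumptions.

```python
# Hedged sketch of the music-to-emotion-feature regression network.
import torch
import torch.nn as nn

music_to_emotion = nn.Sequential(
    nn.Linear(40, 128), nn.ReLU(),  # 40 assumed musical features per track
    nn.Linear(128, 64),             # 64-d assumed emotion-feature target
)
# Trained with nn.MSELoss() against emotion-specific features extracted from
# previously recorded physiological signals.
emotion_features = music_to_emotion(torch.randn(1, 40))  # shape: (1, 64)
```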
Published in: IEEE Transactions on Consumer Electronics ( Volume: 67, Issue: 4, November 2021)
Page(s): 341 - 349
Date of Publication: 15 October 2021


I. Introduction

Music is an artistic medium that brings joy to people and expresses human thoughts and feelings through sound. With the development of the Internet and the growth of the digital music market, the need has emerged for music search and recommendation systems [1] that provide easy, fast access to the many kinds of music in large music datasets. In conventional music search and recommendation systems, music is automatically classified based on genre [2], [3], associated emotion [4], [5], artist [6], lyrics [7], album [8], emotion displayed in videos [9], user profiling [10]–[12], content-based features [13]–[17], and users’ social media interactions [18], and appropriate search and recommendation results are provided to users. Recent music recommendation models use variations of a hybrid system [19], [20] that combines collaborative filtering [21], content-based filtering [22], context-based filtering [23]–[26], and metadata-based models [27], along with several other parameters. However, most music search and recommendation systems have been developed from a system-centric rather than a user-centric perspective; moreover, although emotion is an important factor in music selection, studies on the emotions and expressions of music listeners remain scarce.
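For readers unfamiliar with the hybrid systems cited above, the toy sketch below shows the basic scoring idea: blend collaborative, content-based, and context-based scores with fixed weights. The scorer functions and weights are placeholders invented for illustration, not the design of any cited system.

```python
# Hedged sketch of hybrid recommendation scoring: a weighted blend of
# component scores. All scorers and weights below are stand-ins.
def hybrid_score(user, track, context, scorers, weights):
    """scorers: dict of name -> f(user, track, context), each returning [0, 1]."""
    return sum(weights[name] * fn(user, track, context) for name, fn in scorers.items())

scorers = {
    "collaborative": lambda u, t, c: 0.8,  # similarity to like-minded users
    "content":       lambda u, t, c: 0.6,  # audio/metadata similarity
    "context":       lambda u, t, c: 0.4,  # time of day, activity, etc.
}
weights = {"collaborative": 0.5, "content": 0.3, "context": 0.2}
print(hybrid_score("u1", "track1", "evening", scorers, weights))  # ~0.66
```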

