Automatic classification of music based on the correlation between mood, linguistic and audio features
Published in: Bangladesh Journal of Scientific and Industrial Research, 2016-03, Vol. 51(1), p. 55-60
Main Authors:
Format: Article
Language: English
Summary: The rise of music in recent times has been remarkable. Some people consider music an integral part of their daily lives, while others even regard it as a divine inspiration that sets their mood for the rest of the day. For such listeners, a well-trimmed, precise playlist of songs they would love to hear, selected by genre or mood, is priceless. The genre of an individual song is readily available, as that information is usually embedded in the song itself, but judging the mood of a song is a far greater challenge. If this is a challenge for a single song, one can easily imagine the hassle of selecting a playlist from a huge music library. This gives rise to the importance of classifying music by the mood of individual songs. This paper establishes such a method, which combines linguistic and audio features of a song to classify it according to the mood it represents or is appropriate for. These features are then used in conjunction with several metrics to determine their relevance and relationships, and are measured for validation purposes.

Bangladesh J. Sci. Ind. Res. 51(1), 55-60, 2016
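The abstract does not specify the classifier or the exact features used, so the following is only an illustrative sketch of the general idea: represent each song by a combined vector of linguistic features (e.g., ratios of positive and negative lyric words) and audio features (e.g., normalized tempo and energy), then assign the mood whose centroid lies closest. All feature names, centroid values, and the nearest-centroid rule are assumptions for illustration, not the paper's actual method.

```python
import math

# Hypothetical combined feature vector per song:
# [positive_word_ratio, negative_word_ratio, tempo_bpm / 200, energy]
# The first two are linguistic (lyric) features; the last two are audio features.
# These mood centroids are invented for illustration only.
MOOD_CENTROIDS = {
    "happy": [0.8, 0.1, 0.7, 0.8],
    "sad":   [0.1, 0.7, 0.3, 0.2],
    "calm":  [0.4, 0.2, 0.3, 0.3],
}

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_mood(features):
    """Return the mood label whose centroid is nearest to the combined vector."""
    return min(MOOD_CENTROIDS, key=lambda m: euclidean(features, MOOD_CENTROIDS[m]))

# Example: upbeat lyrics, fast tempo, high energy.
song = [0.75, 0.15, 0.65, 0.9]
print(classify_mood(song))  # → happy
```

In practice, the linguistic features would come from lyric analysis and the audio features from signal processing; here both are supplied as ready-made numbers to keep the sketch self-contained.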
ISSN: 0304-9809, 2224-7157
DOI: 10.3329/bjsir.v51i1.27063