EMIR: A novel emotion-based music retrieval system
Zhou, Lijuan Marissa and Lin, Hongfei and Gurrin, Cathal (2012) EMIR: A novel emotion-based music retrieval system. In: The 18th International Conference on Multimedia Modeling, 4-6 Jan 2012, Klagenfurt, Austria.
Music is inherently expressive of emotion and affects people's moods. In this paper, we present EMIR (Emotional Music Information Retrieval), a novel system that uses latent emotion elements in both music and non-descriptive queries (NDQs) to detect implicit emotional associations between users and music and thereby enhance Music Information Retrieval (MIR). We infer the latent emotional intent of queries via machine-learning emotion classification and compare the performance of emotion-detection approaches on different feature sets. To this end, we extract music emotion features from lyrics and from social tags crawled from the Internet, label a subset for training, model them in a high-dimensional emotion space, and recognize users' latent emotions through query emotion analysis. The similarity between queries and music is computed with the BM25 model.
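The query-music similarity step uses BM25. As a minimal sketch of Okapi BM25 scoring over tokenized lyric documents, assuming the standard parameters k1 and b and a toy emotion-word query (the paper's exact variant, tokenization, and parameter settings are not specified in the abstract):

```python
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.2, b=0.75):
    """Score each tokenized document against the query with Okapi BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # Document frequency of each distinct query term.
    df = {t: sum(1 for d in docs if t in d) for t in set(query_terms)}
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query_terms:
            # Smoothed IDF, as in the common Okapi formulation.
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1)
            # Term-frequency saturation with document-length normalization.
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

# Hypothetical example: three toy "lyrics" documents and an emotion-word query.
docs = [
    ["sad", "rain", "tears", "lonely"],
    ["happy", "sun", "dance", "joy"],
    ["sad", "lonely", "night"],
]
print(bm25_scores(["sad", "lonely"], docs))
```

Documents containing the query's emotion terms score higher; a document sharing no terms with the query scores zero, so ranking by these scores surfaces emotionally matching music first.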