Multimodal & Cross-Device Emotion Understanding with Privacy-Preservation for Affective User Experience-Based Applications

Authors

  • Barath Raj Kandur Raja

Abstract

Emotion identification is a complex research area that can enable unique multi-device experiences. Smartphones, the dominant mode of communication, can aid in emotion prediction. However, there is a lack of datasets with precise ground-truth labels based on user smartphone behavior, owing to the challenges of dataset annotation. Present annotation techniques rely either on self-reporting or on recording within desktop applications, which is less natural. In this research, these issues are addressed by devising a user-centric approach to collect and annotate user data non-intrusively on smartphones. Insights are derived from the annotated data comprising behavior, emotion, and personality. The data consists of categorical features that include no personally identifiable information, thus preserving user privacy. The annotated data is validated by an emotion prediction model using a Random Forest classifier, achieving an accuracy of 67.73%. Further, an accuracy of 77.95% is achieved on sentiment prediction (positive, negative, and neutral) using a Support Vector Machine (SVM) classifier.
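The paper's dataset and feature schema are not public, so the modeling pipeline it describes can only be illustrated. A minimal sketch, assuming scikit-learn, synthetic categorical behavior features, and placeholder label sets (the actual features, emotion classes, and hyperparameters are not specified in the abstract):

```python
# Hypothetical sketch of the described validation setup: categorical,
# non-PII features feeding a Random Forest (emotion) and an SVM (sentiment).
# All data here is synthetic; the paper's real dataset is not available.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 500
# Six categorical behavior features, encoded as strings (no PII).
X = rng.integers(0, 4, size=(n, 6)).astype(str)
emotions = rng.choice(["happy", "sad", "angry", "neutral"], size=n)
sentiments = rng.choice(["positive", "negative", "neutral"], size=n)

# Emotion prediction (the paper reports 67.73% on its real dataset).
X_tr, X_te, y_tr, y_te = train_test_split(X, emotions, random_state=0)
rf = make_pipeline(
    OneHotEncoder(handle_unknown="ignore"),
    RandomForestClassifier(n_estimators=200, random_state=0),
)
rf.fit(X_tr, y_tr)
print("emotion accuracy:", rf.score(X_te, y_te))

# Sentiment prediction (the paper reports 77.95% on its real dataset).
X_tr2, X_te2, s_tr, s_te = train_test_split(X, sentiments, random_state=0)
svm = make_pipeline(OneHotEncoder(handle_unknown="ignore"), SVC(kernel="rbf"))
svm.fit(X_tr2, s_tr)
print("sentiment accuracy:", svm.score(X_te2, s_te))
```

On random labels the scores hover near chance; the point of the sketch is only the shape of the pipeline: one-hot encoding of categorical features followed by the two classifiers named in the abstract.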

Published

2024-06-20

How to Cite

Kandur Raja, B. R. (2024). Multimodal & Cross-Device Emotion Understanding with Privacy-Preservation for Affective User Experience-Based Applications. Global Journal of Business and Integral Security. Retrieved from https://gbis.ch/index.php/gbis/article/view/400