Oversampling Approach Using Radius-SMOTE for Imbalance Electroencephalography Datasets

Retantyo Wardoyo, I Made Agus Wirawan, I Gede Angga Pradipta

Abstract


Several studies of emotion recognition based on Electroencephalogram (EEG) signals have addressed feature extraction, feature representation, and classification. However, recognition performance is strongly influenced by the class distribution, or balance, of the EEG data, and the limited amount of data that can be collected often leaves the resulting EEG datasets imbalanced, which in turn lowers recognition accuracy. To address this problem, this research proposes the Radius-SMOTE method to overcome the imbalance of the DEAP dataset in the emotion recognition process. Besides oversampling, two other processes are essential in EEG-based emotion recognition: feature extraction and emotion classification. This study uses the Differential Entropy (DE) method for EEG feature extraction and compares two classifiers, a Decision Tree and a Convolutional Neural Network (CNN). With oversampling by Radius-SMOTE, the Decision Tree recognized arousal and valence emotions with accuracies of 78.78% and 75.14%, respectively, while the CNN achieved 82.10% and 78.99%.
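
As a rough illustration of the pipeline the abstract describes, the following Python sketch extracts Differential Entropy features from band-filtered EEG trials, oversamples the minority class, and trains a Decision Tree. It is only a sketch under stated assumptions: the 128 Hz sampling rate matches the preprocessed DEAP release, randomly generated arrays stand in for real trials and labels, and standard SMOTE from imbalanced-learn stands in for Radius-SMOTE, which has no packaged implementation used here.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from imblearn.over_sampling import SMOTE            # stand-in for Radius-SMOTE
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

FS = 128                                             # DEAP preprocessed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def differential_entropy(signal, low, high, fs=FS, order=4):
    """DE of a band-filtered channel, assuming it is approximately Gaussian:
    DE = 0.5 * ln(2 * pi * e * sigma^2)."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, signal)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(filtered))

def extract_de_features(trials):
    """trials: (n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands)."""
    return np.array([
        [differential_entropy(ch, lo, hi) for ch in trial for lo, hi in BANDS.values()]
        for trial in trials
    ])

# Placeholder data: 120 trials of 32-channel EEG and deliberately imbalanced
# binary arousal labels; real DEAP trials and ratings would be loaded instead.
rng = np.random.default_rng(0)
trials = rng.standard_normal((120, 32, 8064))
labels = (rng.random(120) > 0.75).astype(int)

X = extract_de_features(trials)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, stratify=labels, random_state=42)

# Oversample only the training split so synthetic samples never reach the test set.
X_res, y_res = SMOTE(random_state=42).fit_resample(X_train, y_train)

clf = DecisionTreeClassifier(random_state=42).fit(X_res, y_res)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The accuracies reported in the abstract come from Radius-SMOTE combined with Decision Tree or CNN classifiers; the sketch only mirrors the overall flow (DE features, minority oversampling, classification) rather than the paper's exact method.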


DOI: 10.28991/ESJ-2022-06-02-013

Full Text: PDF


Keywords


Electroencephalogram; Radius-SMOTE; Emotion Recognition; Oversampling; Imbalance Data.

References


Subramanian, R., Wache, J., Abadi, M. K., Vieriu, R. L., Winkler, S., & Sebe, N. (2018). Ascertain: Emotion and personality recognition using commercial sensors. IEEE Transactions on Affective Computing, 9(2), 147–160. doi:10.1109/TAFFC.2016.2625250.

Setyohadi, D. B., Kusrohmaniah, S., Christian, E., Dewi, L. T., & Sukci, B. P. (2017). M-Learning interface design based on emotional aspect analysis. Lecture Notes in Computer Science, 10127, 276–287. doi:10.1007/978-3-319-52503-7_22.

Daher, W., Baya’a, N., & Anabousy, A. (2021). Emotions and self-efficacy as mediators of pre-service teachers’ adoption of digital tools. Emerging Science Journal, 5(5), 636–649. doi:10.28991/esj-2021-01301.

Seyeditabari, A., Tabari, N., & Zadrozny, W. (2018). Emotion Detection in Text: A Review. arXiv preprint arXiv:1806.00674. http://arxiv.org/abs/1806.00674.

Alswaidan, N., & Menai, M. E. B. (2020). Hybrid Feature Model for Emotion Recognition in Arabic Text. IEEE Access, 8, 37843–37854. doi:10.1109/ACCESS.2020.2975906.

Gunadi, I. G. A., Harjoko, A., Wardoyo, R., & Ramdhani, N. (2015). The extraction and the recognition of facial feature state to emotion recognition based on certainty factor. Journal of Theoretical and Applied Information Technology, 82(1), 113–121.

Ko, B. C. (2018). A brief review of facial emotion recognition based on visual information. Sensors (Switzerland), 18(2). doi:10.3390/s18020401.

Lamba, P. S., & Virmani, D. (2018). Information retrieval from emotions and eye blinks with help of sensor nodes. International Journal of Electrical and Computer Engineering, 8(4), 2433–2441. doi:10.11591/ijece.v8i4.pp2433-2441.

Mehta, D., Siddiqui, M. F. H., & Javaid, A. Y. (2018). Facial emotion recognition: A survey and real-world user experiences in mixed reality. Sensors (Switzerland), 18(2), 1–24. doi:10.3390/s18020416.

Salmam, F. Z., Madani, A., & Kissi, M. (2018). Emotion recognition from facial expression based on fiducial points detection and using neural network. International Journal of Electrical and Computer Engineering, 8(1), 52–59. doi:10.11591/ijece.v8i1.pp52-59.

Noroozi, F., Corneanu, C. A., Kamińska, D., Sapiński, T., Escalera, S., & Anbarjafari, G. (2018). Survey on emotional body gesture recognition. IEEE Transactions on Affective Computing, 12(2), 505–523. doi:10.1109/TAFFC.2018.2874986.

Khalil, R. A., Jones, E., Babar, M. I., Jan, T., Zafar, M. H., & Alhussain, T. (2019). Speech Emotion Recognition Using Deep Learning Techniques: A Review. IEEE Access, 7, 117327–117345. doi:10.1109/ACCESS.2019.2936124.

Ekman, P., Friesen, W. V., & Simons, R. C. (1985). Is the startle reaction an emotion? Journal of Personality and Social Psychology, 49(5), 1416. doi:10.1037/0022-3514.49.5.1416.

Shu, L., Xie, J., Yang, M., Li, Z., Li, Z., Liao, D., Xu, X., & Yang, X. (2018). A review of emotion recognition using physiological signals. Sensors (Switzerland), 18(7). doi:10.3390/s18072074.

Li, Y., Huang, J., Zhou, H., & Zhong, N. (2017). Human emotion recognition with electroencephalographic multidimensional features by hybrid deep neural networks. Applied Sciences (Switzerland), 7(10). doi:10.3390/app7101060.

Setyohadi, D. B., Kusrohmaniah, S., Gunawan, S. B., Pranowo, & Prabuwono, A. S. (2018). Galvanic skin response data classification for emotion detection. International Journal of Electrical and Computer Engineering, 8(5), 4004–4014. doi:10.11591/ijece.v8i5.pp4004-4014.

Hsu, Y. L., Wang, J. S., Chiang, W. C., & Hung, C. H. (2020). Automatic ECG-Based Emotion Recognition in Music Listening. IEEE Transactions on Affective Computing, 11(1), 85–99. doi:10.1109/TAFFC.2017.2781732.

Song, T., Zheng, W., Lu, C., Zong, Y., Zhang, X., & Cui, Z. (2019). MPED: A multi-modal physiological emotion database for discrete emotion recognition. IEEE Access, 7, 12177–12191. doi:10.1109/ACCESS.2019.2891579.

Miranda-Correa, J. A., Abadi, M. K., Sebe, N., & Patras, I. (2021). AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups. IEEE Transactions on Affective Computing, 12(2), 479–493. doi:10.1109/TAFFC.2018.2884461.

Koelstra, S., Mühl, C., Soleymani, M., Lee, J. S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., & Patras, I. (2012). DEAP: A database for emotion analysis; Using physiological signals. IEEE Transactions on Affective Computing, 3(1), 18–31. doi:10.1109/T-AFFC.2011.15.

Al-Shargie, F., Tariq, U., Alex, M., Mir, H., & Al-Nashash, H. (2019). Emotion Recognition Based on Fusion of Local Cortical Activations and Dynamic Functional Networks Connectivity: An EEG Study. IEEE Access, 7, 143550–143562. doi:10.1109/ACCESS.2019.2944008.

Xu, T., Zhou, Y., Wang, Z., & Peng, Y. (2018). Learning Emotions EEG-based Recognition and Brain Activity: A Survey Study on BCI for Intelligent Tutoring System. Procedia Computer Science, 130, 376–382. doi:10.1016/j.procs.2018.04.056.

Hu, X., Chen, J., Wang, F., & Zhang, D. (2019). Ten challenges for EEG-based affective computing. Brain Science Advances, 5(1), 1–20. doi:10.1177/2096595819896200.

Zhang, J., Yin, Z., Chen, P., & Nichele, S. (2020). Emotion recognition using multi-modal data and machine learning techniques: A tutorial and review. Information Fusion, 59(January), 103–126. doi:10.1016/j.inffus.2020.01.011.

Bhandari, N. K., & Jain, M. (2020). Emotion recognition and classification using EEG: A review. International Journal of Scientific and Technology Research, 9(2), 1827–1836.

Al-Nafjan, A., Hosny, M., Al-Ohali, Y., & Al-Wabil, A. (2017). Review and classification of emotion recognition based on EEG brain-computer interface system research: A systematic review. Applied Sciences (Switzerland), 7(12). doi:10.3390/app7121239.

Ladakis, I., & Chouvarda, I. (2021). Overview of biosignal analysis methods for the assessment of stress. Emerging Science Journal, 5(2), 233–244. doi:10.28991/esj-2021-01267.

Wirawan, I. M. A., Wardoyo, R., & Lelono, D. (2022). The challenges of emotion recognition methods based on electroencephalogram signals: A literature review. International Journal of Electrical and Computer Engineering, 12(2), 1508–1519. doi:10.11591/ijece.v12i2.pp1508-1519.

Sarma, P., & Barma, S. (2020). Review on Stimuli Presentation for Affect Analysis Based on EEG. IEEE Access, 8, 51991–52009. doi:10.1109/ACCESS.2020.2980893.

Pereira, E. T., & Martins Gomes, H. (2016). The role of data balancing for emotion classification using EEG signals. 2016 International Conference on Digital Signal Processing (DSP), 555–559. doi:10.1109/ICDSP.2016.7868619.

Fernández, A., García, S., Herrera, F., & Chawla, N. V. (2018). SMOTE for Learning from Imbalanced Data: Progress and Challenges, Marking the 15-year Anniversary. Journal of Artificial Intelligence Research, 61, 863–905. doi:10.1613/jair.1.11192.

Nekooeimehr, I., & Lai-Yuen, S. K. (2016). Adaptive semi-unsupervised weighted oversampling (A-SUWO) for imbalanced datasets. Expert Systems with Applications, 46, 405–416. doi:10.1016/j.eswa.2015.10.031.

Pradipta, G. A., Wardoyo, R., Musdholifah, A., & Sanjaya, I. N. H. (2021). Radius-SMOTE: A New Oversampling Technique of Minority Samples Based on Radius Distance for Learning from Imbalanced Data. IEEE Access, 9, 74763–74777. doi:10.1109/ACCESS.2021.3080316.

Mohammed, R., Rawashdeh, J., & Abdullah, M. (2020). Machine Learning with Oversampling and Undersampling Techniques: Overview Study and Experimental Results. 2020 11th International Conference on Information and Communication Systems (ICICS), 243–248. doi:10.1109/ICICS49469.2020.239556.

Ding, X. W., Liu, Z. T., Li, D. Y., He, Y., & Wu, M. (2021). Electroencephalogram Emotion Recognition Based on Dispersion Entropy Feature Extraction Using Random Over-Sampling Imbalanced Data Processing. IEEE Transactions on Cognitive and Developmental Systems, 1–10. doi:10.1109/TCDS.2021.3074811.

Sanguanmak, Y., & Hanskunatai, A. (2016). DBSM: The combination of DBSCAN and SMOTE for imbalanced data classification. 2016 13th International Joint Conference on Computer Science and Software Engineering, JCSSE 2016, 1–5. doi:10.1109/JCSSE.2016.7748928.

Sánchez, A. I., Morales, E. F., & Gonzalez, J. A. (2013). Synthetic oversampling of instances using clustering. International Journal on Artificial Intelligence Tools, 22(2), 1–21. doi:10.1142/S0218213013500085.

Barua, S., Islam, M. M., Yao, X., & Murase, K. (2014). MWMOTE - Majority weighted minority oversampling technique for imbalanced data set learning. IEEE Transactions on Knowledge and Data Engineering, 26(2), 405–425. doi:10.1109/TKDE.2012.232.

Bunkhumpornpat, C., Sinapiromsaran, K., & Lursinsap, C. (2009). Safe-Level-SMOTE: Safe-Level-Synthetic Minority Over-Sampling Technique for Handling the Class Imbalanced Problem. Pacific-Asia Conference on Knowledge Discovery and Data Mining, 475–482.

Sáez, J. A., Luengo, J., Stefanowski, J., & Herrera, F. (2015). SMOTE-IPF: Addressing the noisy and borderline examples problem in imbalanced classification by a re-sampling method with filtering. Information Sciences, 291, 184–203. doi:10.1016/j.ins.2014.08.051.

Chen, D. W., Miao, R., Yang, W. Q., Liang, Y., Chen, H. H., Huang, L., Deng, C. J., & Han, N. (2019). A feature extraction method based on differential entropy and linear discriminant analysis for emotion recognition. Sensors (Switzerland), 19(7). doi:10.3390/s19071631.

Cheng, J., Chen, M., Li, C., Liu, Y., Song, R., Liu, A., & Chen, X. (2020). Emotion recognition from multi-channel EEG via deep forest. IEEE Journal of Biomedical and Health Informatics, 25(2), 453–464. doi:10.1109/JBHI.2020.2995767.

Li, J., Zhang, Z., & He, H. (2018). Hierarchical Convolutional Neural Networks for EEG-Based Emotion Recognition. Cognitive Computation, 10(2), 368–380. doi:10.1007/s12559-017-9533-x.

Jiang, H., & Jia, J. (2020). Research on EEG Emotional Recognition Based on LSTM. Communications in Computer and Information Science, 1160, 409–417. doi:10.1007/978-981-15-3415-7_34.

Yang, Y., Wu, Q., Fu, Y., & Chen, X. (2018). Continuous convolutional neural network with 3D input for EEG-based emotion recognition. International Conference on Neural Information Processing, 433–443. Springer, Cham, Switzerland. doi:10.1007/978-3-030-04239-4_39.

Liu, Y., Ding, Y., Li, C., Cheng, J., Song, R., Wan, F., & Chen, X. (2020). Multi-channel EEG-based emotion recognition via a multi-level features guided capsule network. Computers in Biology and Medicine, 123(March), 103927. doi:10.1016/j.compbiomed.2020.103927.

Chao, H., Dong, L., Liu, Y., & Lu, B. (2019). Emotion recognition from multiband EEG signals using capsnet. Sensors (Switzerland), 19(9). doi:10.3390/s19092212.

Elgayar, S., Abdelhamid, A. E., & Fayed, Z. T. (2017). Emotion Detection from Text: Survey. IOSR Journal of Computer Engineering, 19(4), 30–37. doi:10.9790/0661-1904053037.

Gebhard, P. (2005). ALMA - A layered model of affect. Proceedings of the International Conference on Autonomous Agents, 177–184. doi:10.1145/1082473.1082478.

Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161–1178. doi:10.1037/h0077714.

Yang, Y., Wu, Q., Qiu, M., Wang, Y., & Chen, X. (2018). Emotion Recognition from Multi-Channel EEG through Parallel Convolutional Recurrent Neural Network. Proceedings of the International Joint Conference on Neural Networks, 2018(July), 1–7. doi:10.1109/IJCNN.2018.8489331.

Tyng, C. M., Amin, H. U., Saad, M. N. M., & Malik, A. S. (2017). The influences of emotion on learning and memory. Frontiers in Psychology, 8(August), 1-22. doi:10.3389/fpsyg.2017.01454.




Copyright (c) 2022 Retantyo Wardoyo, I Made Agus Wirawan, I Gede Angga Pradipta