Integrating Multimodal Data Processing Techniques to Enhance User Experience Evaluation in Interactive Digital Platforms

Authors

  • Khoirudin Khoirudin, Universitas Semarang
  • Nurtriana Hidayati, Universitas Semarang

Keywords

Multimodal Data, UX Evaluation, User Behavior, Eye-Tracking Systems, Real-Time Feedback

Abstract

User experience (UX) evaluation plays a crucial role in understanding how users interact with digital platforms and in improving product design. Traditional UX evaluation methods, such as surveys and interaction logs, often rely on a single data source, which limits the depth of analysis. This study explores the integration of multimodal data processing techniques in UX research, aiming to improve the accuracy and comprehensiveness of UX evaluations. Combining interaction logs, visual attention data, and physiological measurements yields a more holistic picture of user behavior, emotional responses, and satisfaction: interaction logs provide objective records of user actions, while eye-tracking and physiological signals capture users' emotional states, offering richer insight into usability. The study highlights the effectiveness of multimodal integration in identifying patterns that traditional methods overlook, such as emotional responses to specific interface elements and real-time user feedback. The findings show that multimodal data processing improves the precision of UX assessment by pairing objective behavior with subjective emotional response, giving a more complete view of user interactions. The study also discusses the challenges of synchronizing heterogeneous data streams and the ethical concerns raised by collecting physiological data. Integrating these data sources shows strong potential to enhance the design process, allowing designers to make informed decisions based on comprehensive insights. Finally, this research underscores the future potential of multimodal analytics in UX research, suggesting further exploration of additional data modalities and real-time applications in various digital environments.
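The data-synchronization challenge mentioned above can be illustrated with a minimal sketch. The snippet below is not from the study; the column names, sampling times, and 50 ms tolerance are illustrative assumptions. It shows one common approach: aligning interaction logs, eye-tracking fixations, and a physiological signal on a shared timeline by nearest-timestamp matching.

```python
# Hypothetical sketch of multimodal stream alignment (all values illustrative).
import pandas as pd

# Three modality streams sampled at different, slightly offset times (ms).
logs = pd.DataFrame({"t_ms": [0, 500, 1200], "event": ["click", "scroll", "click"]})
gaze = pd.DataFrame({"t_ms": [10, 480, 1190], "fixation_ms": [220, 340, 150]})
physio = pd.DataFrame({"t_ms": [5, 505, 1210], "heart_rate": [72, 75, 81]})

# Nearest-timestamp fusion within a 50 ms tolerance, so each logged user
# action is paired with the closest gaze and physiological samples.
fused = pd.merge_asof(logs, gaze, on="t_ms", direction="nearest", tolerance=50)
fused = pd.merge_asof(fused, physio, on="t_ms", direction="nearest", tolerance=50)
print(fused)
```

Each row of `fused` then pairs an objective interaction event with the attention and physiological context recorded around it, which is the kind of combined record a multimodal UX analysis would operate on. Real recordings would additionally need clock-drift correction between devices.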

References

[1] J. Liu, “AI in Automated and Remote UX Evaluation: A Systematic Review (2014–2024),” Adv. Human-Computer Interact., vol. 2025, no. 1, 2025, doi: 10.1155/ahci/7442179.

[2] L. Rivero and T. Conte, “Using a study to assess user eXperience evaluation methods from the point of view of users,” in ICEIS 2015 - 17th International Conference on Enterprise Information Systems, Proceedings, 2015, pp. 88–95. doi: 10.5220/0005377300880095.

[3] A. Aitim and M. Abdulla, “Data processing and analysing techniques in UX research,” in Procedia Computer Science, 2024, pp. 591–596. doi: 10.1016/j.procs.2024.11.154.

[4] C. Rico-Olarte, D. M. López, and S. Kepplinger, “Towards a conceptual framework for the objective evaluation of user experience,” Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), vol. 10918 LNCS, pp. 546–559, 2018, doi: 10.1007/978-3-319-91797-9_39.

[5] H. Jo, J. Lee, H. W. Park, M. Kim, Y. Kim, and W. H. Lee, “Developing an Integrated Dashboard to Analyze Multimodal Data for User Experience Evaluation,” in 2023 IEEE International Conference on Consumer Electronics-Asia, ICCE-Asia 2023, 2023. doi: 10.1109/ICCE-Asia59966.2023.10326366.

[6] L. Rivero and T. Conte, “A systematic mapping study on research contributions on UX evaluation technologies,” in ACM International Conference Proceeding Series, 2017. doi: 10.1145/3160504.3160512.

[7] A. S. M. Tsui and A. Kuzminykh, “Detect and Interpret: Towards Operationalization of Automated User Experience Evaluation,” Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), vol. 14032 LNCS, pp. 82–100, 2023, doi: 10.1007/978-3-031-35702-2_6.

[8] I. Pettersson, A. Riener, A.-K. Frison, J. Nolhage, and F. Lachner, “Triangulation in UX studies: Learning from experience,” in DIS 2017 Companion - Proceedings of the 2017 ACM Conference on Designing Interactive Systems, 2017, pp. 341–344. doi: 10.1145/3064857.3064858.

[9] H. M. Mustafa, O. Alsbaihi, S. Jahameh, A. Sayed, and M. Othman, “User interface, usability, and user experience: A bibliometric mapping for previous research and future research insight,” Inf. Des. J., vol. 29, no. 3, pp. 204–222, 2024, doi: 10.1075/idj.24006.mus.

[10] J. Qiu and S. Tokuhisa, “Dual-Track UX: A User Experience Evaluation Method for B2B SaaS Development,” in Conference on Human Factors in Computing Systems - Proceedings, 2025. doi: 10.1145/3706599.3719968.

[11] M. Shamim, A. R. Faridi, and F. Masood, “Analysis of Emerging Trends and Challenges in Multimodal Data,” in Proceedings of the 2025 12th International Conference on Computing for Sustainable Global Development, INDIACom 2025, 2025. doi: 10.23919/INDIACom66777.2025.11115680.

[12] A. Bosta and S. Vosinakis, “A Multimodal Approach to User Experience Evaluation: Integration of Physiological, Behavioral, Contextual and Self-Report data with UserSence,” in Proceedings of 3rd International Conference of the Greece ACM SIGCHI Chapter, CHIGreece 2025, 2025, pp. 182–187. doi: 10.1145/3749012.3749047.

[13] C. J. Lin and L.-Y. Cheng, “Product attributes and user experience design: how to convey product information through user-centered service,” J. Intell. Manuf., vol. 28, no. 7, pp. 1743–1754, 2017, doi: 10.1007/s10845-015-1095-8.

[14] A. L. Damian et al., “Evaluating UX Factors on Mobile Devices: A Feasibility Study,” in International Conference on Enterprise Information Systems, ICEIS - Proceedings, 2024, pp. 265–272. doi: 10.5220/0012623600003690.

[15] P. Pannattee, Y. Fukuchi, and N. Nishiuchi, “MUXAS-VR: A Multi-Dimensional User Experience Assessment System for Virtual Reality,” IEEE Access, vol. 13, pp. 93063–93083, 2025, doi: 10.1109/ACCESS.2025.3573382.

[16] M. Toribio-Candela, G. González-Serna, A. Magadan-Salazar, N. González-Franco, and M. López-Sánchez, “Automated Facial Expression Analysis for Cognitive State Prediction During an Interaction with a Digital Interface,” Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), vol. 14502 LNAI, pp. 41–49, 2024, doi: 10.1007/978-3-031-51940-6_5.

[17] M. Peruzzini, F. Grandi, M. Pellicciari, and C. E. Campanella, “User experience analysis based on physiological data monitoring and mixed prototyping to support human-centred product design,” Adv. Intell. Syst. Comput., vol. 777, pp. 401–412, 2019, doi: 10.1007/978-3-319-94706-8_44.

[18] E. Cavalcante, L. Rivero, and T. Conte, “MAX: A method for evaluating the post-use user eXperience through cards and a board,” in Proceedings of the International Conference on Software Engineering and Knowledge Engineering, SEKE, 2015, pp. 495–500. doi: 10.18293/SEKE2015-136.

[19] E. B. Şahin and P. Onay Durdu, “Evaluating the potential of using EEG based BCI method in user experience research,” Univers. Access Inf. Soc., vol. 24, no. 3, pp. 2507–2530, 2025, doi: 10.1007/s10209-025-01207-5.

[20] L. Marques, W. Nakamura, N. Valentim, L. Rivero, and T. Conte, “Do scale type techniques identify problems that affect user experience? user experience evaluation of a mobile application,” in Proceedings of the International Conference on Software Engineering and Knowledge Engineering, SEKE, 2018, pp. 451–455. doi: 10.18293/SEKE2018-161.

[21] X. Li, T. Anukul, and F. Ying, “Multimodal Content Analysis for Enhanced User Experience,” Int. J. Basic Appl. Sci., vol. 14, no. Special Issue 3, pp. 63–72, 2025, doi: 10.14419/txmrrk92.

[22] E. B. Ince, K. Cha, and J. Cho, “Towards a Conceptual Model of Users’ Expectations of an Autonomous In-Vehicle Multimodal Experience,” Hum. Behav. Emerg. Technol., vol. 2024, 2024, doi: 10.1155/2024/7418597.

[23] A. Becerra, R. Cobos, and C. Lang, “Enhancing online learning by integrating biosensors and multimodal learning analytics for detecting and predicting student behaviour: a review,” Behav. Inf. Technol., 2025, doi: 10.1080/0144929X.2025.2562322.

[24] N. Song, X. He, and Y. Kuang, “Research hotspots and trends analysis of user experience: Knowledge maps visualization and theoretical framework construction,” Front. Psychol., vol. 13, 2022, doi: 10.3389/fpsyg.2022.990663.

[25] N. H. Hasbi, A. Bade, F. P. Chee, and M. I. Rumaling, “Pattern Recognition for Human Diseases Classification in Spectral Analysis,” Computation, vol. 10, no. 6, 2022, doi: 10.3390/computation10060096.

[26] Á. Becerra, R. Daza, R. Cobos, A. Morales, and J. Fierrez, “User experience study using a system for generating multimodal learning analytics dashboards,” in ACM International Conference Proceeding Series, 2023. doi: 10.1145/3612783.3612813.

[27] J. D. T. Guerrero-Sosa, F. P. Romero, V. H. Menéndez-Domínguez, J. Serrano-Guerrero, A. Montoro-Montarroso, and J. A. Olivas, “A Comprehensive Review of Multimodal Analysis in Education,” Appl. Sci., vol. 15, no. 11, 2025, doi: 10.3390/app15115896.

Published

2026-01-20