MENA: A Multimodal Framework for Analyzing Caregiver Emotions and Competencies in AR Geriatric Simulations

  • Behdokht Kiafar
  • Pavan Uttej Ravva
  • Salam Daher
  • Asif Ahmmed Joy
  • Roghayeh Leila Barmaki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Improving the quality of geriatric care is a challenge that requires insights from stakeholders. While simulation-based training can boost competencies, extracting meaningful insights from these sessions to improve simulation effectiveness remains difficult. In this study, we introduce Multimodal Epistemic Network Analysis (MENA), a novel framework for analyzing caregiver attitudes and emotional responses in an Augmented Reality simulation. By integrating a multimodal Emotional State Classifier, MENA extends traditional epistemic network analysis to reveal complex relationships between caregiving competencies and positive emotions. Applied in a pilot study (N = 20) comparing caregiver interactions with an unaware versus an aware virtual geriatric patient (VGP), MENA visualizations demonstrated how awareness in the VGP fostered more supportive and person-centered caregiving behaviors. These findings suggest that MENA not only enhances the analysis of multimodal interactions but also provides a powerful tool for designing emotionally intelligent training systems that prepare caregivers for the nuanced demands of real-world practice. The code and setup to reproduce the experiments are publicly available here, and data is available upon request.
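The epistemic-network idea underlying the abstract can be illustrated with a minimal sketch: pairwise co-occurrence of coded events (competencies and emotional states) is accumulated over a sliding window of utterances, yielding the weighted edges of an epistemic network. The codebook below is purely hypothetical for illustration; it is not the paper's actual coding scheme, and MENA's full pipeline additionally incorporates a multimodal emotion classifier.

```python
from itertools import combinations
from collections import Counter

# Hypothetical coded transcript: each utterance carries a set of codes
# (caregiving competencies and classified emotional states). These code
# names are illustrative placeholders, not the study's codebook.
utterances = [
    {"empathy", "positive_affect"},
    {"safety_check"},
    {"empathy", "communication"},
    {"communication", "positive_affect"},
    {"safety_check", "communication"},
]

def cooccurrence_counts(utterances, window=2):
    """Count pairwise code co-occurrences within a sliding window --
    the accumulation step that produces the edges of an epistemic
    network (edge weight = co-occurrence frequency)."""
    counts = Counter()
    for i in range(len(utterances)):
        # Union of all codes appearing inside the current window
        window_codes = set().union(*utterances[i:i + window])
        for pair in combinations(sorted(window_codes), 2):
            counts[pair] += 1
    return counts

edges = cooccurrence_counts(utterances)
# e.g. edges[("empathy", "positive_affect")] counts how often the two
# codes fall within the same window, suggesting a learned association.
```

In ENA proper, these accumulated vectors are normalized and projected into a low-dimensional space for comparison across conditions (here, unaware vs. aware VGP); the sketch above covers only the co-occurrence accumulation step.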

Original language: English (US)
Title of host publication: ICMI 2025 - Proceedings of the 27th International Conference on Multimodal Interaction
Editors: Ram Subramanian, Yukiko I. Nakano, Tom Gedeon, Mohan Kankanhalli, Tanaya Guha, Jainendra Shukla, Gelareh Mohammadi, Oya Celiktutan
Publisher: Association for Computing Machinery, Inc
Pages: 181-190
Number of pages: 10
ISBN (Electronic): 9798400714993
DOIs
State: Published - Oct 12 2025
Externally published: Yes
Event: 27th International Conference on Multimodal Interaction, ICMI 2025 - Canberra, Australia
Duration: Oct 13 2025 – Oct 17 2025

Publication series

Name: ICMI 2025 - Proceedings of the 27th International Conference on Multimodal Interaction

Conference

Conference: 27th International Conference on Multimodal Interaction, ICMI 2025
Country/Territory: Australia
City: Canberra
Period: 10/13/25 – 10/17/25

All Science Journal Classification (ASJC) codes

  • Hardware and Architecture
  • Human-Computer Interaction
  • Computer Science Applications
  • Computer Vision and Pattern Recognition

Keywords

  • Multimodal epistemic network analysis
  • Multimodal analytics
  • Data visualization
  • Emotion classification
  • Knowledge graphs
