Augmented reality (AR) has great potential in education and training. However, many students struggle with complex AR interfaces, leading to mental fatigue, and some experience discomfort that limits their ability to engage effectively. These limitations inspired our team to explore ways to enhance AR learning environments by integrating complementary technologies.
We utilize D-Lab eye-tracking, MVN motion capture, and real-time indoor location tracking systems, along with metacognitive monitoring feedback, to improve student performance and metacognitive awareness during AR-based learning. The D-Lab eye-tracking system has been combined with the HoloLens 2 device to monitor changes in pupil size and estimate mental workload, metacognitive awareness, and attention levels by comparing the eye-tracking data against the NASA Task Load Index (NASA-TLX), the Metacognitive Awareness Inventory (MAI), and the Attentional Allocation Scale.
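To illustrate the general idea of relating pupil-size changes to self-reported workload, here is a minimal sketch. It is not the team's actual pipeline; the sample values, the percent-change measure, and the simple Pearson correlation are all illustrative assumptions.

```python
# Illustrative sketch (not the authors' pipeline): relating pupil-dilation
# changes to NASA-TLX workload ratings. All sample values are hypothetical.

def percent_pupil_change(baseline_mm, task_mm):
    """Percent change in pupil diameter relative to a resting baseline."""
    return 100.0 * (task_mm - baseline_mm) / baseline_mm

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-participant data: baseline/task pupil diameter (mm)
# and overall NASA-TLX workload score (0-100).
baseline = [3.1, 3.4, 2.9, 3.3, 3.0]
task     = [3.6, 3.8, 3.5, 3.7, 3.3]
tlx      = [62, 55, 71, 58, 40]

dilation = [percent_pupil_change(b, t) for b, t in zip(baseline, task)]
r = pearson_r(dilation, tlx)
print(f"pupil-change vs. TLX correlation: r = {r:.2f}")
```

In a real study the dilation measure would be baseline-corrected per participant and controlled for luminance, but the sketch shows the basic comparison between a physiological signal and a subjective workload instrument.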
The MVN motion capture system is used to evaluate body posture, focusing primarily on slouching, which can serve as an indicator of physical fatigue. Prolonged use of AR devices often leads to slouched postures, which are associated with increased physical strain. By capturing joint angles and segment positions, this study aims to quantify physical strain and identify key indicators for detecting physical fatigue during AR-based learning. We have also developed a dynamic ergonomics training system using mixed reality (MR) technology. This system integrates the HoloLens 2 headset and the Xsens MVN Awinda inertial motion capture system to provide immersive, real-time posture assessment and feedback.
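One simple way to turn joint-angle data into a slouch indicator is to flag episodes where trunk flexion stays above a threshold for a sustained period. The sketch below is an illustrative assumption, not the team's algorithm; the threshold, dwell time, and sample trace are all hypothetical.

```python
# Illustrative sketch (assumptions, not the authors' method): flagging
# slouching from motion-capture trunk-flexion angles. A slouch event is
# counted when flexion stays above a threshold for a minimum number of
# consecutive samples.

SLOUCH_DEG = 20.0    # hypothetical trunk-flexion threshold (degrees)
MIN_SAMPLES = 3      # minimum consecutive samples to count an event

def slouch_events(angles_deg, threshold=SLOUCH_DEG, min_len=MIN_SAMPLES):
    """Return (start, end) index pairs of sustained slouch episodes."""
    events, start = [], None
    for i, a in enumerate(angles_deg):
        if a > threshold and start is None:
            start = i                      # episode begins
        elif a <= threshold and start is not None:
            if i - start >= min_len:       # long enough to count
                events.append((start, i - 1))
            start = None
    if start is not None and len(angles_deg) - start >= min_len:
        events.append((start, len(angles_deg) - 1))
    return events

# Hypothetical 1 Hz trunk-flexion trace (degrees) during an AR session.
trace = [8, 12, 22, 25, 27, 24, 15, 10, 21, 19, 26, 28, 30, 11]
print(slouch_events(trace))  # → [(2, 5), (10, 12)]
```

The dwell-time requirement filters out brief postural adjustments, so only sustained slouching is counted; in an MR feedback system, each detected event could trigger a real-time posture cue in the headset.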
Our study aims to enhance AR environments by minimizing cognitive and physical stress. We seek to understand how mental workload and physical discomfort impact user performance, with the goal of creating a more intuitive and effective learning platform. This integrated approach has the potential to greatly improve AR-based learning and training, aligning technology with human cognitive and physical capabilities to develop more efficient and sustainable educational tools. This research is supported by NSF.
Team Members: Jung Hyup Kim, Kangwon Seo, Danielle Oprean, Fang Wang, Yi Wang, Siddarth Mohanty, Varun Pulipati, Ching-Yun Yu, Sara Mostowfi, Yuanyuan Gu, Madeline Easley, Will Mastrantuono, Ella Swaters
(If you are interested in having your research featured with XRTG, email hfes.xrtg@gmail.com or shafiqul@vt.edu).
Follow XRTG on:
LinkedIn: https://www.linkedin.com/company/hfes-xrtg/
Twitter: https://twitter.com/HFES_XRTG
Instagram: https://www.instagram.com/hfes.xrtg/
Facebook: https://www.facebook.com/hfes.xrtg
------------------------------
Md Shafiqul Islam
Virginia Tech
Communications Chair, XRTG
------------------------------