D4.4 Method and System for Post Analysis of AR-based Learning and Biofeedback Data

The objectives for this deliverable D4.4 relate to the post analysis of AR-based learning and biofeedback, based on data captured from multiple sensors during AR-supported tasks. This data mix includes several core components, such as 3D scanning, head position, vision-area video, and voice. All of this data should support post-action review and interpretation at desktop level by data scientists and task/performance analysts, as well as by end users, e.g. using the HoloLens.
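As an illustration of what a single capture session for post-action review might contain, the following sketch groups the streams named above (3D scan, head position, vision-area video, voice) into one record. All class and field names here are illustrative assumptions, not the WEKIT schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical record types for one AR capture session; field names are
# illustrative assumptions, not the actual WEKIT data model.

@dataclass
class HeadPose:
    t: float                          # seconds since session start
    position: Tuple[float, float, float]    # (x, y, z) in metres
    orientation: Tuple[float, float, float, float]  # quaternion (w, x, y, z)

@dataclass
class CaptureSession:
    session_id: str
    scan_path: str                    # 3D scan of the workplace
    video_path: str                   # first-person (vision-area) video
    audio_path: str                   # voice recording
    head_track: List[HeadPose] = field(default_factory=list)

session = CaptureSession(
    session_id="trial-001",
    scan_path="scans/trial-001.obj",
    video_path="video/trial-001.mp4",
    audio_path="audio/trial-001.wav",
)
session.head_track.append(
    HeadPose(t=0.0, position=(0.0, 1.7, 0.0),
             orientation=(1.0, 0.0, 0.0, 0.0))
)
```

A desktop analysis tool could then replay the head track alongside the video and audio streams keyed on the shared session timestamp.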

The above objectives are met in this deliverable using instances of data capture from our end-user trials of the WEKIT prototype platform and its linked sensor ecosystem (‘the garment’). Trial data are collected using three categories of sensors: human, environment and device sensors. Integrated into the garment are sensors detecting heart activity, Galvanic Skin Response, electromyographic activity of the forearm and hand, positional tracking, 9-axis inertial movement, and temperature and humidity. The construction of the sensor ecosystem must follow the architecture proposed in D2.1, and the ecosystem must be integrated with the design of the wearable prototype garment covered in deliverable D5.8, the final electronics architecture (D3.5), and the visualisation of sensor data covered in D5.10.
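A minimal sketch of the three-category grouping described above, mapping the garment's sensors to the human, environment and device categories. The category assignments and identifier names are assumptions for illustration, not the WEKIT sensor registry.

```python
from enum import Enum

# The three sensor categories named in the text.
class SensorCategory(Enum):
    HUMAN = "human"
    ENVIRONMENT = "environment"
    DEVICE = "device"

# Illustrative mapping of garment sensors to categories; the
# assignments are assumptions, not the WEKIT specification.
GARMENT_SENSORS = {
    "heart_activity": SensorCategory.HUMAN,
    "galvanic_skin_response": SensorCategory.HUMAN,
    "forearm_hand_emg": SensorCategory.HUMAN,
    "positional_tracking": SensorCategory.DEVICE,
    "imu_9axis": SensorCategory.DEVICE,
    "temperature": SensorCategory.ENVIRONMENT,
    "humidity": SensorCategory.ENVIRONMENT,
}

def sensors_in(category):
    """Return the names of garment sensors in the given category."""
    return sorted(n for n, c in GARMENT_SENSORS.items() if c is category)

print(sensors_in(SensorCategory.HUMAN))
# → ['forearm_hand_emg', 'galvanic_skin_response', 'heart_activity']
```

Such an explicit mapping would let post-analysis tooling filter trial data by category before handing it to analysts or to the visualisation layer described in D5.10.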