This paper presents an innovative graphical user interface to visualize the attitude of a sensing device in three-dimensional space, serving a wide range of medical applications.
Material and methods
Based on inertial measurement units (IMU) or on magnetic, angular rate and gravity (MARG) sensors, a processing unit provides Euler angles using a sensor fusion technique to display the orientation of the device relative to the Earth frame in real time. The device is schematized by linking six polygonal regions and is subjected to sequential rotations by updating the graph every 350 ms. We conduct comparative studies between the two sensing devices, i.e., IMUs and MARGs, as well as two orientation filters, namely Madgwick's algorithm and Mahony's algorithm.
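The last stage of the pipeline described above, converting the fused orientation into Euler angles for display, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the fusion filter outputs a unit quaternion (w, x, y, z) and uses the common aerospace ZYX (roll–pitch–yaw) convention.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to Euler angles
    (roll, pitch, yaw) in radians, ZYX aerospace convention."""
    # Roll: rotation about the x-axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch: rotation about the y-axis, clamped to avoid asin domain errors
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # Yaw: rotation about the z-axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion corresponds to no rotation
print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```

In a visualization loop such as the one described, this conversion would be applied to each quaternion produced by the orientation filter before redrawing the six polygonal regions at the stated 350 ms update interval.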
Results
The accuracy of the system is reported as a function of (i) the sampling frequency, (ii) the sensing unit, and (iii) the orientation filter, for two elderly care applications, namely fall risk assessment and body posture monitoring. The experiments are conducted using public datasets. The corresponding results show that Madgwick's algorithm is best suited for low sampling rates, whereas MARG sensors are best suited for the detection of postural transitions.
Discussion
This paper addresses the different aspects and discusses the limitations of attitude estimation systems, which are important tools to help clinicians in their diagnosis.
Highlights
3D visualization of wearable device attitude.
A reliable tool for activity monitoring.
The importance of sensing unit (IMU vs MARG).
Comparison between orientation filters.
Keywords: Graphical user interface, Device attitude, Sequential rotations, Elderly care
☆ This work was supported by the French National Research Agency (ANR) in the context of the ACCORDS Project under Grant ANR-17-CE19-0024-01.