Goal: This paper presents an algorithm for estimating pelvis, thigh, shank, and foot kinematics during walking using only two or three wearable inertial sensors. Methods: The algorithm makes novel use of a Lie-group-based extended Kalman filter. The algorithm iterates through the prediction (kinematic equation), measurement (pelvis position pseudo-measurements, zero-velocity update, and flat-floor assumption), and constraint update (hinged knee and ankle joints, constant leg lengths). Results: The inertial motion capture algorithm was extensively evaluated on two datasets, showing its performance against two standard benchmark approaches in optical motion capture: plug-in gait (commonly used in gait analysis) and a kinematic fit (commonly used in animation, robotics, and musculoskeletal simulation), giving insight into the similarities and differences between these approaches as used in different application areas. The overall mean body segment position (relative to mid-pelvis origin) and orientation error magnitudes of our algorithm ($n=14$ participants) for free walking were $5.93 \pm 1.33$ cm and $13.43 \pm 1.89^\circ$ when using three IMUs placed on the feet and pelvis, and $6.35 \pm 1.20$ cm and $12.71 \pm 1.60^\circ$ when using only two IMUs placed on the feet. Conclusion: The algorithm was able to track the joint angles in the sagittal plane well for straight walking, but requires improvement for unscripted movements (e.g., turning around, side steps), especially for dynamic movements or when considering clinical applications. Significance: This work has brought us closer to comprehensive remote gait monitoring using IMUs on the shoes. The low computational cost also suggests that it can be used in real time with gait assistive devices.
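As a concrete illustration of one of the measurement updates named above, a zero-velocity update can be sketched as a standard Kalman pseudo-measurement that drives the foot-velocity states toward zero during stance. The state layout (position followed by velocity) and noise values below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def zupt_update(x, P, R_zupt=1e-4):
    """Zero-velocity pseudo-measurement: during foot stance the foot
    velocity (assumed here to be states x[3:6]) is 'measured' as 0
    via an ordinary Kalman measurement update."""
    n = x.size
    H = np.zeros((3, n))
    H[:, 3:6] = np.eye(3)              # pick out the velocity states
    S = H @ P @ H.T + R_zupt * np.eye(3)
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (np.zeros(3) - H @ x)
    P_new = (np.eye(n) - K @ H) @ P
    return x_new, P_new

# Toy state: [position, velocity] of one foot during stance.
x = np.array([0.0, 0.0, 1.0, 0.05, -0.02, 0.01])
P = np.eye(6) * 0.01
x_new, P_new = zupt_update(x, P)       # velocity states shrink toward zero
```

Because the pseudo-measurement noise is much smaller than the state covariance, each update pulls the velocity estimate strongly toward zero while leaving the (uncorrelated) position states untouched.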
This paper presents an algorithm that makes novel use of a Lie group representation of position and orientation alongside a constrained extended Kalman filter (CEKF) to accurately estimate pelvis, thigh, and shank kinematics during walking using only three wearable inertial sensors. The algorithm iterates through the prediction update (kinematic equation), measurement update (pelvis height, zero-velocity update, flat-floor assumption, and covariance limiter), and constraint update (formulation of hinged knee joints and ball-and-socket hip joints). The paper also describes a novel Lie group formulation of the assumptions implemented in these measurement and constraint updates. Evaluation of the algorithm on nine healthy subjects who walked freely within a $4 \times 4$ m$^2$ room shows that the knee and hip joint angle root-mean-square errors (RMSEs) in the sagittal plane for free walking were $10.5 \pm 2.8^\circ$ and $9.7 \pm 3.3^\circ$, respectively, while the correlation coefficients (CCs) were $0.89 \pm 0.06$ and $0.78 \pm 0.09$, respectively. The evaluation demonstrates a promising application of Lie group representations to inertial motion capture under a reduced-sensor-count configuration, improving the estimates (i.e., joint angle RMSEs and CCs) for dynamic motion and enabling better convergence for our non-linear biomechanical constraints. To further improve performance, additional information relating the pelvis and ankle kinematics is needed.
Goal: This paper presents an algorithm for accurately estimating pelvis, thigh, and shank kinematics during walking using only three wearable inertial sensors. Methods: The algorithm makes novel use of a constrained Kalman filter (CKF). The algorithm iterates through the prediction (kinematic equation), measurement (pelvis position pseudo-measurements, zero-velocity update, flat-floor assumption, and covariance limiter), and constraint update (formulation of hinged knee joints and ball-and-socket hip joints). Results: Evaluation of the algorithm using an optical motion capture-based sensor-to-segment calibration on nine participants ($7$ men and $2$ women; weight $63.0 \pm 6.8$ kg; height $1.70 \pm 0.06$ m; age $24.6 \pm 3.9$ years), with no known gait or lower body biomechanical abnormalities, who walked within a $4 \times 4$ m$^2$ capture area shows that it can track motion relative to the mid-pelvis origin with mean position and orientation (no bias) root-mean-square errors (RMSEs) of $5.21 \pm 1.3$ cm and $16.1 \pm 3.2^\circ$, respectively. The sagittal knee and hip joint angle RMSEs (no bias) were $10.0 \pm 2.9^\circ$ and $9.9 \pm 3.2^\circ$, respectively, while the corresponding correlation coefficient (CC) values were $0.87 \pm 0.08$ and $0.74 \pm 0.12$. Conclusion: The CKF-based algorithm was able to track the 3D pose of the pelvis, thighs, and shanks using only three inertial sensors worn on the pelvis and shanks. Significance: Due to the Kalman-filter-based algorithm's low computational cost and the relative convenience of using only three wearable sensors, gait parameters can be computed in real time and remotely for long-term gait monitoring. Furthermore, the system can be used to inform real-time gait assistive devices.
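A constraint update of the kind described above typically projects the unconstrained state estimate onto the constraint surface. A minimal sketch of the covariance-weighted (minimum-variance) projection for a linearized equality constraint $Dx = d$ follows; the two-state example is a toy assumption, whereas the paper's joint constraints are non-linear and would be linearized at each iteration:

```python
import numpy as np

def project_onto_constraint(x, P, D, d):
    """Project state estimate x (covariance P) onto the linear
    equality constraint D @ x = d using the covariance-weighted
    projection common in constrained Kalman filtering."""
    S = D @ P @ D.T
    K = P @ D.T @ np.linalg.inv(S)
    x_c = x + K @ (d - D @ x)          # constrained state
    P_c = P - K @ D @ P                # constrained covariance
    return x_c, P_c

# Toy example: two scalar states that a rigid link forces to be equal.
x = np.array([1.0, 3.0])
P = np.diag([1.0, 1.0])
D = np.array([[1.0, -1.0]])            # constraint x[0] - x[1] = 0
d = np.array([0.0])
x_c, P_c = project_onto_constraint(x, P, D, d)
```

With equal variances, both components are pulled to their mean (2.0), and the projected estimate satisfies the constraint exactly.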
This paper presents an algorithm that makes novel use of distance measurements alongside a constrained Kalman filter to accurately estimate pelvis, thigh, and shank kinematics for both legs during walking and other body movements using only three wearable inertial measurement units (IMUs). The distance measurement formulation also assumes hinged knee joints and constant body segment lengths, helping produce estimates that are near or in the constraint space for better estimator stability. Simulated experiments showed that inter-IMU distance measurement is indeed a promising new source of information for improving the pose estimation of inertial motion capture systems under a reduced-sensor-count configuration. Furthermore, experiments show that performance improved dramatically for dynamic movements even at high noise levels (e.g., $\sigma_{dist} = 0.2$ m), and that acceptable performance for normal walking was achieved at $\sigma_{dist} = 0.1$ m. Nevertheless, further validation is recommended using actual distance measurement sensors.
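An inter-IMU distance measurement of this kind is naturally modeled as $h(x) = \lVert p_i - p_j \rVert$, whose Jacobian with respect to the two sensor positions is the unit vector between them (and its negation). A hedged sketch, with made-up sensor positions for illustration:

```python
import numpy as np

def dist_measurement(p_i, p_j):
    """Distance measurement model h = ||p_i - p_j|| between two IMU
    positions, plus its Jacobian w.r.t. the stacked vector [p_i, p_j]."""
    r = p_i - p_j
    dist = np.linalg.norm(r)
    u = r / dist                       # unit vector from p_j to p_i
    H = np.concatenate([u, -u])        # 1x6 Jacobian row
    return dist, H

p_i = np.array([0.0, 0.0, 1.0])        # e.g., a pelvis-worn IMU (toy values)
p_j = np.array([0.3, 0.0, 0.1])        # e.g., an ankle-worn IMU (toy values)
dist, H = dist_measurement(p_i, p_j)
```

In a constrained Kalman filter, this row would be stacked into the measurement Jacobian, with the measurement noise set from the ranging sensor's accuracy (e.g., the $\sigma_{dist}$ values studied above).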
The recently introduced matrix group $SE_2(3)$ provides a $5 \times 5$ matrix representation for the orientation, velocity, and position of an object in 3D space, a triplet we call an extended pose. In this paper, we build on this group to develop a theory for associating uncertainty with extended poses represented by $5 \times 5$ matrices. Our approach is particularly suited to describing how uncertainty propagates when the extended pose represents the state of an inertial measurement unit (IMU). In particular, it allows revisiting the theory of IMU preintegration on manifolds and reaching a further theoretical level in this field. Exact preintegration formulas that account for the rotating Earth, that is, centrifugal force and Coriolis force, are derived as a byproduct, and the resulting factors are shown to be more accurate. The approach is validated through extensive simulations and applied to sensor fusion, where a loosely coupled fixed-lag smoother fuses IMU and LiDAR in hour-long experiments using our experimental car. The results show how handling the rotating Earth may be beneficial for long-term navigation within incremental smoothing algorithms.
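The extended-pose embedding described here can be sketched as follows; this is a minimal illustration of the $SE_2(3)$ matrix layout and its group composition by matrix multiplication, not the paper's uncertainty-propagation machinery:

```python
import numpy as np

def extended_pose(R, v, p):
    """Embed orientation R (3x3), velocity v, and position p into the
    5x5 SE_2(3) extended-pose matrix [[R, v, p], [0, 1, 0], [0, 0, 1]]."""
    T = np.eye(5)
    T[:3, :3] = R
    T[:3, 3] = v
    T[:3, 4] = p
    return T

# Group composition is ordinary matrix multiplication:
# (R1, v1, p1) * (R2, v2, p2) = (R1 R2, R1 v2 + v1, R1 p2 + p1).
T1 = extended_pose(np.eye(3),
                   np.array([1.0, 0.0, 0.0]),
                   np.array([0.0, 2.0, 0.0]))
T12 = T1 @ T1                          # velocities and positions add here
```

Because the identity rotation is used in this toy example, composing `T1` with itself simply doubles the velocity and position blocks, which makes the composition rule easy to verify by hand.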
With recent advances in deep-learning-based object recognition and estimation, it is possible to consider object-level SLAM, where the pose of each object is estimated in the SLAM process. In this paper, a right-invariant extended Kalman filter (RI-EKF) for object-based SLAM, based on a novel Lie group structure, is proposed. The observability analysis shows that the proposed algorithm automatically maintains the correct unobservable subspace, while a standard EKF (Std-EKF)-based SLAM algorithm does not. This results in better consistency for the proposed algorithm compared to the Std-EKF. Finally, simulations and real-world experiments validate not only the consistency and accuracy of the proposed algorithm, but also the practicability of the proposed RI-EKF for the object-based SLAM problem. The MATLAB code of the algorithm is made publicly available.