Literature DB >> 34644610

American Society of Biomechanics Early Career Achievement Award 2020: Toward portable and modular biomechanics labs: How video and IMU fusion will change gait analysis.

Eni Halilaj1, Soyong Shin2, Eric Rapp2, Donglai Xiang3.   

Abstract

The field of biomechanics is at a turning point, with marker-based motion capture set to be replaced by portable and inexpensive hardware, rapidly improving markerless tracking algorithms, and open datasets that will turn these new technologies into field-wide team projects. Despite progress, several challenges inhibit both inertial and vision-based motion tracking from reaching the high accuracies that many biomechanics applications require. Their complementary strengths, however, could be harnessed toward better solutions than those offered by either modality alone. The drift from inertial measurement units (IMUs) could be corrected by video data, while occlusions in videos could be corrected by inertial data. To expedite progress in this direction, we have collected the CMU Panoptic Dataset 2.0, which contains 86 subjects captured with 140 VGA cameras, 31 HD cameras, and 15 IMUs, performing on average 6.5 min of activities, including range of motion activities and tasks of daily living. To estimate ground-truth kinematics, we imposed simultaneous consistency with the video and IMU data. Three-dimensional joint centers were first computed by geometrically triangulating proposals from a convolutional neural network applied to each video independently. A statistical meshed model parametrized in terms of body shape and pose was then fit through a top-down optimization approach that enforced consistency with both the video-based joint centers and IMU data. As proof of concept, we used this dataset to benchmark pose estimation from a sparse set of sensors, showing that incorporation of complementary modalities is a promising frontier that can be further strengthened through physics-informed frameworks.
Copyright © 2021. Published by Elsevier Ltd.
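The abstract describes computing 3D joint centers by geometrically triangulating 2D proposals from each camera view. A minimal sketch of linear (DLT) triangulation from multiple calibrated views, assuming known 3×4 projection matrices and NumPy; the function name and interface are illustrative, not from the paper:

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Linear (DLT) triangulation of one 3D joint center from N views.

    proj_mats : list of (3, 4) camera projection matrices
    points_2d : (N, 2) array of 2D detections, one per view
    Returns the least-squares 3D point as a (3,) array.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: u*(P[2]·X) = P[0]·X, v*(P[2]·X) = P[1]·X
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The solution is the right singular vector with the smallest
    # singular value of the stacked constraint matrix.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

In practice, per-view detections come with confidence scores from the network, so the rows would typically be weighted (or outlier views dropped via RANSAC) before solving; the subsequent top-down fit of the statistical body model then enforces consistency with both these triangulated centers and the IMU data.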

Keywords:  Computer vision; Deep Learning; Inertial measurement units; Markerless motion tracking; Wearables

Year:  2021        PMID: 34644610     DOI: 10.1016/j.jbiomech.2021.110650

Source DB:  PubMed          Journal:  J Biomech        ISSN: 0021-9290            Impact factor:   2.712


  1 in total

1.  OpenSense: An open-source toolbox for inertial-measurement-unit-based measurement of lower extremity kinematics over long durations.

Authors:  Mazen Al Borno; Johanna O'Day; Vanessa Ibarra; James Dunne; Ajay Seth; Ayman Habib; Carmichael Ong; Jennifer Hicks; Scott Uhlrich; Scott Delp
Journal:  J Neuroeng Rehabil       Date:  2022-02-20       Impact factor: 4.262

