| Literature DB >> 23845933 |
Luis Vicente Calderita, Juan Pedro Bandera, Pablo Bustos, Andreas Skiadopoulos.
Abstract
Motion capture systems have recently undergone a strong evolution. New low-cost depth sensors and open-source frameworks, such as OpenNI, make it possible to perceive human motion on-line without invasive systems. However, these proposals do not evaluate the validity of the obtained poses. This paper addresses this issue using a model-based pose generator to complement the OpenNI human tracker. The proposed system enforces kinematic constraints, eliminates odd poses and filters sensor noise, while learning the real dimensions of the performer's body. The system is composed of a PrimeSense sensor, an OpenNI tracker and a kinematics-based filter, and has been extensively tested. Experiments show that the proposed system improves pure OpenNI results at a very low computational cost.
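The abstract's two key ideas — enforcing a fixed-limb-length kinematic constraint on noisy joint centroids, and learning the performer's real limb lengths over time — can be sketched in a few lines. This is a minimal illustration, not the paper's actual implementation: the function names, the exponential smoothing, and the `alpha` constant are assumptions.

```python
import numpy as np

def update_limb_length(est_len, p_parent, p_child, alpha=0.05):
    """Running estimate of a limb's true length from noisy joint centroids.

    Exponentially smooths the per-frame observed parent-child distance,
    so sensor noise averages out while the estimate adapts to the performer.
    """
    observed = np.linalg.norm(p_child - p_parent)
    return (1 - alpha) * est_len + alpha * observed

def constrain_joint(p_parent, p_child, limb_len):
    """Enforce a fixed-limb-length kinematic constraint.

    Projects the child joint onto a sphere of radius limb_len centered on
    its parent, keeping the observed direction but correcting the distance.
    """
    d = p_child - p_parent
    n = np.linalg.norm(d)
    if n < 1e-9:
        return p_child  # degenerate observation: leave unchanged
    return p_parent + d * (limb_len / n)
```

Applied per frame down the kinematic chain, such a projection discards the depth-sensor noise component along each limb while the length estimate converges toward the performer's true dimensions.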
Year: 2013 PMID: 23845933 PMCID: PMC3758625 DOI: 10.3390/s130708835
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. System overview.
Figure 2. (a) 3D centroids of body parts provided by the OpenNI tracker (red dots); and (b) kinematic model employed to encode and refine the perceived human pose.
Figure 3. An example of the two-level collision detection algorithm.
Figure 4. Examples of OpenNI issues and corresponding model poses.
Figure 5. (a) Capture volume for the quantitative evaluation of the system; and (b) selection of the three markers in the calibration plane.
Figure 6. Forearm lengths provided by the OpenNI tracker and by the proposed model-based adaptive system.
Mean errors and standard deviations of right hand joints, in centimeters.

| | Mean | SD | Mean | SD | Mean | SD |
|---|---|---|---|---|---|---|
| OpenNI centroids | 11.3 | 5.3 | 6.7 | 2.7 | 2.4 | 1.0 |
| Fixed limb lengths | 11.7 | 5.0 | 7.2 | 2.3 | 2.5 | 1.0 |
| Adaptive limb lengths | 11.7 | 5.0 | 4.9 | 1.7 | 2.5 | 1.0 |
Motion sequences employed in Experiment 2.

| # | Body part | | Description |
|---|---|---|---|
| #1 | Head | 50 | Hands touch the nape. Elbows move, covering the face. |
| #2 | Head | 50 | Hands alternately touch the nape and the nose. |
| #3 | Torso | 100 | Hands move behind the torso. |
| #4 | Torso | 200 | The torso rotates left and right. |
Limb length errors in Experiment 2.

| Limb | Mean | SD |
|---|---|---|
| Upper arm | 0.42 | 0.37 |
| Forearm | 3.3 | 1.97 |
Percentages of correctly estimated hand and elbow poses.

| | Samples | OpenNI | Proposed system |
|---|---|---|---|
| Total | 36,810 | 58.47% | 74.90% |
| Head-related | 18,759 | 67.56% | 69.19% |
| Torso-related | 18,051 | 49.03% | 80.82% |

| | Samples | OpenNI | Proposed system |
|---|---|---|---|
| Total | 29,129 | 68.34% | 85.32% |
| Head-related | 12,070 | 92.37% | 96.44% |
| Torso-related | 17,059 | 51.35% | 77.45% |
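As a toy illustration of how per-sequence accuracy percentages like those above can be computed, here is a minimal sketch. The 10 cm tolerance and the function name are assumptions for illustration only; the paper's actual acceptance criterion is not stated in this record.

```python
import numpy as np

def pose_accuracy(errors_cm, threshold_cm=10.0):
    """Fraction of samples whose joint position error is below a tolerance.

    errors_cm: per-frame joint errors in centimeters.
    threshold_cm: illustrative acceptance threshold (assumed, not the paper's).
    """
    errors = np.asarray(errors_cm, dtype=float)
    return float((errors < threshold_cm).mean())
```

Running this over the per-frame errors of a tracker and of a model-corrected tracker on the same sequences yields directly comparable percentages, as in the tables above.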
Figure 7. Camera-joint distances for a head-related test sequence.
Figure 8. Camera-joint distances for a torso-related test sequence.