Pengfei Sun¹, Changku Sun¹, Wenqiang Li², Peng Wang²
Abstract
Pose estimation aims to measure the position and orientation of a calibrated camera from known image features. The pinhole model is the dominant camera model in this field, but its imaging precision is not sufficient for advanced pose estimation algorithms. In this paper, a new camera model, called the incident ray tracking (IRT) model, is introduced. More importantly, an advanced pose estimation algorithm based on the perspective ray in the new camera model is proposed. The perspective ray, determined by two positioning points, is an abstract mathematical equivalent of the incident ray. In the proposed algorithm, called perspective-ray-based scaled orthographic projection with iteration (PRSOI), an approximate ray-based projection is calculated by a linear system and refined by iteration. Experiments on the PRSOI demonstrate that it achieves high accuracy over six-degrees-of-freedom (DOF) motion and that it outperforms three other state-of-the-art algorithms in the comparison experiment.
Year: 2015 PMID: 26197272 PMCID: PMC4509906 DOI: 10.1371/journal.pone.0134029
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
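The perspective ray described in the abstract, fixed by two positioning points, can be sketched in a few lines. This is a minimal illustration with invented sample coordinates, not the paper's implementation:

```python
import numpy as np

def perspective_ray(p1, p2):
    """Build the ray through two positioning points (e.g., a pixel's
    back-projections onto the two reference planes of the IRT model).
    Returns the ray origin and a unit direction vector."""
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    direction = p2 - p1
    return p1, direction / np.linalg.norm(direction)

# Ray through one point on each reference plane (sample values)
origin, direction = perspective_ray([12.0, 8.0, 0.0], [13.5, 9.0, 50.0])
```

Because the ray is defined purely by two 3-D points, it can represent incident rays that a single-center pinhole model cannot.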
Fig 1. An incident ray passes through an imaging system and is absorbed by a detector element (pixel).
Fig 2. Geometrization of the IRT model.
Fig 3. A mapping between the reference plane and the image plane.
The intersection points of the dotted lines in plane ABCD correspond to those in plane abcd.
Fig 4. The perspective rays used for object pose estimation.
Fig 5. An imaging model of the perspective rays.
It includes a perspective projection and a scaled orthographic projection.
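The two projections in this imaging model can be contrasted with a short sketch. The focal length and reference depth below are made-up values for illustration, not the PRSOI itself:

```python
import numpy as np

def perspective_projection(points, f):
    """True perspective projection: each point is divided by its own depth."""
    points = np.asarray(points, dtype=float)
    return f * points[:, :2] / points[:, 2:3]

def scaled_orthographic_projection(points, f, z_ref):
    """Scaled orthographic approximation: every point is treated as lying
    at the common reference depth z_ref, so projection reduces to a
    uniform scaling of the x and y coordinates."""
    points = np.asarray(points, dtype=float)
    return (f / z_ref) * points[:, :2]

# The two projections agree exactly for points at the reference depth
pts = np.array([[10.0, 5.0, 200.0], [-4.0, 8.0, 200.0]])
pp = perspective_projection(pts, f=25.0)
sop = scaled_orthographic_projection(pts, f=25.0, z_ref=200.0)
```

Iterative schemes of this kind start from the scaled orthographic approximation and refine it toward the true perspective projection, which matches the abstract's "calculated by a linear system and refined by iteration."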
Fig 6. Captured images at six positions.
Camera parameters. (The row labels did not survive extraction; rows are numbered in order of appearance, and the four value columns are read here as the x and y polynomial coefficients for the two reference planes Π₁ and Π₂.)

| # | Π₁ (x) | Π₁ (y) | Π₂ (x) | Π₂ (y) |
|---|---|---|---|---|
| 1 | -2.654E+02 | -1.865E+02 | -3.226E+02 | -2.208E+02 |
| 2 | 6.726E-01 | -4.893E-04 | 8.756E-01 | -8.562E-03 |
| 3 | 2.479E-02 | 7.093E-01 | 2.791E-03 | 8.171E-01 |
| 4 | 9.571E-05 | -1.246E-05 | -2.688E-04 | 2.292E-05 |
| 5 | -1.179E-04 | -1.156E-04 | 1.512E-04 | 8.542E-05 |
| 6 | 1.432E-06 | -1.468E-05 | -6.406E-05 | 8.410E-05 |
| 7 | -3.328E-07 | 2.558E-08 | 5.994E-07 | -9.413E-08 |
| 8 | 4.067E-07 | 1.902E-07 | -6.830E-07 | -3.960E-07 |
| 9 | -2.716E-08 | 5.363E-08 | -8.002E-08 | -2.445E-07 |
| 10 | -6.565E-08 | -8.220E-09 | 2.443E-07 | -1.878E-07 |
| 11 | 5.180E-10 | -6.346E-11 | -5.699E-10 | 1.449E-10 |
| 12 | -7.111E-10 | -8.815E-12 | 1.226E-09 | 8.289E-10 |
| 13 | -2.782E-12 | 1.190E-10 | -4.663E-10 | 1.900E-10 |
| 14 | -3.097E-11 | -1.939E-10 | 3.740E-10 | 8.478E-11 |
| 15 | 2.582E-10 | 3.799E-11 | -1.669E-11 | 5.031E-10 |
| 16 | -2.645E-13 | 4.620E-14 | 2.081E-13 | -9.186E-14 |
| 17 | 4.766E-13 | -1.756E-13 | -7.830E-13 | -6.358E-13 |
| 18 | 4.283E-14 | -1.214E-13 | 3.333E-13 | -5.590E-14 |
| 19 | -1.650E-13 | 1.326E-13 | -1.334E-13 | -1.493E-13 |
| 20 | -5.200E-14 | -2.771E-13 | 2.536E-13 | -3.852E-13 |
| 21 | 1.835E-14 | 3.582E-13 | -5.181E-13 | 2.097E-13 |
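Each column holds 21 coefficients, which is the number of monomials in a degree-5 bivariate polynomial ((n+1)(n+2)/2 = 21 for n = 5), and the coefficient magnitudes fall off in groups matching the degree-0 through degree-5 terms. A generic evaluation sketch follows; the monomial ordering and function name are assumptions, not taken from the paper:

```python
import numpy as np

def eval_bivariate_poly(coeffs, u, v, degree=5):
    """Evaluate a bivariate polynomial at (u, v). Coefficients are
    assumed ordered by total degree, with the power of u descending
    within each degree: 1, u, v, u^2, u*v, v^2, ..."""
    terms = [u**i * v**(n - i)
             for n in range(degree + 1)
             for i in range(n, -1, -1)]
    assert len(coeffs) == len(terms)
    return float(np.dot(coeffs, terms))

# Degree-1 check: 1*1 + 2*u + 3*v at (u, v) = (2, 5) -> 1 + 4 + 15 = 20
value = eval_bivariate_poly([1.0, 2.0, 3.0], 2.0, 5.0, degree=1)
```

With the default degree of 5, the function expects exactly 21 coefficients, matching one column of the table above.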
Fig 7. Position distribution of the calibration points.
“●” marks the standard two-dimensional coordinates of the calibration points, while “+” marks the calculated coordinates.
Fig 8. Experimental devices.
Fig 9. A sample image used for pose estimation.
Fig 10. Pose estimation error distribution.
The RMSE of the PnP algorithms. (The method names in the first column did not survive extraction.)

| Method | r_yᵃ | r_pᵇ | r_rᶜ | t_xᵈ | t_yᵉ | t_zᶠ |
|---|---|---|---|---|---|---|
| — | 0.290 | 0.243 | 0.071 | 0.369 | 0.241 | 0.552 |
| — | 0.258 | 0.277 | 0.110 | 0.340 | 0.248 | 0.448 |
| — | 0.201 | 0.176 | 0.083 | 0.146 | 0.130 | 0.362 |
| — | 0.136 | 0.115 | 0.062 | 0.152 | 0.128 | 0.272 |

ᵃ r_y is rotation in the yaw direction,
ᵇ r_p is rotation in the pitch direction,
ᶜ r_r is rotation in the roll direction,
ᵈ t_x is translation in the x direction,
ᵉ t_y is translation in the y direction, and
ᶠ t_z is translation in the z direction.
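The tabulated values can be reproduced from raw per-trial pose errors with the standard RMSE formula. The sample numbers below are invented for illustration:

```python
import numpy as np

def rmse(estimates, ground_truth):
    """Root-mean-square error over repeated estimates of one pose
    parameter (e.g., yaw angle or x-translation across all test poses)."""
    err = np.asarray(estimates, dtype=float) - np.asarray(ground_truth, dtype=float)
    return float(np.sqrt(np.mean(err ** 2)))

# Three trials of one parameter with errors 0.1, 0.0, and -0.1
value = rmse([1.1, 2.0, 2.9], [1.0, 2.0, 3.0])  # ≈ 0.0816
```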