Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2.

Michal Tölgyessy, Martin Dekan, Ľuboš Chovanec, Peter Hubinský.

Abstract

The Azure Kinect is the successor of the Kinect v1 and Kinect v2. In this paper, we perform a brief data analysis and comparison of all three Kinect versions, focusing on precision (repeatability) and various aspects of sensor noise. We then thoroughly evaluate the new Azure Kinect: its warm-up time, precision (and the sources of its variability), accuracy (measured rigorously using a robotic arm), response to reflectivity (using 18 different materials), and the multipath and flying-pixel phenomena. Furthermore, we validate its performance in both indoor and outdoor environments, including direct and indirect sunlight conditions. We conclude with a discussion of its improvements in the context of the evolution of the Kinect sensor. We show that it is crucial to design accuracy experiments carefully, since the RGB and depth cameras are not aligned. Our measurements confirm the officially stated values, namely a standard deviation ≤17 mm and a distance error <11 mm at up to 3.5 m from the sensor, in all four supported depth modes. The device, however, has to warm up for at least 40–50 min to give stable results. Due to its time-of-flight technology, the Azure Kinect cannot be used reliably in direct sunlight and is therefore suited mostly to indoor applications.

Keywords:  3D scanning; Azure Kinect; HRI (human–robot interaction); Kinect; SLAM (simultaneous localization and mapping); depth imaging; gesture recognition; mapping; object recognition; robotics

Year:  2021        PMID: 33430149     DOI: 10.3390/s21020413

Source DB:  PubMed          Journal:  Sensors (Basel)        ISSN: 1424-8220            Impact factor:   3.576


Related articles: 15 in total

1.  Microsoft Azure Kinect Calibration for Three-Dimensional Dense Point Clouds and Reliable Skeletons.

Authors:  Laura Romeo; Roberto Marani; Anna Gina Perri; Tiziana D'Orazio
Journal:  Sensors (Basel)       Date:  2022-07-01       Impact factor: 3.847

2.  Evaluating Automatic Body Orientation Detection for Indoor Location from Skeleton Tracking Data to Detect Socially Occupied Spaces Using the Kinect v2, Azure Kinect and Zed 2i.

Authors:  Violeta Ana Luz Sosa-León; Angela Schwering
Journal:  Sensors (Basel)       Date:  2022-05-17       Impact factor: 3.847

3.  Markerless 3D Skeleton Tracking Algorithm by Merging Multiple Inaccurate Skeleton Data from Multiple RGB-D Sensors.

Authors:  Sang-Hyub Lee; Deok-Won Lee; Kooksung Jun; Wonjun Lee; Mun Sang Kim
Journal:  Sensors (Basel)       Date:  2022-04-20       Impact factor: 3.847

4.  TIMo-A Dataset for Indoor Building Monitoring with a Time-of-Flight Camera.

Authors:  Pascal Schneider; Yuriy Anisimov; Raisul Islam; Bruno Mirbach; Jason Rambach; Didier Stricker; Frédéric Grandidier
Journal:  Sensors (Basel)       Date:  2022-05-25       Impact factor: 3.847

5.  Towards a Singing Voice Multi-Sensor Analysis Tool: System Design, and Assessment Based on Vocal Breathiness.

Authors:  Evangelos Angelakis; Natalia Kotsani; Anastasia Georgaki
Journal:  Sensors (Basel)       Date:  2021-11-30       Impact factor: 3.576

6.  Review: Emerging Portable Technologies for Gait Analysis in Neurological Disorders.

Authors:  Christina Salchow-Hömmen; Matej Skrobot; Magdalena C E Jochner; Thomas Schauer; Andrea A Kühn; Nikolaus Wenger
Journal:  Front Hum Neurosci       Date:  2022-02-03       Impact factor: 3.169

7.  Portable, open-source solutions for estimating wrist position during reaching in people with stroke.

Authors:  Jeffrey Z Nie; James W Nie; Na-Teng Hung; R James Cotton; Marc W Slutzky
Journal:  Sci Rep       Date:  2021-11-18       Impact factor: 4.379

8.  Validity and Reliability of Kinect v2 for Quantifying Upper Body Kinematics during Seated Reaching.

Authors:  Germain Faity; Denis Mottet; Jérôme Froger
Journal:  Sensors (Basel)       Date:  2022-04-02       Impact factor: 3.576

9.  Evaluating the Accuracy of the Azure Kinect and Kinect v2.

Authors:  Gregorij Kurillo; Evan Hemingway; Mu-Lin Cheng; Louis Cheng
Journal:  Sensors (Basel)       Date:  2022-03-23       Impact factor: 3.576

10.  Functional movement screen dataset collected with two Azure Kinect depth sensors.

Authors:  Qing-Jun Xing; Yuan-Yuan Shen; Run Cao; Shou-Xin Zong; Shu-Xiang Zhao; Yan-Fei Shen
Journal:  Sci Data       Date:  2022-03-25       Impact factor: 6.444

