
Wide-angle, monocular head tracking using passive markers.

Balazs P Vagvolgyi, Ravikrishnan P Jayakumar, Manu S Madhav, James J Knierim, Noah J Cowan.

Abstract

BACKGROUND: Camera images can encode large amounts of visual information about an animal and its environment, enabling high-fidelity 3D reconstruction using computer vision methods. Most systems, both markerless (e.g., deep-learning based) and marker-based, require multiple cameras to track features across multiple points of view to enable such 3D reconstruction. However, such systems can be expensive and challenging to set up in small-animal research apparatuses.
NEW METHODS: We present an open-source, marker-based system for tracking the head of a rodent for behavioral research that requires only a single camera, potentially with a wide field of view. The system features a lightweight visual target and computer vision algorithms that together enable high-accuracy tracking of the six-degree-of-freedom position and orientation of the animal's head. The system, which requires only a single camera positioned above the behavioral arena, robustly reconstructs the pose over a wide range of head angles (360° in yaw, and approximately ±120° in roll and pitch).
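The abstract does not spell out the paper's pose-estimation algorithm, but the core task it describes, recovering a six-degree-of-freedom rigid pose from a known marker geometry, can be illustrated with the standard Kabsch (rigid Procrustes) algorithm. The sketch below is an assumption-laden illustration, not the authors' method: it fits the rotation R and translation t that map known marker coordinates in the head frame onto their observed positions.

```python
import numpy as np

def kabsch(P, Q):
    """Fit a rigid transform (R, t) such that R @ P + t ~= Q.

    P: 3xN known marker coordinates in the head (target) frame.
    Q: 3xN corresponding observed marker coordinates in the world frame.
    Returns a proper rotation matrix R (det = +1) and translation t.
    """
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    # Cross-covariance of the centered point sets
    H = (Q - q_mean) @ (P - p_mean).T
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation
    d = np.sign(np.linalg.det(U @ Vt))
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    t = q_mean - R @ p_mean
    return R, t

# Demo with a hypothetical 4-point, non-coplanar marker target
P = np.array([[0., 1., 0., 0.],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.]])
theta = 0.5  # ground-truth yaw rotation (radians)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.],
               [np.sin(theta),  np.cos(theta), 0.],
               [0.,             0.,            1.]])
t_true = np.array([[0.1], [0.2], [0.3]])
Q = Rz @ P + t_true          # simulated observed marker positions
R, t = kabsch(P, Q)          # recovers Rz and t_true exactly (noise-free)
```

In a single-camera system such as the one described here, the observed marker positions would first have to be reconstructed from one image (e.g., via perspective-n-point methods using the known target geometry); the rigid fit above only shows the final pose-recovery step.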
RESULTS: Experiments with live animals demonstrate that the system can reliably identify rat head position and orientation. Evaluations using a commercial optical tracker show that the system achieves accuracy rivaling commercial multi-camera systems.
COMPARISON WITH EXISTING METHODS: Our solution significantly improves upon existing monocular marker-based tracking methods, both in accuracy and in allowable range of motion.
CONCLUSIONS: The proposed system enables the study of complex behaviors by providing robust, fine-scale measurements of rodent head motions in a wide range of orientations.
Copyright © 2021 Elsevier B.V. All rights reserved.

Keywords:  Machine vision; Motion tracking; Rat; Rodent; Video

Year:  2021        PMID: 34968626      PMCID: PMC8857048          DOI: 10.1016/j.jneumeth.2021.109453

Source DB:  PubMed          Journal:  J Neurosci Methods        ISSN: 0165-0270            Impact factor:   2.390


References:  30 in total

Review 1.  Global positioning system and associated technologies in animal behaviour and ecological research.

Authors:  Stanley M Tomkiewicz; Mark R Fuller; John G Kie; Kirk K Bates
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2010-07-27       Impact factor: 6.237

2.  Using DeepLabCut for 3D markerless pose estimation across species and behaviors.

Authors:  Tanmay Nath; Alexander Mathis; An Chi Chen; Amir Patel; Matthias Bethge; Mackenzie Weygandt Mathis
Journal:  Nat Protoc       Date:  2019-06-21       Impact factor: 13.491

Review 3.  Machine vision methods for analyzing social interactions.

Authors:  Alice A Robie; Kelly M Seagraves; S E Roian Egnor; Kristin Branson
Journal:  J Exp Biol       Date:  2017-01-01       Impact factor: 3.312

4.  Hippocampal electrical activity and voluntary movement in the rat.

Authors:  C H Vanderwolf
Journal:  Electroencephalogr Clin Neurophysiol       Date:  1969-04

5.  Real-time, low-latency closed-loop feedback using markerless posture tracking.

Authors:  Gary A Kane; Gonçalo Lopes; Jonny L Saunders; Alexander Mathis; Mackenzie W Mathis
Journal:  Elife       Date:  2020-12-08       Impact factor: 8.140

6.  A passive, camera-based head-tracking system for real-time, three-dimensional estimation of head position and orientation in rodents.

Authors:  Walter Vanzella; Natalia Grion; Daniele Bertolini; Andrea Perissinotto; Marco Gigante; Davide Zoccolan
Journal:  J Neurophysiol       Date:  2019-09-25       Impact factor: 2.714

7.  The Dome: A virtual reality apparatus for freely locomoting rodents.

Authors:  Manu S Madhav; Ravikrishnan P Jayakumar; Shahin G Lashkari; Francesco Savelli; Hugh T Blair; James J Knierim; Noah J Cowan
Journal:  J Neurosci Methods       Date:  2021-08-26       Impact factor: 2.987

8.  Wing and body motion during flight initiation in Drosophila revealed by automated visual tracking.

Authors:  Ebraheem I Fontaine; Francisco Zabala; Michael H Dickinson; Joel W Burdick
Journal:  J Exp Biol       Date:  2009-05       Impact factor: 3.312

9.  Recalibration of path integration in hippocampal place cells.

Authors:  Ravikrishnan P Jayakumar; Manu S Madhav; Noah J Cowan; James J Knierim; Francesco Savelli; Hugh T Blair
Journal:  Nature       Date:  2019-02-11       Impact factor: 49.962

10.  A Manufacturing-Oriented Intelligent Vision System Based on Deep Neural Network for Object Recognition and 6D Pose Estimation.

Authors:  Guoyuan Liang; Fan Chen; Yu Liang; Yachun Feng; Can Wang; Xinyu Wu
Journal:  Front Neurorobot       Date:  2021-01-07       Impact factor: 2.650

Cited by:  1 in total

1.  The Dome: A virtual reality apparatus for freely locomoting rodents.

Authors:  Manu S Madhav; Ravikrishnan P Jayakumar; Shahin G Lashkari; Francesco Savelli; Hugh T Blair; James J Knierim; Noah J Cowan
Journal:  J Neurosci Methods       Date:  2021-08-26       Impact factor: 2.987
