| Literature DB >> 33289631 |
Gary A Kane1, Gonçalo Lopes2, Jonny L Saunders3, Alexander Mathis1,4, Mackenzie W Mathis1,4.
Abstract
The ability to control a behavioral task or stimulate neural activity based on animal behavior in real time is an important tool for experimental neuroscientists. Ideally, such tools are noninvasive, low-latency, and provide interfaces to trigger external hardware based on posture. Recent advances in pose estimation with deep learning allow researchers to train deep neural networks to accurately quantify a wide variety of animal behaviors. Here, we provide a new <monospace>DeepLabCut-Live!</monospace> package that achieves low-latency real-time pose estimation (within 15 ms, >100 FPS), with an additional forward-prediction module that achieves zero-latency feedback and a dynamic-cropping mode that allows for higher inference speeds. We also provide three options for using this tool with ease: (1) a stand-alone GUI (called <monospace>DLC-Live! GUI</monospace>), and integration into (2) <monospace>Bonsai</monospace> and (3) <monospace>AutoPilot</monospace>. Lastly, we benchmarked performance on a wide range of systems so that experimentalists can easily decide what hardware is required for their needs.
Keywords: DeepLabCut; any animal; computational biology; low-latency; mouse; neuroscience; pose-estimation; real-time tracking; systems biology
Year: 2020 PMID: 33289631 PMCID: PMC7781595 DOI: 10.7554/eLife.61909
Source DB: PubMed Journal: Elife ISSN: 2050-084X Impact factor: 8.140