
Analysis of ultrasonic vocalizations from mice using computer vision and machine learning.

Antonio H O Fonseca1,2, Gustavo M Santana1,2,3, Gabriela M Bosque Ortiz1,4, Sérgio Bampi2, Marcelo O Dietrich1,3,4,5.

Abstract

Mice emit ultrasonic vocalizations (USVs) that communicate socially relevant information. To detect and classify these USVs, here we describe VocalMat. VocalMat is software that uses image-processing and differential geometry approaches to detect USVs in audio files, eliminating the need for user-defined parameters. VocalMat also uses computer vision and machine learning methods to classify USVs into distinct categories. In a data set of >4000 USVs emitted by mice, VocalMat detected over 98% of manually labeled USVs and accurately classified ≈86% of the USVs into 11 distinct categories. We then used dimensionality reduction tools to analyze the probability distribution of USV classification among different experimental groups, providing a robust method to quantify and qualify the vocal repertoire of mice. Thus, VocalMat makes it possible to perform automated, accurate, and quantitative analysis of USVs without the need for user inputs, opening the opportunity for detailed and high-throughput analysis of this behavior.
© 2021, Fonseca et al.
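The abstract describes treating the audio's spectrogram as an image and detecting USVs as high-energy regions, without user-tuned parameters. The following is a minimal, hypothetical sketch of that general idea (spectrogram, thresholding, connected components) using SciPy on a synthetic ultrasonic sweep; it is not VocalMat's actual pipeline, and all parameter values here (band limits, threshold, window size) are illustrative assumptions.

```python
# Hedged sketch of spectrogram-based USV detection (NOT VocalMat's code).
import numpy as np
from scipy.signal import spectrogram, chirp
from scipy.ndimage import label

np.random.seed(0)
fs = 250_000                       # sampling rate (Hz); mouse USVs sit roughly at 30-110 kHz
t = np.arange(0, 0.5, 1 / fs)

# Synthetic "USV": a 60->80 kHz sweep between 0.20 s and 0.25 s, on top of weak noise.
sig = 0.01 * np.random.randn(t.size)
call = (t >= 0.20) & (t < 0.25)
sig[call] += chirp(t[call] - 0.20, f0=60_000, t1=0.05, f1=80_000)

# Spectrogram as an image (frequency x time), in decibels.
f, times, Sxx = spectrogram(sig, fs=fs, nperseg=1024, noverlap=512)
S_db = 10 * np.log10(Sxx + 1e-12)

# Restrict to the ultrasonic band and keep only high-energy pixels
# (an illustrative global threshold; a parameter-free method would adapt this).
band = (f >= 30_000) & (f <= 110_000)
img = S_db[band]
binary = img > (img.mean() + 4 * img.std())

# Connected components in the binary image = candidate vocalizations.
labels, n_calls = label(binary)
print("candidate calls detected:", n_calls)
```

A classifier (VocalMat uses a convolutional neural network) would then take each detected candidate's spectrogram patch and assign a probability over the call categories.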


Keywords:  communication; computer vision; machine learning; mouse; neuroscience; social behavior; ultrasonic vocalization; vocalization

Year:  2021        PMID: 33787490      PMCID: PMC8057810          DOI: 10.7554/eLife.59161

Source DB:  PubMed          Journal:  eLife        ISSN: 2050-084X            Impact factor:   8.140


References:  30 in total

Review 1.  Ultrasonic vocalisation emitted by infant rodents: a tool for assessment of neurobehavioural development.

Authors:  I Branchi; D Santucci; E Alleva
Journal:  Behav Brain Res       Date:  2001-11-01       Impact factor: 3.332

2.  Spectrographic analyses reveal signals of individuality and kinship in the ultrasonic courtship vocalizations of wild house mice.

Authors:  Frauke Hoffmann; Kerstin Musolf; Dustin J Penn
Journal:  Physiol Behav       Date:  2011-10-21

3.  DeepSqueak: a deep learning-based system for detection and analysis of ultrasonic vocalizations.

Authors:  Kevin R Coffey; Russell G Marx; John F Neumaier
Journal:  Neuropsychopharmacology       Date:  2019-01-04       Impact factor: 7.853

4.  Analysis of ultrasonic vocalizations emitted by infant rodents.

Authors:  Igor Branchi; Daniela Santucci; Enrico Alleva
Journal:  Curr Protoc Toxicol       Date:  2006

Review 5.  Quantifying behavior to understand the brain.

Authors:  Talmo D Pereira; Joshua W Shaevitz; Mala Murthy
Journal:  Nat Neurosci       Date:  2020-11-09       Impact factor: 24.884

6.  Comparative studies of the ultrasonic calls of infant murid rodents.

Authors:  G D Sales; J C Smith
Journal:  Dev Psychobiol       Date:  1978-11       Impact factor: 3.038

7.  Unusual repertoire of vocalizations in adult BTBR T+tf/J mice during three types of social encounters.

Authors:  M L Scattoni; L Ricceri; J N Crawley
Journal:  Genes Brain Behav       Date:  2011-02       Impact factor: 3.449

Review 8.  Mouse vocal communication system: are ultrasounds learned or innate?

Authors:  Gustavo Arriaga; Erich D Jarvis
Journal:  Brain Lang       Date:  2013-01-04       Impact factor: 2.381

9.  Ultrasonic songs of male mice.

Authors:  Timothy E Holy; Zhongsheng Guo
Journal:  PLoS Biol       Date:  2005-11-01       Impact factor: 8.029

10.  Male mice song syntax depends on social contexts and influences female preferences.

Authors:  Jonathan Chabout; Abhra Sarkar; David B Dunson; Erich D Jarvis
Journal:  Front Behav Neurosci       Date:  2015-04-01       Impact factor: 3.558

Cited by:  7 in total

1.  Capturing the songs of mice with an improved detection and classification method for ultrasonic vocalizations (BootSnap).

Authors:  Reyhaneh Abbasi; Peter Balazs; Maria Adelaide Marconi; Doris Nicolakis; Sarah M Zala; Dustin J Penn
Journal:  PLoS Comput Biol       Date:  2022-05-12       Impact factor: 4.779

2.  Automated annotation of birdsong with a neural network that segments spectrograms.

Authors:  Yarden Cohen; David Aaron Nicholson; Alexa Sanchioni; Emily K Mallaber; Viktoriya Skidanova; Timothy J Gardner
Journal:  eLife       Date:  2022-01-20       Impact factor: 8.713

3.  Investigating the Neurobiology of Abnormal Social Behaviors.

Authors:  S William Li; Ziv M Williams; Raymundo Báez-Mendoza
Journal:  Front Neural Circuits       Date:  2021-11-30       Impact factor: 3.492

4.  HybridMouse: A Hybrid Convolutional-Recurrent Neural Network-Based Model for Identification of Mouse Ultrasonic Vocalizations.

Authors:  Yizhaq Goussha; Kfir Bar; Shai Netser; Lior Cohen; Yacov Hel-Or; Shlomo Wagner
Journal:  Front Behav Neurosci       Date:  2022-01-25       Impact factor: 3.558

5.  Computational bioacoustics with deep learning: a review and roadmap.

Authors:  Dan Stowell
Journal:  PeerJ       Date:  2022-03-21       Impact factor: 2.984

6.  Advanced paternal age diversifies individual trajectories of vocalization patterns in neonatal mice.

Authors:  Lingling Mai; Hitoshi Inada; Ryuichi Kimura; Kouta Kanno; Takeru Matsuda; Ryosuke O Tachibana; Valter Tucci; Fumiyasu Komaki; Noboru Hiroi; Noriko Osumi
Journal:  iScience       Date:  2022-08-11

7.  Acoustic camera system for measuring ultrasound communication in mice.

Authors:  Jumpei Matsumoto; Kouta Kanno; Masahiro Kato; Hiroshi Nishimaru; Tsuyoshi Setogawa; Choijiljav Chinzorig; Tomohiro Shibata; Hisao Nishijo
Journal:  iScience       Date:  2022-07-21
